April 20, 2023
In Android 13 (API level 33), we introduced a new standardized platform architecture for spatial audio, a premium and more engaging sound experience. With spatial audio, your content sounds more realistic to users by making it sound as though they are in the middle of the action. The individual instruments from a band can be separated and “placed” around the user, or the sound from a whale might grow as it approaches from behind and taper off as it swims away. Read on to learn more about Android’s support for spatial audio and how to implement the feature in your app.
There are two main distinctions of spatial audio: static spatial audio, where multi-channel content is rendered through virtual speakers positioned around the listener, and spatial audio with head tracking, where the rendering also adapts as the listener moves their head.
On Android, only multi-channel audio configured with the right AudioAttributes and AudioFormat is spatialized by default, though OEMs can customize this behavior. On devices where the OEM has integrated a spatializer effect, static spatial audio works when any headset is connected to the device, while head-tracked spatial audio requires a headset with compatible head tracking sensors. OEMs such as Google (Pixel), OnePlus, and Xiaomi have already made these experiences available to their users.
The easiest way to integrate with this feature is to use ExoPlayer! If you use ExoPlayer from Jetpack Media3 release 1.0.0 or newer, ExoPlayer will configure the decoder to prevent multi-channel audio from being downmixed to stereo and the default track selection behavior will take into account whether or not spatialization is possible. This means your content just needs to include a multi-channel audio track that ExoPlayer can select. ExoPlayer will monitor the device’s state and select a multi-channel track when spatialization is possible, or switch to a stereo track if not.
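For illustration, here is a minimal sketch of that setup with Media3 ExoPlayer; the context variable and the URI are placeholders for your own player setup and multi-channel content:

// Minimal sketch: a default Media3 ExoPlayer playing a multi-channel media item.
// The URI below is a placeholder for your own content.
val player = ExoPlayer.Builder(context).build()
player.setMediaItem(MediaItem.fromUri("https://example.com/multichannel-media.mpd"))
player.prepare()
player.play()
// No extra configuration is needed: with Media3 1.0.0 or newer, ExoPlayer selects a
// multi-channel audio track when spatialization is possible and a stereo track otherwise.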
Android 12L (API level 32) added the new Spatializer class to allow you to query the spatialization capabilities of the device. There are four conditions that must all be true for the device to output spatialized audio:
// Get an instance of the Spatializer from the AudioManager
val audioManager = getSystemService(Context.AUDIO_SERVICE) as AudioManager
val spatializer = audioManager.spatializer
if (
    // Does the device have a spatializer effect?
    spatializer.immersiveAudioLevel != Spatializer.SPATIALIZER_IMMERSIVE_LEVEL_NONE
    // Is spatial audio enabled in the settings?
    && spatializer.isEnabled
    // Is spatialization available, for example for the current audio output routing?
    && spatializer.isAvailable
    // Can audio with the given parameters be spatialized?
    && spatializer.canBeSpatialized(audioAttributes, audioFormat)
) {
    // Spatialization is possible, so you can select a multi-channel track for playback with
    // spatial audio.
} else {
    // Spatialization is not possible, so you may choose to select a stereo track for playback
    // to preserve bandwidth.
}
ExoPlayer performs these checks when deciding which audio track to select. To further check whether head tracking is available, call the isHeadTrackerAvailable() method. The Spatializer class also includes the following listeners so you can react to changes in the device’s state (a registration sketch follows the table):
OnSpatializerStateChangedListener | For changes in whether the spatializer is enabled or available.
OnHeadTrackerAvailableListener | For changes in whether head tracking is available.
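Here is a minimal sketch of registering both listeners. It assumes the code runs inside a Context such as an Activity (for mainExecutor), and the callback bodies only describe what you might do:

// Query the current head tracking state, then listen for changes.
val headTrackerAvailable = spatializer.isHeadTrackerAvailable

spatializer.addOnSpatializerStateChangedListener(
    mainExecutor,
    object : Spatializer.OnSpatializerStateChangedListener {
        override fun onSpatializerEnabledChanged(spatializer: Spatializer, enabled: Boolean) {
            // The user toggled spatial audio in settings; re-evaluate track selection.
        }

        override fun onSpatializerAvailableChanged(spatializer: Spatializer, available: Boolean) {
            // The audio output routing changed; re-evaluate track selection.
        }
    }
)

spatializer.addOnHeadTrackerAvailableListener(
    mainExecutor,
    Spatializer.OnHeadTrackerAvailableListener { _, available ->
        // Head tracking became available or unavailable; adjust playback if needed.
    }
)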
With these signals, you can manually adjust your playback for spatial audio. Note that if you are not using ExoPlayer, you should make sure to configure the decoder to output multi-channel audio when possible by setting the max channel count to a large number with MediaFormat.setInteger(MediaFormat.KEY_MAX_OUTPUT_CHANNEL_COUNT, ##).
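For example, a direct MediaCodec configuration might look like the following sketch; the MIME type, sample rate, and channel count are placeholders for your own content, and 99 is simply a large cap rather than a required value:

// Raise the decoder's output channel cap so 5.1 (or higher) audio is not
// downmixed to stereo. The format values here are illustrative.
val format = MediaFormat.createAudioFormat(MediaFormat.MIMETYPE_AUDIO_AAC, 48000, 6)
format.setInteger(MediaFormat.KEY_MAX_OUTPUT_CHANNEL_COUNT, 99)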
See how ExoPlayer does this on GitHub.

There are two ways to prevent spatialization, depending on your use case. If your audio is already spatialized, call setIsContentSpatialized(true) when configuring the AudioAttributes for your audio stream to prevent the audio from being double-processed. In all other cases, you can instead call setSpatializationBehavior(AudioAttributes.SPATIALIZATION_BEHAVIOR_NEVER) to disable spatialization altogether.
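As a sketch of both options, assuming a media playback stream (USAGE_MEDIA is just an example):

// Content that is already spatialized (for example, pre-rendered binaural audio):
// tell the platform so it is not processed a second time.
val preSpatializedAttributes = AudioAttributes.Builder()
    .setUsage(AudioAttributes.USAGE_MEDIA)
    .setIsContentSpatialized(true)
    .build()

// Content that should never be spatialized: opt out entirely.
val neverSpatializedAttributes = AudioAttributes.Builder()
    .setUsage(AudioAttributes.USAGE_MEDIA)
    .setSpatializationBehavior(AudioAttributes.SPATIALIZATION_BEHAVIOR_NEVER)
    .build()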
As mentioned previously, using spatial audio requires a supported device (that is, getImmersiveAudioLevel() does not return SPATIALIZER_IMMERSIVE_LEVEL_NONE) and a connected headset. To test spatial audio, start by making sure the feature is enabled in the device’s sound settings.
Note that for spatial audio with head tracking, the headset must have head tracking sensors that are compatible with the device, such as Pixel Buds Pro with a Pixel phone, and head tracking must also be enabled in settings.
Hearing is believing, so we highly recommend trying out spatial audio for yourself! You can see an example implementation in our sample app, Universal Android Music Player. And for more details on everything discussed here, check out our spatial audio developer guide.