22 September 2025
Welcome to the second installment of our three-part series on media preloading with Media3. This series is designed to guide you through the process of building highly responsive, low-latency media experiences in your Android apps.
If you are new to preloading in Media3, we highly recommend reading Part 1 before proceeding. For those ready to move beyond the basics, let's explore how to elevate your media playback implementation.
As an app developer, when you launch a feature in production you also want to understand and capture the analytics behind it. How can you be certain that your preloading strategy is effective in a real-world environment? Answering this requires data on success rates, failures, and performance. The PreloadManagerListener interface is the primary mechanism for gathering this data.
The PreloadManagerListener provides two essential callbacks that offer critical insights into the preloading process and status.
You can register a listener with a single method call as shown in the following example code:
val preloadManagerListener = object : PreloadManagerListener {
  override fun onCompleted(mediaItem: MediaItem) {
    // Log success for analytics.
    Log.d("PreloadAnalytics", "Preload completed for $mediaItem")
  }

  override fun onError(preloadError: PreloadException) {
    // Log the specific error for debugging and monitoring.
    Log.e("PreloadAnalytics", "Preload error", preloadError)
  }
}

preloadManager.addListener(preloadManagerListener)
These listener callbacks can be hooked into your analytics pipeline. By forwarding these events to your analytics engine, you can answer key questions such as: How often do preloads complete successfully? Which errors occur most frequently? How does preloading affect playback performance?
This data could give you quantitative feedback on your preloading strategy, enabling A/B testing and data-driven improvements to your user experience. It can also help you intelligently fine-tune your preload durations, the number of videos you preload, and the buffers you allocate.
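As a minimal sketch, here is one way to forward these callbacks into an analytics pipeline. Analytics is a hypothetical facade for whatever analytics SDK you use, and the event names are illustrative:

// Hypothetical analytics facade; replace it with your analytics SDK of choice.
interface Analytics {
  fun logEvent(name: String, params: Map<String, String>)
}

class PreloadAnalyticsListener(private val analytics: Analytics) : PreloadManagerListener {
  private var completedCount = 0
  private var errorCount = 0

  override fun onCompleted(mediaItem: MediaItem) {
    completedCount++
    analytics.logEvent("preload_completed", mapOf("mediaId" to mediaItem.mediaId))
  }

  override fun onError(preloadError: PreloadException) {
    errorCount++
    analytics.logEvent("preload_failed", mapOf("message" to (preloadError.message ?: "unknown")))
  }

  // Share of preload attempts observed so far that completed successfully.
  fun successRate(): Double {
    val total = completedCount + errorCount
    return if (total == 0) 0.0 else completedCount.toDouble() / total
  }
}

Register it with preloadManager.addListener(...) just like the simple listener above.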
A failed preload is a strong indicator of an upcoming buffering event for the user. The onError callback allows you to respond reactively. Instead of merely logging the error, you can adapt the UI. For instance, if the upcoming video fails to preload, your application could disable autoplay for the next swipe, requiring a user tap to begin playback.
Additionally, by inspecting the PreloadException you can define a more intelligent retry strategy. An app can choose to immediately remove a failing source from the manager based on the error message or HTTP status code; the item should then also be removed from the UI feed so that loading issues don't leak into the user experience. You can also dig into more granular causes of a PreloadException, such as an HttpDataSourceException, to probe further into the errors. Read more about ExoPlayer troubleshooting.
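The following sketch illustrates that idea, assuming your app can map the error back to the failing MediaItem; removeFromFeed and disableAutoplayForNext are hypothetical app-side functions:

// Called from your PreloadManagerListener.onError handler with the item that failed;
// mapping the error back to a MediaItem depends on your app's own bookkeeping.
fun handlePreloadError(mediaItem: MediaItem, preloadError: PreloadException) {
  val cause = preloadError.cause
  if (cause is HttpDataSource.InvalidResponseCodeException && cause.responseCode in 400..499) {
    // A client error (e.g. 404) is unlikely to succeed on retry: drop the item from
    // the preload manager and from the UI feed.
    preloadManager.remove(mediaItem)
    removeFromFeed(mediaItem) // hypothetical app-side function
  } else {
    // Likely transient (timeouts, 5xx): keep the item but require a tap to start playback.
    disableAutoplayForNext(mediaItem) // hypothetical app-side function
  }
}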
The DefaultPreloadManager and ExoPlayer are designed to work together. To ensure stability and efficiency, they must share several core components. If they operate with separate, uncoordinated components, thread safety can be compromised and preloaded tracks may not be usable by the player, since preloaded tracks need to be played on the player they were prepared for. Separate components can also compete for limited resources like network bandwidth and memory, which can lead to performance degradation. An important part of the lifecycle is proper disposal: the recommended order is to release the PreloadManager first, followed by the ExoPlayer.
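For example, your teardown path could look like this:

// Release in the recommended order: the PreloadManager first, then the player.
preloadManager.release()
player.release()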
The DefaultPreloadManager.Builder is designed to facilitate this sharing and has APIs to instantiate both your PreloadManager and a linked player instance. Let's see why components like the BandwidthMeter, LoadControl, TrackSelector, and Looper must be shared. Check the visual representation of how these components interact with ExoPlayer playback.
The BandwidthMeter provides an estimate of available network bandwidth based on historical transfer rates. If the PreloadManager and the player use separate instances, they are unaware of each other's network activity, which can lead to failures. For example, consider a user watching a video whose network connection degrades while the preloading MediaSource simultaneously initiates an aggressive download for a future video. The preloading MediaSource's activity would consume bandwidth needed by the active player, causing the current video to stall. A stall during playback is a significant user experience failure.
By sharing a single BandwidthMeter, the TrackSelector can select tracks of the highest quality that the current network conditions and buffer state allow, during preloading as well as playback. It can then make intelligent decisions to protect the active playback session and ensure a smooth experience.
preloadManagerBuilder.setBandwidthMeter(customBandwidthMeter)
Note: Sharing the LoadControl in the latest version of Media3 (1.8) ensures that its Allocator is shared correctly between the PreloadManager and the player. Using the LoadControl to effectively control preloading is a feature that will be available in the upcoming Media3 1.9 release.
preloadManagerBuilder.setLoadControl(customLoadControl)
preloadManagerBuilder.setTrackSelectorFactory(customTrackSelectorFactory)
preloadManagerBuilder.setRenderersFactory(customRenderersFactory)
Read more about ExoPlayer components.
The thread on which an ExoPlayer instance must be accessed can be specified explicitly by passing a Looper when creating the player, and that Looper can later be queried using Player.getApplicationLooper. By sharing a Looper between the player and the PreloadManager, all operations on these shared media objects are guaranteed to be serialized onto a single thread's message queue, which reduces the risk of concurrency bugs.
All interactions between the PreloadManager, the player, and the media sources to be loaded or preloaded need to happen on the same playback thread. Sharing this Looper is therefore a must for thread safety, which is why the playback Looper must be shared between the PreloadManager and the player.
The PreloadManager prepares a stateful MediaSource object in the background. When your UI code calls player.setMediaSource(mediaSource), you are performing a handoff of this complex, stateful object from the PreloadManager to the player: the entire PreloadMediaSource is moved from the manager to the player. All of these interactions and handoffs must occur on the same playback Looper.
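As a sketch, that handoff typically looks like the following when the user navigates to an item, falling back to a plain MediaItem if nothing was preloaded:

// Retrieve the preloaded source, if any, and hand it over to the player.
val preloadedSource = preloadManager.getMediaSource(mediaItem)
if (preloadedSource != null) {
  player.setMediaSource(preloadedSource)
} else {
  player.setMediaItem(mediaItem) // The item was not (or is no longer) preloaded.
}
player.prepare()
player.play()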
If the PreloadManager and ExoPlayer were operating on different threads, a race condition could occur: the PreloadManager's thread could be modifying the MediaSource's internal state (e.g., writing new data into a buffer) at the exact moment the player's thread is attempting to read from it. This leads to unpredictable behavior, such as an IllegalStateException that is difficult to debug.
preloadManagerBuilder.setPreloadLooper(playbackLooper)
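Where does playbackLooper come from? One option, sketched below, is to create a dedicated playback thread yourself and hand its Looper to both sides; if you build the player via the DefaultPreloadManager.Builder instead, the builder takes care of the sharing, as described in the tip below.

// Create a dedicated playback thread and share its Looper; the thread name and
// ownership are up to your app.
val playbackThread = HandlerThread("playback-thread").apply { start() }
val playbackLooper = playbackThread.looper

preloadManagerBuilder.setPreloadLooper(playbackLooper)

// If you construct the player yourself rather than via buildExoPlayer(), give it the same Looper:
val player = ExoPlayer.Builder(context)
  .setPlaybackLooper(playbackLooper)
  .build()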
Let's see how you can share all of the above components between ExoPlayer and DefaultPreloadManager in the setup itself.
val preloadManagerBuilder = DefaultPreloadManager.Builder(context, targetPreloadStatusControl)

// Optional - Share components between ExoPlayer and DefaultPreloadManager
preloadManagerBuilder
  .setBandwidthMeter(customBandwidthMeter)
  .setLoadControl(customLoadControl)
  .setMediaSourceFactory(customMediaSourceFactory)
  .setTrackSelectorFactory(customTrackSelectorFactory)
  .setRenderersFactory(customRenderersFactory)
  .setPreloadLooper(playbackLooper)

val preloadManager = preloadManagerBuilder.build()
Tip: If you use the default components in ExoPlayer, like DefaultLoadControl, you don't need to explicitly share them with the DefaultPreloadManager. When you build your ExoPlayer instance via buildExoPlayer of the DefaultPreloadManager.Builder, these components are automatically shared, as long as you use the default implementations with default configurations. But if you use custom components or custom configurations, you should explicitly tell the DefaultPreloadManager about them via the APIs above.
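For example, building on the snippet above:

// Build the player from the same builder so it picks up the shared components.
val player = preloadManagerBuilder.buildExoPlayer()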
In a dynamic feed, a user can scroll through a virtually infinite amount of content. If you continuously add videos to the DefaultPreloadManager without a corresponding removal strategy, you will inevitably cause an OutOfMemoryError. Each preloaded MediaSource holds onto a SampleQueue, which allocates memory buffers. As these accumulate, they can exhaust the application's heap space. The solution is an algorithm you may already be familiar with, called the sliding window. The sliding window pattern maintains a small, manageable set of items in memory that are logically adjacent to the user's current position in the feed. As the user scrolls, this "window" of managed items slides with them, adding new items that come into view, and also removing items that are now distant.
It is essential to understand that PreloadManager does not provide a built-in setWindowSize() method. The sliding window is a design pattern that you, the developer, are responsible for implementing using the primitive add() and remove() methods. Your application logic must connect UI events, such as a scroll or page change, to these API calls. If you want a code reference, the sliding window pattern is implemented in the SociaLite sample, which includes a PreloadManagerWrapper that maintains such a window.
Don't forget to call preloadManager.remove(mediaItem) in your implementation once an item is no longer likely to be viewed soon. Failing to remove items that are no longer close to the user's position is the primary cause of memory issues in preloading implementations. The remove() call ensures resources are released, helping you keep your app's memory usage bounded and stable.
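Here is a minimal sketch of the pattern; the window size, class name, and feed representation are illustrative, and a production implementation (like the PreloadManagerWrapper in the SociaLite sample) handles more edge cases:

// A minimal sliding window around the user's current position in the feed.
class PreloadWindow(
  private val preloadManager: DefaultPreloadManager,
  private val windowSize: Int = 2, // neighbours managed on each side of the current item
) {
  private val managedItems = mutableMapOf<Int, MediaItem>()

  fun onCurrentIndexChanged(currentIndex: Int, feed: List<MediaItem>) {
    val start = maxOf(0, currentIndex - windowSize)
    val end = minOf(feed.lastIndex, currentIndex + windowSize)

    // Remove items that slid out of the window so their buffers are released.
    managedItems.keys.filter { it < start || it > end }.forEach { index ->
      managedItems.remove(index)?.let { preloadManager.remove(it) }
    }

    // Add newly adjacent items, using the feed index as the ranking data.
    for (index in start..end) {
      if (index !in managedItems) {
        preloadManager.add(feed[index], index)
        managedItems[index] = feed[index]
      }
    }

    // Tell the manager where the user is and re-evaluate preload priorities.
    preloadManager.setCurrentPlayingIndex(currentIndex)
    preloadManager.invalidate()
  }
}

Call onCurrentIndexChanged from your pager or scroll callback whenever the visible item changes.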
Now that we have defined what to preload (the items in our window), we can apply a well-defined strategy for how much to preload for each item. We already saw how to achieve this granularity with the TargetPreloadStatusControl setup in Part 1.
To recall, an item at position +/- 1 has a higher probability of being played than an item at position +/- 4, so you can allocate more resources (network, CPU, memory) to the items the user is most likely to view next. This creates a proximity-based preloading strategy, which is the key to balancing immediate playback with efficient resource usage.
You could use analytics data via PreloadManagerListener as discussed in the earlier sections to decide your preload duration strategy.
You are now equipped with the advanced knowledge to build fast, stable, and resource-efficient media feeds using Media3's DefaultPreloadManager.
Let's recap the key takeaways: use the PreloadManagerListener to measure and react to preloading in production, share the core components (and the playback Looper) between the DefaultPreloadManager and ExoPlayer, manage memory with a sliding window of add() and remove() calls, and tune how much to preload based on an item's distance from the current position.
Preloading data into memory provides an immediate performance benefit, but it comes with tradeoffs. Once the application is closed or the preloaded media is removed from the manager, the data is gone. To achieve a more persistent level of optimization, we can combine preloading with disk caching. This feature is in active development and is coming in the next few months.
Do you have any feedback to share? We are eager to hear from you.
Stay tuned, and go make your video playback faster! 🚀