Posted by Greg Hartrell, Product Director, Games on Play/Android
With over three billion players showing strong engagement worldwide, the games market remains resilient and continues to grow beyond expectations. As we look ahead this year, the influx of new and returning players creates a great opportunity for developers to scale their games businesses.
The Google for Games Developer Summit returns virtually on March 15, 2022 at 9AM Pacific. From mobile to cloud, learn about our new solutions for game developers that make it easier to build high-quality games and reach audiences around the world.
Join us for the keynote at 9AM Pacific followed by over 20 developer sessions on-demand. We’ll share deep-dives and updates on the Android Game Development Kit, Google Play Games beta on PC, Play Asset Delivery, Play Console, and more. The summit is open for all. Check out the full agenda today at g.co/gamedevsummit.
Posted by Márton Braun, Developer Relations Engineer
Synthetic properties to access views were created as a way to eliminate the common boilerplate of findViewById calls. These synthetics are provided by JetBrains in the Kotlin Android Extensions Gradle plugin (not to be confused with Android KTX).
In November 2020, we announced that this plugin has been deprecated in favor of better solutions, and we recommended removing the plugin from your projects. We know many developers still depend on this plugin’s features, and we’ve extended the support timeframe so you have more time to complete your migrations.
We are now setting a deadline for these migrations: the plugin will be removed in Kotlin 1.8, which is expected to be released by the end of 2022. At that time, you won’t be able to update your project to newer Kotlin versions if it still depends on the Kotlin Android Extensions plugin. This means that now is the time to perform the necessary migrations in your projects.
Instead of synthetics, we recommend using View Binding, which generates type-safe binding classes from XML layout files. These bindings provide convenient access to view references and they work safely for layouts with multiple configurations. See the migration guide for detailed instructions on how to adopt View Binding. If you encounter any issues, you can report a bug on the Issue Tracker.
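For context, here is a minimal View Binding sketch; the layout name (activity_main.xml), view id (title), and package are illustrative assumptions, and view binding must be enabled in the module's build configuration.

```kotlin
// Minimal View Binding sketch (illustrative names; assumes viewBinding is
// enabled in the module's build.gradle and res/layout/activity_main.xml
// contains a TextView with the id "title").
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import com.example.app.databinding.ActivityMainBinding // generated from the layout

class MainActivity : AppCompatActivity() {

    private lateinit var binding: ActivityMainBinding

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Inflate the generated binding instead of calling findViewById.
        binding = ActivityMainBinding.inflate(layoutInflater)
        setContentView(binding.root)

        // Type-safe access to the view declared in the layout.
        binding.title.text = "Hello View Binding"
    }
}
```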
When building new features, consider using Jetpack Compose, Android's modern UI toolkit. Layouts built with Compose are declarative Kotlin code, eliminating the need to work with view references.
Another feature included in the plugin is Parcelize, which helps you create parcelable classes. Parcelize is now available in the standalone kotlin-parcelize plugin with unchanged functionality. To get up and running with the new plugin, check out the Parcelize documentation page.
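As a quick reference, this is roughly what a parcelable class looks like with the standalone plugin applied; the User class is an invented example.

```kotlin
// Minimal Parcelize sketch. Assumes the "kotlin-parcelize" Gradle plugin is
// applied to the module; @Parcelize then generates the Parcelable
// implementation at compile time.
import android.os.Parcelable
import kotlinx.parcelize.Parcelize

@Parcelize
data class User(
    val id: Long,
    val name: String
) : Parcelable
```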
If you’re still using the Kotlin Android Extensions Gradle plugin, kick off your migration in time so that you can keep upgrading to new Kotlin releases in the future. This will enable you to use the latest language features and take advantage of tooling and compiler improvements.
Posted by Jose Alcérreca, Android Developer Relations Engineer
As apps increase in functionality and complexity, manually testing them to verify behavior becomes tedious, expensive, or impossible. Modern apps, even simple ones, require you to verify an ever-growing list of test points such as UI flows, localization, or database migrations. Having a QA team whose job is to manually verify that the app works is an option, but fixing bugs at that stage is expensive. The sooner you fix a problem in the development process the better.
Automating tests is the best approach to catching bugs early. Automated testing (from now on, testing) is a broad domain and Android offers many tools and libraries that can overlap. For this reason, beginners often find testing challenging.
In response to this feedback, and to accommodate for Compose and new architecture guidelines, we revamped two testing sections on d.android.com:
Firstly, there is the new Testing training, which includes the fundamentals of testing in Android with two new articles: What to test, an opinionated guide for beginners, and a detailed guide on Test doubles.
Faking dependencies in unit tests
After providing an overview of the theory, the guide focuses on practical examples of the two main types of tests.
Faking dependencies in UI tests
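To make the idea of a test double concrete, here is a small, hypothetical sketch of a hand-rolled fake standing in for a production dependency in a local unit test; all names are invented for the example.

```kotlin
// Hypothetical fake used as a test double in a local JUnit test.
import org.junit.Assert.assertEquals
import org.junit.Test

interface UserRepository {
    fun userName(): String
}

// The fake replaces the real repository, keeping the test fast and deterministic.
class FakeUserRepository(private val name: String) : UserRepository {
    override fun userName() = name
}

class GreetingViewModel(private val repository: UserRepository) {
    fun greeting() = "Hello, ${repository.userName()}!"
}

class GreetingViewModelTest {
    @Test
    fun greeting_usesRepositoryName() {
        val viewModel = GreetingViewModel(FakeUserRepository("Ada"))
        assertEquals("Hello, Ada!", viewModel.greeting())
    }
}
```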
Secondly, we updated the Testing section of the Tools documentation that focuses on all the tools that help you create and run tests, from Android Studio to testing from the command line.
The Unified Gradle test runner.
We included an article that describes Advanced test setup features such as working with different variants, the instrumentation manifest options, or the Android Gradle Plugin settings.
These two new sections should give you a general notion of how and where to test your Android app. To learn more about testing specific features and libraries, you should check out their respective documentation pages. For example: Testing Kotlin flows, Test Navigation, or the Hilt testing guide.
Sadly, machines can't automatically verify the correctness of our documentation, so if you find errors or have suggestions, please file a bug on our documentation issue tracker.
Posted by Dave Burke, VP of Engineering
Every day, billions of people around the world pull out their Android device to help them get things done. That Android works well for each and every one of them is ensured in part through a collaborative process with you, our developer community, sharing feedback to help us make Android stronger.
Today, we’re sharing a first look at the next release of Android, with the Android 13 Developer Preview 1. With Android 13 we’re continuing some important themes: privacy and security, as well as developer productivity. We’ll also build on some of the newer updates we made in 12L to help you take advantage of the 250+ million large screen devices currently running Android.
This is just the start for Android 13, and we’ll have lots more to share as we move through the release. Read on for a taste of what’s new, and visit the Android 13 developer site for details on downloads for Pixel and the release timeline. As always, it’s crucial to get your feedback early, to help us include it in the final release. We’re looking forward to hearing what you think, and thanks in advance for your continued help in making Android a platform that works for everyone!
People want an OS and apps that they can trust with their most personal and sensitive information. Privacy is core to Android’s product principles, and Android 13 focuses on building a responsible and high quality platform for all by providing a safer environment on the device and more controls to the user. In today’s release, we’re introducing a photo picker that allows users to share photos and videos securely with apps, and a new Wi-Fi permission to further minimize the need for apps to have the location permission. We recommend trying out the new APIs and testing how the changes may affect your app.
Photo picker and APIs - To help protect photo and video privacy of users, Android 13 adds a system photo picker — a standard and optimized way for users to share both local and cloud-based photos securely. Android’s long standing document picker allows a user to share specific documents of any type with an app, without that app needing permission to view all media files on the device. The photo picker extends this capability with a dedicated experience for picking photos and videos. Apps can use the photo picker APIs to access the shared photos and videos without needing permission to view all media files on the device. We plan to bring the photo picker experience to more Android users through Google Play system updates, as part of a MediaProvider module update for devices (excepting Go devices) running Android 11 and higher. Give photo picker APIs a try and let us know your feedback!
Photo picker provides a consistent, secure way for users to give apps access to specific photos and videos.
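As a rough sketch, an app might launch the system photo picker with the MediaStore.ACTION_PICK_IMAGES intent action like this; the activity name is illustrative, and the Jetpack ActivityResultContracts.PickVisualMedia contract is a backward-compatible alternative.

```kotlin
// Illustrative sketch of launching the Android 13 photo picker (API 33+).
import android.content.Intent
import android.net.Uri
import android.os.Bundle
import android.provider.MediaStore
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity

class PickPhotoActivity : AppCompatActivity() {

    // Receives the Uri of the photo or video the user selected.
    private val pickMedia =
        registerForActivityResult(ActivityResultContracts.StartActivityForResult()) { result ->
            val uri: Uri? = result.data?.data
            // Use the Uri (e.g. display it); no storage permission is needed.
        }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Ask the system photo picker for a single item.
        pickMedia.launch(Intent(MediaStore.ACTION_PICK_IMAGES))
    }
}
```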
Nearby device permission for Wi-Fi - Android 13 introduces the NEARBY_WIFI_DEVICES runtime permission (part of the NEARBY_DEVICES permission group) for apps that manage a device's connections to nearby access points over Wi-Fi. The new permission will be required for apps that call many commonly-used Wi-Fi APIs, and enables apps to discover and connect to nearby devices over Wi-Fi without needing location permission. Previously, the location permission requirements were a challenge for apps that needed to connect to nearby Wi-Fi devices but didn’t actually need the device location. Apps targeting Android 13 will now be able to request the NEARBY_WIFI_DEVICES permission with the “neverForLocation” flag instead, which should help promote a privacy-friendly app design while reducing friction for developers. Learn more.
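A hedged sketch of how an app targeting Android 13 might check for and request the new permission at runtime follows; the class and helper names are illustrative, and the manifest would also declare the permission (typically with the neverForLocation flag).

```kotlin
// Illustrative runtime request for NEARBY_WIFI_DEVICES on Android 13 (API 33).
import android.Manifest
import android.content.pm.PackageManager
import android.os.Build
import android.os.Bundle
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity
import androidx.core.content.ContextCompat

class WifiDiscoveryActivity : AppCompatActivity() {

    private val requestNearbyWifi =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            if (granted) startNearbyWifiDiscovery()
        }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        val alreadyGranted = ContextCompat.checkSelfPermission(
            this, Manifest.permission.NEARBY_WIFI_DEVICES
        ) == PackageManager.PERMISSION_GRANTED

        if (Build.VERSION.SDK_INT >= 33 && !alreadyGranted) {
            requestNearbyWifi.launch(Manifest.permission.NEARBY_WIFI_DEVICES)
        } else {
            startNearbyWifiDiscovery()
        }
    }

    private fun startNearbyWifiDiscovery() {
        // Call the Wi-Fi APIs that now require NEARBY_WIFI_DEVICES here.
    }
}
```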
Android 13 also brings new features and tools for developer productivity. Helping you create beautiful apps that run on billions of devices is one of our core missions – whether it’s in Android 13 or through our tools for modern Android development, like a language you love in Kotlin or opinionated APIs with Jetpack. By helping you work more productively, we aim to lower your cost of development so you can focus on continuing to build amazing experiences. Here’s some of what’s new in today’s release.
Quick Settings Placement API - Quick Settings in the notification shade is a convenient way for users to change settings or take quick actions without leaving the context of an app. For apps that provide custom tiles, we’re making it easier for users to discover and add your tiles to Quick Settings. Using a new tile placement API, your app can now prompt the user to directly add your custom tile to the set of active Quick Settings tiles. A new system dialog lets the user add the tile in one step, without leaving your app, rather than having to go to Quick Settings to add the tile.
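A rough sketch of prompting the user through the placement API, StatusBarManager.requestAddTileService, might look like the following; the tile service, label, and icon are placeholders, and the service would still need to be declared in the manifest.

```kotlin
// Illustrative use of the Android 13 tile placement API (API 33+).
import android.app.StatusBarManager
import android.content.ComponentName
import android.content.Context
import android.graphics.drawable.Icon
import android.service.quicksettings.TileService

// Placeholder tile service; a real one would override TileService callbacks
// and be declared in the manifest.
class MyTileService : TileService()

fun promptToAddTile(context: Context) {
    val statusBarManager = context.getSystemService(StatusBarManager::class.java)
    statusBarManager.requestAddTileService(
        ComponentName(context, MyTileService::class.java),
        "My custom tile",                                   // Label shown in the system dialog.
        Icon.createWithResource(context, android.R.drawable.ic_menu_compass),
        context.mainExecutor                                // Executor for the result callback.
    ) { result ->
        // result is one of StatusBarManager's TILE_ADD_REQUEST_RESULT_* codes.
    }
}
```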
Themed app icons - In Android 13 we’re extending Material You dynamic color beyond Google apps to all app icons, letting users opt into icons that inherit the tint of their wallpaper and other theme preferences. All your app needs to supply is a monochromatic app icon (for example, your notification drawable) and a tweak to the adaptive icon XML. We’re encouraging all developers to provide compatible icons to help provide a consistent experience for users who have opted in. Themed app icons are initially supported on Pixel devices and we’re working with our device manufacturer partners to bring them to more devices. Learn more.
Per-app language preferences - Some apps let users choose a language that differs from the system language, to meet the needs of multilingual users. Such apps can now call a new platform API to set or get the user’s preferred language, helping to reduce boilerplate code and improve compatibility when setting the app’s runtime language. For broader compatibility, we'll be adding a similar API in an upcoming Jetpack library. Learn more.
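A brief sketch of the platform call via LocaleManager on Android 13 could look like this; the helper functions are illustrative, and the AndroidX AppCompat equivalent (AppCompatDelegate.setApplicationLocales) covers older releases.

```kotlin
// Illustrative per-app language calls using LocaleManager (API 33+).
import android.app.LocaleManager
import android.content.Context
import android.os.LocaleList

fun setPreferredAppLanguage(context: Context, languageTag: String) {
    // Persist the user's in-app language choice, e.g. "pl-PL" or "hi-IN".
    val localeManager = context.getSystemService(LocaleManager::class.java)
    localeManager.applicationLocales = LocaleList.forLanguageTags(languageTag)
}

fun currentAppLanguages(context: Context): LocaleList =
    context.getSystemService(LocaleManager::class.java).applicationLocales
```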
Faster hyphenation - Hyphenation makes wrapped text easier to read and helps make your UI more adaptive. In Android 13 we’ve optimized hyphenation performance by as much as 200% so you can now enable it in your TextViews with almost no impact on rendering performance. To enable faster hyphenation, use the new fullFast or normalFast frequencies in setHyphenationFrequency(). Give faster hyphenation a try and let us know what you think!
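A small sketch of opting a TextView into the faster mode, with a fallback on older releases, might look like this; the helper function is illustrative.

```kotlin
// Illustrative use of the faster hyphenation frequencies added in Android 13.
import android.os.Build
import android.text.Layout
import android.widget.TextView

fun enableFastHyphenation(textView: TextView) {
    textView.hyphenationFrequency = if (Build.VERSION.SDK_INT >= 33) {
        // The "fast" variants trade some typographic refinement for speed.
        Layout.HYPHENATION_FREQUENCY_FULL_FAST
    } else {
        Layout.HYPHENATION_FREQUENCY_FULL
    }
}
```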
AGSL animated shader, adapted from this GLSL Shader
OpenJDK 11 updates - In Android 13 we’ve started the work of refreshing Android's Core Libraries to align with the OpenJDK 11 LTS release, with both library updates and Java 11 programming language support for app and platform developers. We also plan to bring these Core Library changes to more devices through Google Play system updates, as part of an ART module update for devices running Android 12 and higher. Learn more.
With each platform release we’re working to make updates faster and smoother by prioritizing app compatibility as we roll out new platform versions. In Android 13 we’ve made most app-facing changes opt-in to give you more time, and we’ve updated our tools and processes to help you get ready sooner.
More of Android updated through Google Play - In Android 13 we’re continuing to expand our investment in Google Play system updates (Project Mainline) to give apps a more consistent, secure environment across devices, and to deliver new features and capabilities to users. We can now push new features like photo picker and OpenJDK 11 directly to users on older versions of Android through updates to existing modules. We’ve also added new modules, such as the Bluetooth and Ultra wideband modules, to further expand the scope of Android’s updatable core functionality.
Optimizing for tablets, foldables, and Chromebooks - With all the momentum in large screen devices like tablets, foldables, and Chromebooks, now is the time to get your apps ready for these devices and design fully adaptive apps that fit any screen. You can get started using our guidance on optimizing for tablets, then learn how to build for large screens and develop for foldables.
Easier testing and debugging of changes - To make it easier for you to test the opt-in changes that can affect your app, we’ll make many of them toggleable again this year. With the toggles you can force-enable or disable the changes individually from Developer options or adb. Check out the details here.
App compatibility toggles in Developer Options.
Platform stability milestone - Like last year, we’re letting you know our Platform Stability milestone well in advance, to give you more time to plan for app compatibility work. At this milestone we’ll deliver not only final SDK/NDK APIs, but also final internal APIs and app-facing system behaviors. This year we’re expecting to reach Platform Stability in June 2022, and from that time you’ll have several weeks before the official release to do your final testing. The release timeline details are here.
The Developer Preview has everything you need to try the Android 13 features, test your apps, and give us feedback. For testing your app with tablets and foldables, the easiest way to get started is using the Android Emulator in a tablet or foldable configuration - complete setup instructions are here. For phones, you can get started on a device today by flashing a system image to a Pixel 6 Pro, Pixel 6, Pixel 5a 5G, Pixel 5, Pixel 4a (5G), Pixel 4a, Pixel 4 XL, or Pixel 4 device. If you don’t have a Pixel device, you can use the 64-bit system images with the Android Emulator in Android Studio. For even broader testing, GSI images are available.
When you’re set up, here are some of the things you should do:
We’ll update the preview system images and SDK regularly throughout the Android 13 release cycle. This initial preview release is for developers only and not intended for daily or consumer use, so we're making it available by manual download only. Once you’ve manually installed a preview build, you’ll automatically get future updates over-the-air for all later previews and Betas. More here.
As we reach our Beta releases, we'll be inviting consumers to try Android 13 as well, and we'll open up enrollments for the Android Beta program at that time. For now, please note that Android Beta is not yet available for Android 13.
For complete information, visit the Android 13 developer site.
Java and OpenJDK are trademarks or registered trademarks of Oracle and/or its affiliates.
Posted by Rohan Shah, Product Manager on Android
We’re excited to announce that Material You, specifically dynamic color, will soon be available on more Android 12 phones globally, including devices by Samsung, OnePlus, Oppo, Vivo, realme, Xiaomi, Tecno, and more!
With the release of Android 12 and the introduction of Material You, we made the Android experience more fluid and personal than ever for our users. The gorgeous new design brought to life experiences such as a more dynamic touch ripple, a silky-smooth scroll, and a spacious layout. But the star of the show was, and continues to be, dynamic color – pick your favorite wallpaper and the entire phone experience transforms to better express you, from your home screen to some of your favorite apps.
With Material You, personalization is now a defining trait of Android that our ecosystem will continue building on for years to come. We want to make sure that you, our developers, have the confidence to join us on the journey and bring a more personal look and feel to users through your apps.
A Gmail rainbow with different wallpaper-based themes, shown on some of the Android device experiences that will support Material You
As more Android 12 devices land in the next couple of months, our OEM partners are working with us to ensure that key design APIs, especially around dynamic color, work consistently across the Android ecosystem, so developers can have peace of mind and users can benefit from a cohesive experience.
To better help you understand how to implement dynamic color and fit it into your overall brand story, the Material team has published the comprehensive Customizing Material article with codelabs and guides to get started with Views or Jetpack Compose. Watch for ongoing updates to Material Theme Builder and Material Color Utilities in the coming months to provide you with the tools you need for design and implementation.
Visualize dynamic color in your app with the Material Theme Builder
Google apps (Gmail, Photos, Chrome, and many more) have used the very same tools and guidance to bring the color story to life on their branded experiences, and we’re excited for you to hop on board as well. As you learn more about how color can harmonize with user choice and work with dynamic color in your app, we’d love to get your feedback via the Material Android issue tracker. Happy coloring!
Posted by Florina Muntenescu, Android Developer Relations Engineer
Today, we’re releasing version 1.1 of Jetpack Compose, Android's modern, native UI toolkit, continuing to build out our roadmap. This release contains new features like improved focus handling, touch target sizing, ImageVector caching, and support for Android 12 stretch overscroll. Compose 1.1 also graduates a number of previously experimental APIs to stable and supports newer versions of Kotlin. We've already updated our samples, codelabs, and Accompanist library to work with Compose 1.1.
Compose 1.1 introduces image vector caching, bringing big performance improvements. We’ve added a caching mechanism to the painterResource API that caches all instances of ImageVectors parsed with a given resource id and theme. The cache will be invalidated on configuration changes.
Compared to Compose 1.0, Material components now expand their layout space to meet the Material accessibility guidelines for touch target size. For instance, a RadioButton's touch target expands to a minimum size of 48x48dp, even if you set the RadioButton's size to be smaller. This aligns Compose Material with the behavior of Material Design Components, providing consistent behavior if you mix Views and Compose. It also ensures that when you create your UI using Compose Material components, minimum requirements for touch target accessibility are met.
If you find this change breaks existing layout logic, set LocalMinimumTouchTargetEnforcement to false to disable this behavior, but be mindful that doing so might reduce the usability of your app, so use it with caution.
RadioButton touch target update. Left: Compose 1.0; right: Compose 1.1.
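If you do need to opt a specific subtree out of the enforcement, a hedged sketch might look like this; the composable name is invented, and the API required an experimental opt-in in Compose Material 1.1.

```kotlin
// Illustrative opt-out from minimum touch target enforcement for one subtree.
import androidx.compose.material.ExperimentalMaterialApi
import androidx.compose.material.LocalMinimumTouchTargetEnforcement
import androidx.compose.material.RadioButton
import androidx.compose.runtime.Composable
import androidx.compose.runtime.CompositionLocalProvider

@OptIn(ExperimentalMaterialApi::class)
@Composable
fun CompactRadioButton(selected: Boolean, onClick: () -> Unit) {
    CompositionLocalProvider(LocalMinimumTouchTargetEnforcement provides false) {
        // This RadioButton keeps its visual size instead of expanding to 48x48dp.
        RadioButton(selected = selected, onClick = onClick)
    }
}
```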
Several APIs graduated from experimental to stable. Highlights include EnterTransition and ExitTransition (used with AnimatedVisibility), rememberVectorPainter, VectorProperty, VectorConfig, and RenderVectorGroup.
We’re continuing to bring new features to Compose. Here are a few highlights: improvements to AnimatedContent and rememberSaveable, item placement animations for LazyColumn/LazyRow via Modifier.animateItemPlacement(), and the new BringIntoView API.
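As an illustrative sketch of one of these highlights, animating item placement in a lazy list might look roughly like the following; the composable is hypothetical, and the modifier required an experimental opt-in in Compose Foundation 1.1.

```kotlin
// Illustrative item placement animation in a LazyColumn.
import androidx.compose.foundation.ExperimentalFoundationApi
import androidx.compose.foundation.lazy.LazyColumn
import androidx.compose.foundation.lazy.items
import androidx.compose.material.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier

@OptIn(ExperimentalFoundationApi::class)
@Composable
fun ReorderableList(names: List<String>) {
    LazyColumn {
        // Stable keys let Compose track items as their positions change.
        items(names, key = { it }) { name ->
            // When an item's position changes, it animates to its new slot.
            Text(text = name, modifier = Modifier.animateItemPlacement())
        }
    }
}
```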
Try out the new APIs using @OptIn and give us feedback!
Note: Using Compose 1.1 requires using Kotlin 1.6.10. Check out the Compose to Kotlin Compatibility Map for more information.
Wondering what’s next? Check out our updated roadmap to see the features we’re currently thinking about and working on, such as lazy item animations, downloadable fonts, moveable content, and more!
Jetpack Compose is stable, ready for production, and continues to add the features you’ve been asking us for. We’ve been thrilled to see tens of thousands of apps start using Jetpack Compose in production already and we can’t wait to see what you’ll build!
We’re grateful for all of the bug reports and feature requests submitted to our issue tracker over the Alphas and Betas - they help us to improve Compose and build the APIs you need. Do continue providing your feedback and help us make Compose better!
Happy composing!
Posted by The Android Team
Google Chrome is the most widely used browser globally, and the Chrome team wants to ensure their users have a great experience across all devices. Many Chrome users have been requesting more productivity features on their mobile, tablet, and foldable devices to better match the capabilities of Chrome on desktop. To meet these needs, the team decided to invest in building features that encourage multitasking capabilities. While the team built this for phones as well, they wanted to especially focus on implementing these features where people would use them the most: large screen devices such as tablets and foldables.
The team first decided to focus on building a way for people to open multiple Chrome windows (instances) side by side. They took advantage of 12L features such as the taskbar, as well as the Samsung edge panel.
They utilized the singleInstancePerTask launch mode to build the side-by-side functionality. They wanted to balance allowing people to use many windows at once with making sure the feature was still usable. The team researched usability best practices, observed other multi-window experiences on large screen devices, and thought through limitations to ensure optimal device memory usage. They determined people could comfortably use up to five windows side by side on large screen devices, and the team updated their app to support this functionality.
The team wanted to make it easier for their users to take advantage of this feature, so they added a “New Window” shortcut in the menu. They used the intent flag combination LAUNCH_ADJACENT|NEW_TASK to create this shortcut. Displaying the feature more prominently in the product greatly improved its usage: multi-window usage improved by 18x.
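A hedged sketch of what such a shortcut might do under the hood, using the platform intent flags, is shown below; the extension function is illustrative, and it assumes the target activity uses a multi-instance launch mode such as singleInstancePerTask.

```kotlin
// Illustrative "New Window" action launching another instance side by side.
import android.app.Activity
import android.content.Intent

fun Activity.openNewWindow() {
    val intent = Intent(this, javaClass).apply {
        // LAUNCH_ADJACENT asks the system to place the new task next to the
        // current one (e.g. in split screen); NEW_TASK creates a separate task.
        addFlags(Intent.FLAG_ACTIVITY_LAUNCH_ADJACENT or Intent.FLAG_ACTIVITY_NEW_TASK)
    }
    startActivity(intent)
}
```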
This is a new feature, and the Chrome team has already seen that multi-instance in the Chrome app is used 42% more on tablets and foldables than on phones that support the feature. This demonstrates that the functionality resonated well with Chrome users on large screen devices, and that it was worth investing in these features to enhance the large screen experience.
They also had very positive feedback from their large screen users in the form of app reviews. “This app is fabulous 👌! You can split screen, change tabs, and much more. You can also play a lot of games in it. I prefer to five star this app.”
The team has future plans to further improve the Chrome experience on large screens to help their users be more productive.
Learn more about how you can get started with optimizing your app for large screens.
Posted by the Smule Engineering team: David Gayle, Chris Manchester, Mark Gills, Trayko Traykov, Randal Leistikow, Mariya Ivanova.
As the most downloaded singing app of all time, Smule Inc. has been investing in Android to improve overall audio quality and, more specifically, to reduce latency, i.e., letting singers hear their voices in the headset as they perform. The teams specializing in audio and video devoted a significant part of 2021 to making the changes needed to convert the Smule application, used by over ten million Android users, from the OpenSL audio API to the Oboe audio library, enabling a 10%+ increase in recording completion rate.
Smule Inc. is a leader in karaoke, with an app that helps millions of people sing their favorite songs and share performances daily. The Smule application goes beyond traditional karaoke by focusing on co-creation, offering users the unique opportunity to share music and collaborate with friends, other singers on the platform, and their favorite music artists. Audio quality is paramount, and, in 2020, the Smule team saw potential to enhance the experience on Android.
Smule’s legacy OpenSL implementation wasn’t well-suited to leverage the blazing-fast hardware of new devices while supporting the diverse devices across its world-wide market. Smule’s development team determined that upgrading the audio system was a necessary and logical improvement.
Smule was faced with two possible routes for improvement: directly targeting AAudio, a high-performance Android C audio API introduced in Android O and designed for applications that require low latency, or Oboe, which wraps both AAudio and OpenSL internally. After careful evaluation, Smule’s development team opted for Oboe for its easy-to-use code base, broad device compatibility, and robust community support; it achieved the lowest latency and made the best use of the available native audio.
The conversion to Oboe represented a significant architectural and technological evolution. As a result, Smule approached the rollout process conservatively, with a planned, gradual release that started with a small selection of device models to validate quality. Week after week, the team enabled more devices (reverting the limited number of devices exhibiting problems on Oboe back to OpenSL). This incremental, methodical approach helped minimize risk and allowed the engineering team to handle device-specific issues as they occurred.
Smule switched to Oboe to help improve the app experience. They hoped to dramatically reduce audio playback crashes, eliminate issues such as echo and crackling during recording, and reduce audio latency. A recent article on the Android Developers blog shows that the average latency of the twenty most popular devices decreased from 109 ms in 2017 to 39 ms today using Oboe. Whereas a monitoring delay of 109 ms is heard as a distinct echo that interferes with live singing, 39 ms is beneath the acceptable threshold for real-time applications. The latencies of top devices today are all within 22 ms of one another, and this consistency is a big plus.
The lift in recording completion rate Smule has seen using Oboe is likely due to this lower latency, allowing singers to hear their voices in the headset as they perform with Smule’s world-class audio effects applied, but without an echo.
Through a collaborative GitHub portal dedicated to Oboe, the Google team played a significant role in Smule’s Oboe integration, providing key insights and support. Working together, the two teams were able to launch the largest Oboe deployment to date, reaching millions of active users. The Smule team contributed to addressing some Oboe code issues, and the Google team coordinated with certain mobile device makers to further improve Oboe's compatibility.
Audio quality is of the utmost importance to our community of singers, and we're thankful for our shared commitment to delivering the best possible experience as well as empowering musical creation on Smule. - Eric Dumas, Smule CTO.
Given the massive scale of the operation, it was only natural to face device-specific issues. One notable example was a built-in OS feature that injected echo effects into the raw audio stream, which prevented Smule from correctly applying its own patented DSP algorithms and audio filters. Google’s team came to the rescue, providing lightning-fast updates and patches to the library. The process of reporting Oboe issues was straightforward, well defined, and handled in a timely manner by the Google team.
Together, Smule and Google overcame other device-specific roadblocks, including errors with specific chipsets. As an example, when Oboe asked for mono microphone input, a few devices provided stereo inputs mixed into one fake mono microphone input. Smule created a ticket in Oboe’s GitHub repository, providing examples and reproducing the issue with the Oboe tester app.
The Google-developed Oboe tester app was a helpful tool for identifying and solving issues throughout the implementation. It proved especially useful for testing many of the features of Oboe, AAudio, and OpenSL ES, as well as testing Android devices, measuring latency and glitches, and much more. The application offers a myriad of features that can help simulate almost any audio setup. The Oboe tester can also be used in automated testing by launching it from a shell script using an Android Intent. Smule relied heavily on this automated testing, given the large number of devices covered in the integration.
Once Smule was confident the device-specific issues were resolved and the Oboe audio was stable enough, Smule switched to a wider split-testing rollout approach. In just a few weeks, Smule increased the population using Oboe from 10% to 100% of the successful devices, which was only possible thanks to the positive feedback and green KPI metrics Oboe received continuously throughout the release.
The results speak for themselves: Smule users on Oboe are singing more - it’s as simple as that. Unique karaoke recordings and performance joins (duets) increased by a whopping 8.07%, unique uploads by 3.84%, and song performance completions by 4.10%. In Q3 and Q4 2021, Smule observed an increase in recording completion rate of 10%+.
Using the Firebase Crashlytics tool by Google, Smule has seen a decline in audio-related crashes since the full Oboe release, making the app more stable - even on lower-end devices. Smule’s dedicated customer support team was thrilled to report a 33% reduction in audio-related complaints, including issues like (unintended) robot voice and echo.
The decision to switch to Oboe has paid off in spades. The application is functioning better than ever and Smule is well-equipped to face further advancements in audio and hardware with the updated technology. Most importantly, Smule users are happy and making music, which is what it’s all about.
Posted by Phalene Gowling, Product Manager, Google Play
Last year, mobile game consumer spending grew 7.3% to $93.2 billion with no signs of slowing down. In this competitive, growing market, effectively monetizing your audience has never been more important. But without access to a strategy consultant, how can you know if your monetization strategy is as strong as it can be?
That’s why we’re expanding the suite of tools available in Play Console to help you strengthen your monetization strategy. Last year, we released new engagement and monetization metrics on the Statistics page to help you grow your business, and now we’re pleased to announce new strategic guidance tools to help you drive successful monetization.
In this new section, you’ll see our metric-driven guidance to help you better monetize your game.
The strategic guidance metric hierarchy. (Learn more or visit our Play Academy for specific courses like monitoring KPIs.)
We’ve spent the last couple of years perfecting our guidance, and testing the dashboard with selected partners. Feedback on our strategic guidance has been positive — and we hope you’ll find it useful, too.
“This is extremely useful! These type of insights are actually what we expect from Google, because this is something that really can help us to scale our business.”
- Product Manager at Gameloft
Strategic guidance can be found in Financial reports within Play Console. In partnership with experts in mobile games growth, we’ve included primary monetization metrics (including new metrics) and their relationships to help you easily assess your performance and measure against your peers. You can see all the metrics in this Help Center article.
The metric hierarchy is a tool to help you understand how you and your teams can directly influence the lower-level metrics of your game's performance, like buyer conversions, which contribute to your overall top-line business performance. Using peerset comparisons and per-country breakdowns, you can quickly identify your biggest growth opportunities: which markets are underperforming and where you are a market leader.
Select a metric and explore it in detail to track your performance over time. Strategic guidance shows you a breakdown of your chosen metric by location to help you spot opportunities to expand your game globally. The detailed metric analysis also helps you identify where a small investment has an outsized return.
Strategic guidance metric recommendation example for returning daily buyer ratio.
Whether you’ve created a casual game or an RPG, the metric-specific recommendations are designed to be insightful and relevant to a variety of game developers. They can be used to help you diversify your promotional content, refine your game mechanics, or test new price points that enable purchasing power parity.
With an increasing number of developers shifting focus from an ads-only monetization business model to include in-app purchases (IAP), we’ve developed strategic guidance to be most relevant for developers that include IAP-monetization as part of their overall strategy. With this launch, we’re excited to bring growth consulting opportunities to these game developers at scale. Stay tuned for more launches this year to help you successfully drive your revenue growth.