Congratulations to the latest apps and games featured in the Android Excellence program on Google Play. As a reminder, these collections are refreshed every three months and recognize apps and games that set the bar for high quality, great user experience, and strong technical performance.
If you're looking for some new apps, here are a few highlights.
Here are a few of our favorite new games joining the collection.
See the full list of Android Excellence apps and games.
Dashlane
Hostelworld
iCook
Keeper Password Manager
Keepsafe Photo Vault
Mobisystems OfficeSuite
PhotoGrid
Runtastic Results
Seven - 7 Minute Workout Training Challenge
SoloLearn: Learn to Code for Free
Tube Map
WPS Office
Azur Lane アズールレーン
CodyCross
Into the Dead 2
Little Panda Restaurant
MARVEL Contest of Champions
Orbital 1
Rooms of Doom
Sky Dancer Run
Sling Kong
Soul Knight
Explore other great apps and games in the Editors' Choice section on Google Play and discover best practices to help you build quality apps and games.
Android Oreo is stuffed full of security enhancements. Over the past few months, we've covered how we've improved the security of the Android platform and its applications: from making it safer to get apps, dropping insecure network protocols, providing more user control over identifiers, hardening the kernel, making Android easier to update, all the way to doubling the Android Security Rewards payouts. Now that Oreo is out the door, let's take a look at all the goodness inside.
Android already supports Verified Boot, which is designed to prevent devices from booting up with software that has been tampered with. In Android Oreo, we added a reference implementation for Verified Boot running with Project Treble, called Android Verified Boot 2.0 (AVB). AVB has a couple of cool features to make updates easier and more secure, such as a common footer format and rollback protection. Rollback protection is designed to prevent a device from booting if it has been downgraded to an older OS version, which could be vulnerable to an exploit. To do this, devices save the OS version either by using special hardware or by having the Trusted Execution Environment (TEE) sign the data. Pixel 2 and Pixel 2 XL come with this protection, and we recommend that all device manufacturers add this feature to their new devices.
Oreo also includes the new OEM Lock Hardware Abstraction Layer (HAL) that gives device manufacturers more flexibility for how they protect whether a device is locked, unlocked, or unlockable. For example, the new Pixel phones use this HAL to pass commands to the bootloader. The bootloader analyzes these commands the next time the device boots and determines if changes to the locks, which are securely stored in Replay Protected Memory Block (RPMB), should happen. If your device is stolen, these safeguards are designed to prevent your device from being reset and to keep your data secure. This new HAL even supports moving the lock state to dedicated hardware.
Speaking of hardware, we've invested in support for tamper-resistant hardware, such as the security module found in every Pixel 2 and Pixel 2 XL. This physical chip prevents many software and hardware attacks and is also resistant to physical penetration attacks. The security module prevents the encryption key from being derived without the device's passcode and limits the rate of unlock attempts, which makes many attacks infeasible due to time restrictions.
While the new Pixel devices have the special security module, all new GMS devices shipping with Android Oreo are required to implement key attestation. This provides a mechanism for strongly attesting IDs such as hardware identifiers.
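As a rough sketch of what key attestation looks like from an app's point of view (the key alias and challenge bytes below are placeholders, not values from this post), you can ask the AndroidKeyStore to attest a newly generated key and then read back its certificate chain, which embeds the attestation record for your server to verify:

import java.security.KeyPairGenerator;
import java.security.KeyStore;
import java.security.cert.Certificate;
import android.security.keystore.KeyGenParameterSpec;
import android.security.keystore.KeyProperties;

// Generate a key inside the AndroidKeyStore and request an attestation record.
// "my_attested_key" and the challenge bytes are placeholder values.
KeyPairGenerator generator = KeyPairGenerator.getInstance(
        KeyProperties.KEY_ALGORITHM_EC, "AndroidKeyStore");
generator.initialize(new KeyGenParameterSpec.Builder(
        "my_attested_key", KeyProperties.PURPOSE_SIGN)
        .setDigests(KeyProperties.DIGEST_SHA256)
        .setAttestationChallenge("server-nonce".getBytes())
        .build());
generator.generateKeyPair();

// The attestation record is embedded in the key's certificate chain,
// which a server can verify against the attestation root certificates.
KeyStore keyStore = KeyStore.getInstance("AndroidKeyStore");
keyStore.load(null);
Certificate[] attestationChain = keyStore.getCertificateChain("my_attested_key");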
We added new features for enterprise-managed devices as well. In work profiles, encryption keys are now ejected from RAM when the profile is off or when your company's admin remotely locks the profile. This helps secure enterprise data at rest.
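For example, a device policy controller that owns the work profile can evict the profile's encryption key when it remotely locks the profile. A minimal sketch (the context variable is assumed, and the caller must be the profile owner):

import android.app.admin.DevicePolicyManager;
import android.content.Context;

// Lock the work profile and evict its credential encryption key from RAM (API 26+).
DevicePolicyManager dpm =
        (DevicePolicyManager) context.getSystemService(Context.DEVICE_POLICY_SERVICE);
dpm.lockNow(DevicePolicyManager.FLAG_EVICT_CREDENTIAL_ENCRYPTION_KEY);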
As part of Project Treble, the Android framework was re-architected to make updates easier and less costly for device manufacturers. This separation of platform and vendor code was also designed to improve security. Following the principle of least privilege, these HALs run in their own sandbox and only have access to the drivers and permissions that are absolutely necessary.
Continuing the media stack hardening begun in Android Nougat, most direct hardware access has been removed from the media frameworks in Oreo, resulting in better isolation. Furthermore, we've enabled Control Flow Integrity (CFI) across all media components. Most vulnerabilities today are exploited by subverting the normal control flow of an application, diverting it to perform arbitrary malicious activities with all the privileges of the exploited application. CFI is a robust security mechanism that disallows arbitrary changes to the original control flow graph of a compiled binary, making it significantly harder to perform such attacks.
In addition to these architecture changes and CFI, Android Oreo comes with a feast of other tasty platform security enhancements:
Android Instant Apps run in a restricted sandbox that limits permissions and capabilities such as reading the on-device app list or transmitting cleartext traffic. Although introduced with the Android Oreo release, Instant Apps are supported on devices running Android Lollipop and later.
In order to handle untrusted content more safely, we've isolated WebView by splitting the rendering engine into a separate process and running it within an isolated sandbox that restricts its resources. WebView also supports Safe Browsing to protect against potentially dangerous sites.
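Safe Browsing can be toggled per WebView through its settings; here's a minimal sketch (the view id is a placeholder, and this assumes an Activity context):

import android.webkit.WebSettings;
import android.webkit.WebView;

// Opt this WebView into Safe Browsing checks (API 26+).
WebView webView = findViewById(R.id.web_view); // R.id.web_view is a placeholder id
WebSettings settings = webView.getSettings();
settings.setSafeBrowsingEnabled(true);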
Lastly, we've made significant changes to device identifiers to give users more control, including:
net.hostname
Build.getSerial() API
Android Oreo brings in all of these improvements, and many more. As always, we appreciate feedback and welcome suggestions for how we can improve Android. Contact us at security@android.com.
Today is the beginning of KotlinConf. It's been almost 6 months since we announced Kotlin as a first-class language for Android at Google I/O. During this period, the number of apps on Google Play using Kotlin has more than doubled. More than 17% of the projects in Android Studio 3.0 are now using Kotlin. We are really excited about the strong momentum, and we are thrilled that Android developers all over the world are discovering the joy of Kotlin programming.
Kotlin for Android is production-ready. From startups to Fortune 500 companies, developers are already using Kotlin to build their apps. Developers from Pinterest, to Expedia, to Basecamp -- and many others -- are finding that Kotlin increases their productivity and overall developer happiness. Take a look at some of their experiences with Kotlin below.
With the recent release of Android Studio 3.0, there is now a stable version of our IDE that has Kotlin support built-in. With Support Library 27, we have started adding nullability annotations to make the APIs friendlier to use in Kotlin. We recently published the Android Kotlin Guides on GitHub to provide some guidance for Android Kotlin style and interop. We have also been porting some of our Android samples to Kotlin, and we are adding Kotlin to our official documentation.
Last week, we released Android Studio 3.0 on the stable channel. This is the first stable release of Android Studio that has Kotlin support built-in. Building on the strength of IntelliJ's Kotlin support, many critical IDE features like code completion and syntax highlighting work well for Kotlin. You can choose to convert Java code to Kotlin by using Code → Convert Java File to Kotlin File, or you can convert snippets of code just by pasting Java code into a Kotlin file.
Project and code templates have also been updated with Kotlin support. When you create a new project or add a new code file, you can choose Kotlin as one of the language options.
The tooling experience with Kotlin is by no means perfect yet. We are aware of several known issues, and we will continue to improve the IDE support for Kotlin in future releases.
There are two separate Android Kotlin Guides: a style guide, which defines our recommended code style for Android Kotlin code, and an interop guide, which covers writing Java and Kotlin APIs that work well from both languages.
We intend these guides to be living documents and will evolve them over time. They are hosted on GitHub and we welcome your contributions.
Null-safety is an important feature of the Kotlin language. It helps developers avoid NullPointerExceptions and improves the quality of their apps. Null-safety is a bit more complicated when using Java code from Kotlin. Since any reference in Java may be null, Kotlin's requirement for strict null-safety becomes impractical for Java objects. Types declared in Java that do not carry nullability annotations are called platform types, which means the Kotlin compiler does not know whether they are nullable or not. When calling methods with variables of platform types, the Kotlin compiler relaxes its null-safety checks, which weakens the overall null-safety of your app.
To let developers take more advantage of Kotlin's strict null-safety, we have started adding nullability annotations in Support Library 27. The Support Library contains a huge API surface area, and we will continue to expand the nullability annotation coverage in the next several releases. In addition, we will also be adding nullability annotations to other Android APIs over time.
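To illustrate the difference (this class is a made-up example, not an actual Support Library API), annotated Java declarations map to precise Kotlin types, while unannotated ones remain platform types:

import android.support.annotation.NonNull;
import android.support.annotation.Nullable;

public class UserRepository {

    // Kotlin sees this as: fun findNickname(userId: String): String?
    // Passing null for userId, or ignoring the nullable return value,
    // becomes a compile-time error in Kotlin.
    @Nullable
    public String findNickname(@NonNull String userId) {
        return null; // placeholder implementation
    }

    // No annotations: Kotlin sees the platform type String!, so null-safety
    // checks are relaxed and mistakes only surface at runtime.
    public String findLegacyNickname(String userId) {
        return null; // placeholder implementation
    }
}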
While the Kotlin adoption growth is fantastic, our commitment to the Java and C++ programming languages remains unchanged. We've added Java 8 language features support in Android Studio 3.0, and we've added more Java 8 language APIs in Android Oreo. We are also continuing to improve our support for C++17 in the NDK. So even if you are not using Kotlin, your language support will continue to improve.
It's an exciting time to be an Android developer. If you haven't had a chance to try Kotlin, you can get started by learning the basic syntax and by playing with the excellent Kotlin Koans. When you are ready to use Kotlin in your Android app, you can jump to the Android Kotlin page for more resources. With Kotlin's Java interoperability and Android Studio's Java to Kotlin converter, it's easy to start using Kotlin in your project.
Happy Kotlin-ing!
Android developers know that dex compilation is a key step in building an APK. This is the process of transforming .class bytecode into .dex bytecode for the Android Runtime (or Dalvik, for older versions of Android). The dex compiler mostly works under the hood in your day-to-day app development, but it directly impacts your app's build time, .dex file size, and runtime performance.
That's why we are investing in making important improvements in the dex compiler. We're excited to announce that the next-generation dex compiler, D8, is now available for preview as part of Android Studio 3.0 Beta release.
When comparing with the current DX compiler, D8 compiles faster and outputs smaller .dex files, while having the same or better app runtime performance.
D8 is available for your preview starting with Android Studio 3.0 Beta. To try it, set the following in your project's gradle.properties file:
android.enableD8=true
We have tested D8's correctness and performance on a number of apps, and the results are encouraging. We're confident enough with the results that we are switching to use D8 as the default dex compiler for building AOSP.
Please give D8 a try and we would love to hear your feedback. You can file a bug report using this link.
We plan to preview D8 over the next several months with the Android Studio 3.0 release. During this time, we will focus on addressing any critical bug reports we receive from the community. We plan to bring D8 out of preview and enable it as the default dex compiler in Android Studio 3.1. At that time, the DX compiler will officially be put in maintenance mode. Only critical issues with DX will be fixed moving forward.
Beyond D8, we are also working on R8, which is a Proguard replacement for whole program minification and optimization. While the R8 project has already been open sourced, it has not yet been integrated with the Android Gradle plugin. We will provide more details about R8 in the near future when we are ready to preview it with the community.
In April, we announced Java 8 language features with desugaring. The desugaring step currently happens immediately after Java compilation (javac) and before any bytecode reading or rewriting tools are run. Over the next couple of months, the desugar step will move to a later stage in the pipeline, as part of D8. This will allow us to further reduce the overall build time and produce more optimized code. This change means that any bytecode reading or rewriting tools will run before the desugar step. If you develop .class bytecode reading or rewriting tools for Android, you will need to make sure they can handle the Java 8 bytecode format so they can continue to work properly when we move desugaring into D8.
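For reference, here's a small sketch of the kind of Java 8 language constructs the desugar step rewrites (the view, presenter, and handler names are hypothetical):

// A lambda replacing an anonymous listener, and a method reference; both are
// Java 8 language features that desugaring rewrites into equivalent bytecode
// for devices whose runtime doesn't support them natively.
button.setOnClickListener(v -> presenter.onButtonClicked());
retryButton.setOnClickListener(this::handleRetry);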
Happy dex'ing!
Last year at I/O we launched the Awareness API, a simple yet powerful API that let developers use signals such as Location, Weather, Time and User Activity to build contextually relevant app experiences.
Available via Google Play services, the Awareness API offers two ways to take advantage of context signals within your app. The Snapshot API lets your app request information about the user's current context, while the Fence API lets your app react to changes in the user's context, triggering when it matches a certain set of conditions: for example, "tell me whenever the user is walking and their headphones are plugged in."
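As a sketch of that example (not code from the original post), the Fence API lets you express "walking with headphones plugged in" by combining two fences:

import com.google.android.gms.awareness.fence.AwarenessFence;
import com.google.android.gms.awareness.fence.DetectedActivityFence;
import com.google.android.gms.awareness.fence.HeadphoneFence;
import com.google.android.gms.awareness.state.HeadphoneState;

// TRUE while the user is walking AND headphones are plugged in.
AwarenessFence walkingWithHeadphones = AwarenessFence.and(
        DetectedActivityFence.during(DetectedActivityFence.WALKING),
        HeadphoneFence.during(HeadphoneState.PLUGGED_IN));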
Until now, you could specify a time fence on the Awareness APIs, but you were restricted to absolute (canonical) representations of time. Based on developer feedback, we realized that this did not support the higher-level abstractions people use when they think and talk about time. "This weekend," "on the next holiday," and "after sunset" are all common, colloquial ways of expressing time. That's why we're adding semantic time support to these APIs starting today.
For example, if you were building a fitness app that wanted to prompt users every morning to start their routine, or a reading app that wants to turn on night mode after dusk, you previously had to query a third-party API for sunrise/sunset information at the user's location and then write an Awareness fence with those canonical time values. With our latest update, you can use our TIME_INSTANT_SUNRISE and TIME_INSTANT_SUNSET constants and let the platform manage all the complexity for you.
Let's look at an example. Suppose you're building a fitness app that prompts users on Tuesdays and Thursdays around sunrise to begin their morning workout. You can set up this trigger with the following lines of code.
// A sun-state-based fence that is TRUE only on Tuesday and Thursday during sunrise
AwarenessFence.and(
    TimeFence.aroundTimeInstant(TimeFence.TIME_INSTANT_SUNRISE,
        -10 * ONE_MINUTE_MILLIS, 5 * ONE_MINUTE_MILLIS),
    AwarenessFence.or(
        TimeFence.inIntervalOfDay(TimeFence.DAY_OF_WEEK_TUESDAY,
            0, ONE_DAY_MILLIS),
        TimeFence.inIntervalOfDay(TimeFence.DAY_OF_WEEK_THURSDAY,
            0, ONE_DAY_MILLIS)));
One of our favorite semantic time features is public holidays. Every country, and often regions within it, has different holidays. Suppose you're a local hiking and adventure app that wants to show users activities they can enjoy on a holiday that falls on a Friday or a Monday. You can use a combination of day-of-week and holiday flags to identify this state for all your users, with just a few lines of code that work in any part of the world.
// A local-time fence that is TRUE only on public holidays in the
// device locale that fall on Fridays or Mondays.
AwarenessFence.and(
    TimeFence.inTimeInterval(TimeFence.TIME_INTERVAL_HOLIDAY),
    AwarenessFence.or(
        TimeFence.inIntervalOfDay(TimeFence.DAY_OF_WEEK_FRIDAY,
            9 * ONE_HOUR_MILLIS, 11 * ONE_HOUR_MILLIS),
        TimeFence.inIntervalOfDay(TimeFence.DAY_OF_WEEK_MONDAY,
            9 * ONE_HOUR_MILLIS, 11 * ONE_HOUR_MILLIS)));
In both example cases, Awareness does the heavy lifting of localizing time and holidays based on the device locale settings.
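Once you've built a fence like the ones above, registering it follows the usual Fence API pattern. Here's a minimal sketch, assuming a connected GoogleApiClient with the Awareness API added and a PendingIntent pointing at your fence receiver (the variable names are hypothetical):

import com.google.android.gms.awareness.Awareness;
import com.google.android.gms.awareness.fence.FenceUpdateRequest;

// googleApiClient must already be connected with Awareness.API added;
// fencePendingIntent points at a BroadcastReceiver that handles fence callbacks.
Awareness.FenceApi.updateFences(
        googleApiClient,
        new FenceUpdateRequest.Builder()
                .addFence("morning_workout_fence", morningWorkoutFence, fencePendingIntent)
                .build());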
We're excited to see what problems you'll solve using this powerful API. Please join our mailing list to get updates about this and other Context APIs at Google.
"Early Access is cool because you can ask the big questions and get real answers from real players," Joe Raeburn, Founding Product Guy at Space Ape Games.
Posted by Ian Lake, Developer Advocate
With the release of the 25.1.0 Support Library, there's a new entry in the family: the ExifInterface Support Library. With significant improvements introduced in Android 7.1 to the framework's ExifInterface, it only made sense to make those available to all API 9+ devices via the Support Library's ExifInterface.
The basics are still the same: the ability to read and write Exif tags embedded within image files, now with 140 different attributes (almost 100 of them new to Android 7.1 and this Support Library!), including information about the camera itself, the camera settings, orientation, and GPS coordinates.
For camera apps, writing is probably the most important capability, although writing attributes is still limited to JPEG image files. Normally you wouldn't need this during capture itself: you'd instead call the Camera2 API's CaptureRequest.Builder.set() with JPEG_ORIENTATION, JPEG_GPS_LOCATION, or the equivalents in the Camera1 Camera.Parameters. However, ExifInterface lets you make changes to the file after the fact (say, removing the location information at the user's request), as sketched below.
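For instance, here's a minimal sketch of rewriting a tag after the fact (the file path is a placeholder, and the file must be a JPEG since writing is limited to that format): after rotating an image's pixels yourself, you can reset its orientation tag so other apps don't rotate it a second time.

import android.support.media.ExifInterface;
import java.io.IOException;

try {
    // "/path/to/photo.jpg" is a placeholder; writing only works for JPEG files.
    ExifInterface exifInterface = new ExifInterface("/path/to/photo.jpg");
    exifInterface.setAttribute(ExifInterface.TAG_ORIENTATION,
            String.valueOf(ExifInterface.ORIENTATION_NORMAL));
    exifInterface.saveAttributes(); // writes the modified tags back to the file
} catch (IOException e) {
    // Handle any errors
}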
For the rest of us though, reading those attributes is going to be our bread-and-butter; this is where we see the biggest improvements.
Firstly, you can read Exif data from JPEG and raw images (specifically, DNG, CR2, NEF, NRW, ARW, RW2, ORF, PEF, SRW and RAF files). Under the hood, this was a major restructuring, removing all native dependencies and building an extensive test suite to ensure that everything actually works.
For apps that receive images from other apps with a content:// URI (such as those sent by apps that target API 24 or higher), ExifInterface now works directly off of an InputStream; this allows you to easily extract Exif information directly out of content:// URIs you receive without having to create a temporary file.
Uri uri; // the URI you've received from the other app
InputStream in = null;
try {
    in = getContentResolver().openInputStream(uri);
    ExifInterface exifInterface = new ExifInterface(in);
    // Now you can extract any Exif tag you want
    // Assuming the image is a JPEG or supported raw format
} catch (IOException e) {
    // Handle any errors
} finally {
    if (in != null) {
        try {
            in.close();
        } catch (IOException ignored) {
        }
    }
}
Note: ExifInterface will not work with remote InputStreams, such as those returned from an HttpURLConnection. It is strongly recommended to use it only with content:// or file:// URIs.
For most attributes, you'd simply use the getAttributeInt(), getAttributeDouble(), or getAttribute() (for Strings) methods as appropriate.
One of the most important attributes when it comes to displaying images is the image orientation, stored in the aptly-named TAG_ORIENTATION, which returns one of the ORIENTATION_ constants. To convert this to a rotation angle, you can post-process the value.
int rotation = 0;
int orientation = exifInterface.getAttributeInt(
        ExifInterface.TAG_ORIENTATION,
        ExifInterface.ORIENTATION_NORMAL);
switch (orientation) {
    case ExifInterface.ORIENTATION_ROTATE_90:
        rotation = 90;
        break;
    case ExifInterface.ORIENTATION_ROTATE_180:
        rotation = 180;
        break;
    case ExifInterface.ORIENTATION_ROTATE_270:
        rotation = 270;
        break;
}
There are some helper methods to extract values from specific Exif tags. For location data, the getLatLong() method gives you the latitude and longitude as floats and getAltitude() will give you the altitude in meters. Some images also embed a small thumbnail. You can check for its existence with hasThumbnail() and then extract the byte[] representation of the thumbnail with getThumbnail() - perfect to pass to BitmapFactory.decodeByteArray().
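Putting those helpers together, here's a small sketch of extracting an embedded thumbnail (it reuses the exifInterface from the earlier snippet, and imageView is assumed to exist in your layout):

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

// Only attempt decoding if the image actually embeds a thumbnail.
if (exifInterface.hasThumbnail()) {
    byte[] thumbnailBytes = exifInterface.getThumbnail();
    Bitmap thumbnail = BitmapFactory.decodeByteArray(
            thumbnailBytes, 0, thumbnailBytes.length);
    imageView.setImageBitmap(thumbnail);
}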
One thing that is important to understand with Exif data is that there are no required tags: each and every tag is optional - some services even specifically strip Exif data. Therefore throughout your code, you should always handle cases where there is no Exif data, either due to no data for a specific attribute or an image format that doesn't support Exif data at all (say, the ubiquitous PNGs or WebP images).
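In practice, that means checking for null (or supplying a sensible default) on every read. For example, TAG_DATETIME is just one of many attributes that are frequently missing:

// getAttribute() returns null when the tag isn't present in the file.
String dateTime = exifInterface.getAttribute(ExifInterface.TAG_DATETIME);
if (dateTime == null) {
    // Fall back to something else, e.g. the file's last-modified time.
}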
Add the ExifInterface Support Library to your project with the following dependency:
compile "com.android.support:exifinterface:25.1.0"
But when an Exif attribute is exactly what you need to prevent a mis-rotated image in your app, the ExifInterface Support Library is just what you need to #BuildBetterApps
Today at Google Developer Day China, we are happy to announce a developer preview of Android Wear 2.0 for developers creating apps for China. Android Wear 2.0 is the biggest update since our partners launched their first devices in China last year.
We're making a Developer Preview available today and plan to release additional updates in the coming months. Please send us your feedback by filing bugs or posting in our Android Wear Developers community.
With Android Wear 2.0, apps can access the internet directly on Android Wear devices. As a result, for the majority of apps, having a companion phone application is no longer necessary. This means that most developers creating apps for Android Wear 2.0 may no longer need to import the Google Play services library.
There are two situations where developers will need to import Google Play services for China:
You can find more details about how to import the China-compatible version of the Google Play services library here.
The Android Wear 2.0 Developer Preview includes an updated SDK with tools, and system images for testing using the Huawei Watch.
To get started, follow these steps:
We will update this developer preview over the next few months based on your feedback. The sooner we hear from you, the more we can include in the final release, so don't be shy!
Editor: 林海泉, Android Wear Developer Platform Lead
Today at the Google Developer Day held in Shanghai, we officially announced a developer preview of Android Wear 2.0 built specifically for the Chinese market. Android Wear 2.0 will be the biggest update since our partners first launched their watch products.
The developer preview is available starting today, and we plan to keep releasing updates over the coming months. Please file any issues you encounter here, or share your feedback in our Android Wear Developers community.
With Android Wear 2.0, apps can connect to the internet directly from the Android Wear watch. As a result, a companion phone app is no longer necessary for most apps. This also means that most developers building apps for Android Wear 2.0 will no longer need to include the Google Play services client library.
Currently, there are two situations in which developers still need to import the Google Play services client library when building apps for the Chinese market:
You can find more information here about how to import the China-compatible version of Google Play services.
The Android Wear 2.0 Developer Preview includes the latest SDK and a watch system image for testing (based on the Huawei Watch).
Please follow these steps to get started:
We will update the developer preview over the next few months based on your feedback. The earlier we hear from you, the more of your feedback we can address in the final release. Stay tuned!
A key part of Android Wear 2.0 is letting watch apps work as standalone apps, so users can respond to messages, track their fitness, and use their favorite apps, even when their phone isn't around. Developer Preview 4 includes a number of new APIs that will help you build more powerful standalone apps.
To make authentication a seamless experience for both Android phone and iPhone users, we have created new APIs for OAuth and added support for one-click Google Sign-in. With the OAuth API for Android Wear, users can tap a button on the watch that opens an authentication screen on the phone. Your watch app can then authenticate with your server side APIs directly. With Google Sign-In, it's even easier. All the user needs to do is select which account they want to authenticate with and they are done.
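As a rough sketch of kicking off Google Sign-In from a watch app (the request code and client setup are placeholders, and the client should be connected, for example via enableAutoManage, before requesting the intent):

import android.content.Intent;
import com.google.android.gms.auth.api.Auth;
import com.google.android.gms.auth.api.signin.GoogleSignInOptions;
import com.google.android.gms.common.api.GoogleApiClient;

// Configure sign-in to request the user's email address.
GoogleSignInOptions gso =
        new GoogleSignInOptions.Builder(GoogleSignInOptions.DEFAULT_SIGN_IN)
                .requestEmail()
                .build();

GoogleApiClient googleApiClient = new GoogleApiClient.Builder(this)
        .addApi(Auth.GOOGLE_SIGN_IN_API, gso)
        .build();

// Launch the account picker; the result arrives in onActivityResult().
Intent signInIntent = Auth.GoogleSignInApi.getSignInIntent(googleApiClient);
startActivityForResult(signInIntent, RC_SIGN_IN); // RC_SIGN_IN is an arbitrary request code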
In addition to paid apps, we have added in-app billing support, to give you another way to monetize your Android Wear app or watch face. Users can authorize purchases quickly and easily on the watch through a 4-digit Google Account PIN. Whether it's new levels in a game or new styles on a watch face, if you can build it, users can buy it.
What if your watch app doesn't work standalone? Or what if it offers a better user experience when both the watch and phone apps are installed? We've been listening carefully to your feedback, and we've added two new APIs (PlayStoreAvailability and RemoteIntent) to help you navigate users to the Play Store on a paired device so they can more easily install your app. Developers can also open custom URLs on the phone from the watch via the new RemoteIntent API; no phone app or data layer is required.
// Check Play Store is available
int playStoreAvailabilityOnPhone =
        PlayStoreAvailability.getPlayStoreAvailabilityOnPhone(getApplicationContext());

if (playStoreAvailabilityOnPhone == PlayStoreAvailability.PLAY_STORE_ON_PHONE_AVAILABLE) {
    // To launch a web URL, setData to Uri.parse("https://g.co/wearpreview")
    Intent intent = new Intent(Intent.ACTION_VIEW)
            .addCategory(Intent.CATEGORY_BROWSABLE)
            .setData(Uri.parse("market://details?id=com.google.android.wearable.app"));
    // mResultReceiver is optional; it can be null.
    RemoteIntent.startRemoteActivity(this, intent, mResultReceiver);
}
Many of you have given us the feedback that the swipe-to-dismiss gesture from Android Wear 1.0 is an intuitive time-saver. We agree, and we have reverted to the previous behavior in this developer preview release. To support swipe-to-dismiss in this release, we've made the following platform and API changes:
SwipeDismissFrameLayout
Additional details are available under the behavior changes section of the Android Wear Preview site.
Android Wear apps packaged using the legacy embedded app mechanism can now be delivered to Android Wear 2.0 watches. When a user installs a phone app that also contains an embedded Android Wear app, the user will be prompted to install the embedded app via a notification. If they choose not to install the embedded app at that moment, they can find it in the Play Store on Android Wear under a special section called "Apps you've used".
Despite support for the existing mechanism, there are significant benefits for apps that transition to the multi-APK delivery mechanism. Multi-APK allows the app to be searchable in the Play Store on Android Wear, to be eligible for merchandising on the homepage, and to be remotely installed from the web to the watch. As a result, we strongly recommend that developers move to multi-APK.
Thanks for all your terrific feedback on Android Wear 2.0. Check out g.co/wearpreview for the latest builds and documentation, keep the feedback coming by filing bugs or posting in our Android Wear Developers community, and stay tuned for Android Wear Developer Preview 5!