[This post is by Jesse Wilson from the Dalvik team. —Tim Bray]
Using XmlPullParser is an efficient and maintainable way to parse XML on Android. Historically Android has had two implementations of this interface:
KXmlParser, via XmlPullParserFactory.newPullParser().
ExpatPullParser, via Xml.newPullParser().
The implementation from Xml.newPullParser() had a bug where calls to nextText() didn’t always advance to the END_TAG as the documentation promised it would. As a consequence, some apps may be working around the bug with extra calls to next() or nextTag():
public void parseXml(Reader reader)
        throws XmlPullParserException, IOException {
    XmlPullParser parser = Xml.newPullParser();
    parser.setInput(reader);
    parser.nextTag();
    parser.require(XmlPullParser.START_TAG, null, "menu");
    while (parser.nextTag() == XmlPullParser.START_TAG) {
        parser.require(XmlPullParser.START_TAG, null, "item");
        String itemText = parser.nextText();
        parser.nextTag(); // this call shouldn't be necessary!
        parser.require(XmlPullParser.END_TAG, null, "item");
        System.out.println("menu option: " + itemText);
    }
    parser.require(XmlPullParser.END_TAG, null, "menu");
}

public static void main(String[] args) throws Exception {
    new Menu().parseXml(new StringReader("<?xml version='1.0'?>"
            + "<menu>"
            + "  <item>Waffles</item>"
            + "  <item>Coffee</item>"
            + "</menu>"));
}
In Ice Cream Sandwich we changed Xml.newPullParser() to return a KXmlParser and deleted our ExpatPullParser class. This fixes the nextText() bug. Unfortunately, apps that currently work around the bug may crash under Ice Cream Sandwich:
org.xmlpull.v1.XmlPullParserException: expected: END_TAG {null}item (position:START_TAG <item>@1:37 in java.io.StringReader@40442fa8)
    at org.kxml2.io.KXmlParser.require(KXmlParser.java:2046)
    at com.publicobject.waffles.Menu.parseXml(Menu.java:25)
    at com.publicobject.waffles.Menu.main(Menu.java:32)
The fix is to call nextTag() after a call to nextText() only if the current position is not an END_TAG:
while (parser.nextTag() == XmlPullParser.START_TAG) {
    parser.require(XmlPullParser.START_TAG, null, "item");
    String itemText = parser.nextText();
    if (parser.getEventType() != XmlPullParser.END_TAG) {
        parser.nextTag();
    }
    parser.require(XmlPullParser.END_TAG, null, "item");
    System.out.println("menu option: " + itemText);
}
The code above will parse XML correctly on all releases. If your application uses nextText() extensively, use this helper method in place of calls to nextText():
private String safeNextText(XmlPullParser parser)
        throws XmlPullParserException, IOException {
    String result = parser.nextText();
    if (parser.getEventType() != XmlPullParser.END_TAG) {
        parser.nextTag();
    }
    return result;
}
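With that helper in place, the loop shown earlier can simply call it instead of nextText():

while (parser.nextTag() == XmlPullParser.START_TAG) {
    parser.require(XmlPullParser.START_TAG, null, "item");
    String itemText = safeNextText(parser); // works on both old and fixed parsers
    parser.require(XmlPullParser.END_TAG, null, "item");
    System.out.println("menu option: " + itemText);
}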
Moving to a single XmlPullParser simplifies maintenance and allows us to spend more energy on improving system performance.
Today we are announcing Android 4.0.3, an incremental release of the Android 4.0 (Ice Cream Sandwich) platform. The new release includes a variety of optimizations and bug fixes for phones and tablets, as well as a small number of new APIs for developers. The new API level is 15.
Some of the new APIs in Android 4.0.3 include:
Social stream API in Contacts provider: Applications that use social stream data such as status updates and check-ins can now sync that data with each of the user’s contacts, providing items in a stream along with photos for each. This new API lets apps show users what the people they know are doing or saying, in addition to their photos and contact information.
Calendar provider enhancements. Apps can now add color to events, for easier tracking, and new attendee types and states are now available.
New camera capabilities. Apps can now check and manage video stabilization and use QVGA resolution profiles where needed (see the sketch after this list).
Accessibility refinements. Improved content access for screen readers and new status and error reporting for text-to-speech engines.
Incremental improvements in graphics, database, spell-checking, Bluetooth, and more.
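As one hedged illustration of the camera additions above, a minimal sketch for a device running API level 15 and the android.hardware.Camera API of that era might look like this (camera ID 0 is only an assumption for the back-facing camera):

// Sketch: enable video stabilization when the device and API level (15+) support it.
Camera camera = Camera.open();
Camera.Parameters params = camera.getParameters();
if (params.isVideoStabilizationSupported()) {
    params.setVideoStabilization(true);
    camera.setParameters(params);
}

// Sketch: prefer a QVGA recording profile where one exists.
int cameraId = 0; // assumed back-facing camera
CamcorderProfile profile =
        CamcorderProfile.hasProfile(cameraId, CamcorderProfile.QUALITY_QVGA)
                ? CamcorderProfile.get(cameraId, CamcorderProfile.QUALITY_QVGA)
                : CamcorderProfile.get(cameraId, CamcorderProfile.QUALITY_LOW);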
For a complete overview of what’s new in the platform, see the Android 4.0.3 API Overview.
Going forward, we’ll be focusing our partners on Android 4.0.3 as the base version of Ice Cream Sandwich. The new platform will be rolling out to production phones and tablets in the weeks ahead, so we strongly encourage you to test your applications on Android 4.0.3 as soon as possible.
We would also like to remind developers that we recently released a new version of the SDK Tools (r16) and of the Eclipse plug-in (ADT 16.0.1). We have also updated the NDK to r7.
Visit the Android Developers site for more information about Android 4.0.3 and other platform versions. To get started developing or testing on the new platform, you can download it into your SDK using the Android SDK Manager.
[This post is by Reto Meier, Android Developer Relations Tech Lead. — Tim Bray]
Today I’m thrilled to announce the beta launch of Android Training — a collection of classes that we hope will help you to build better Android apps.
From designing effective navigation, to managing audio playback, to optimizing battery life, these classes are designed to demonstrate best practices for solving common Android development problems.
Each class explains the steps required to solve a problem, or implement a feature, with plenty of code snippets and sample code for you to use within your own apps.
We’re starting small and this is just the beginning for Android Training. Over the coming months we will be increasing the number of classes available, as well as introducing over-arching courses and sample apps to further help your development experience.
Helping developers build great apps is what the Android Developer Relations team is all about, so we’re excited to see how you use these classes to make your apps even better.
We’d love to know what you think of these classes, and what classes you’d like to see next.
[This post is by Dan Galpin, who lives the Android Games lifestyle every day. — Tim Bray]
Making a game on Android is easy. Making a great game for a mobile, multitasking, often multi-core, multi-purpose system like Android is trickier. Even the best developers frequently make mistakes in the way they interact with the Android system and with other applications — mistakes that don’t affect the quality of gameplay, but which affect the quality of the user’s experience in other ways.
A truly great Android game knows how to play nice: how to fit seamlessly into the system of apps, services, and UI features that run on Android devices. In this multi-part series of posts, Android Developer Relations engineers who specialize in games explain what it takes to make your game play nice.
Android users are used to the Back key. We expect the volume keys to work in an intuitive fashion. We expect the Home key to behave in a manner consistent with the Android navigation paradigm. Sometimes we even expect the Menu key to do something.
I’m playing [insert favorite game here] and I accidentally hit the [Home] key or the [Back] key. This is probably happening because I’m furiously using the touchscreen to actually play the game. Whether I’ve been cutting ropes, controlling aircraft, cleaving fruit, or flinging birds, I’m almost certainly angry if I’ve suddenly lost all of my game progress.
Lots of developers assume that pressing the Home key means the user intends to exit the game, perhaps because on some mobile devices the Home key is a somewhat-difficult-to-press physical button that is rarely hit by mistake. Depending on the device and Android release, though, it might be a physical, capacitive, or soft button, which makes it relatively easy to hit accidentally. Losing progress to an event such as an incoming call is even worse.
Save as much about the status of the game into the Bundle in onSaveInstanceState() as you can. This helper function will get called whenever your application receives an onPause() callback. Note that you can save byte arrays into that bundle, so it can easily be used for raw data.
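A minimal sketch of what that might look like in a game activity (the keys, fields, and helper below are hypothetical, purely for illustration):

@Override
protected void onSaveInstanceState(Bundle outState) {
    super.onSaveInstanceState(outState);
    // Hypothetical fields: save whatever describes the current game state.
    outState.putInt("level", mCurrentLevel);
    outState.putLong("score", mScore);
    // Byte arrays work too, so raw game data can go straight into the bundle.
    outState.putByteArray("world", serializeWorldState()); // hypothetical helper
}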
If your game uses lots of native system resources, consider dumping large textures (or all textures and geometry) during onPause() or onStop(). GLSurfaceView releases these automatically on pause unless you tell it not to, and starting in API level 11 you can tell it not to. Releasing resources helps very large titles stay resident in memory, which typically speeds task-switching back into the game for titles that might otherwise be swapped out, but it may slow things down for smaller titles that can multitask more efficiently without bothering to do this.
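The API level 11 option referred to above is GLSurfaceView.setPreserveEGLContextOnPause(); a rough sketch of opting out of the automatic release (MyRenderer is a hypothetical Renderer implementation):

// Keep the EGL context (and the textures/geometry loaded into it) across
// onPause()/onResume() instead of letting GLSurfaceView release it.
// Available starting in API level 11; smaller titles may prefer the default.
GLSurfaceView glView = new GLSurfaceView(this);
glView.setEGLContextClientVersion(2);
glView.setPreserveEGLContextOnPause(true);
glView.setRenderer(new MyRenderer()); // hypothetical Renderer implementation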
When your game resumes, restore the state from the bundle in onRestoreInstanceState(). If there is any sort of time-consuming loading that has to be done, make sure that you notify the user of what is happening to give them the best possible experience.
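Continuing the hypothetical sketch from above, the restore side might look like this:

@Override
protected void onRestoreInstanceState(Bundle savedInstanceState) {
    super.onRestoreInstanceState(savedInstanceState);
    // Hypothetical fields, mirroring the save sketch above.
    mCurrentLevel = savedInstanceState.getInt("level");
    mScore = savedInstanceState.getLong("score");
    byte[] world = savedInstanceState.getByteArray("world");
    if (world != null) {
        restoreWorldState(world); // hypothetical helper; if slow, show a loading indicator
    }
}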
I’m in the middle of playing a game and I hit the back key. One of several bad things can happen here:
The game exits immediately, losing all state and leading to Angry User Syndrome. (see Problem 1).
The game does nothing.
We already know what is wrong with scenario 1. It’s essentially a data loss scenario, and it’s worse than pigs stealing your eggs. What is wrong with scenario 2?
The [Back] key is an essential part of the Android navigation paradigm. If the [Back] key doesn't return to the previous screen in the activity stack (or in the game hierarchy), there had better be a very good reason, such as an active document with no way to save a draft.
If the user is in the middle of gameplay, it is customary to display some sort of dialog asking whether they intended the action:
“Are you sure you wish to exit now? That monster looks hungry.”
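A minimal sketch of such a confirmation, assuming your game runs in an Activity (the wording and button labels are illustrative only):

@Override
public void onBackPressed() {
    // Confirm before leaving mid-game instead of exiting and losing progress.
    new AlertDialog.Builder(this)
        .setMessage("Are you sure you wish to exit now? That monster looks hungry.")
        .setPositiveButton("Exit", new DialogInterface.OnClickListener() {
            @Override
            public void onClick(DialogInterface dialog, int which) {
                finish(); // or return to the game's main menu
            }
        })
        .setNegativeButton("Keep playing", null)
        .show();
}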
In an extreme action game, you might also wish to do something similar to what Replica Island (RI) did: to make it a bit harder to press the key accidentally, RI treated any [Back] keypress that happened within 200ms of another touch event as invalid.
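Replica Island's actual code isn't reproduced here, but a rough sketch of that idea, assuming touch events and the key press arrive on the activity's UI thread, might be:

private long mLastTouchTime;

@Override
public boolean dispatchTouchEvent(MotionEvent event) {
    // Remember when the player last touched the screen.
    mLastTouchTime = SystemClock.uptimeMillis();
    return super.dispatchTouchEvent(event);
}

@Override
public void onBackPressed() {
    // Ignore [Back] presses landing within 200ms of a touch; during frantic
    // gameplay they are very likely accidental.
    if (SystemClock.uptimeMillis() - mLastTouchTime < 200) {
        return;
    }
    showExitConfirmationDialog(); // hypothetical helper, e.g. the dialog shown earlier
}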
At the Main Menu of the game, you can decide whether it makes sense to prompt the user or not. If your game has very long load times, you might want to prompt the user.
There’s nothing worse than wanting to settle down for a good session of [insert favorite game here] in some sort of public place with your volume turned up. Suddenly everyone has learned that you prefer pummelling produce to predicting present progressions and that’s practically profane in your profession.
By default, the volume keys on most Android devices control the ringer volume, and your application must pass them through to the superclass so this continues to work.
To make these keys control the music volume (the stream your game's audio uses), call setVolumeControlStream(AudioManager.STREAM_MUSIC). Beyond that, all you need to do is pass the keys through to the framework and you'll get control of the audio in the standard and proper way. Make the call as early as possible so the user can adjust the volume even before you begin playing anything.
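A minimal sketch, placed in your game activity's onCreate():

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // Route the hardware volume keys to the music stream, which is the
    // stream the game's sound effects and music play on.
    setVolumeControlStream(AudioManager.STREAM_MUSIC);
}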
[This post is by Luca Zanolin, an Android engineer who works on voice typing. — Tim Bray]
A new feature available in Android 4.0 is voice typing: unlike standard voice recognition, the recognition results appear in the text box while the user is still speaking. If you are an IME developer, you can easily integrate with voice typing.
To simplify the integration, if you download this library and modify your IME as described below, everything will work smoothly on any device running Android 2.2 or later: users on Android 4.0 and later get voice typing, while earlier versions fall back to standard voice recognition.
To see how to integrate voice typing you can take a look at this sample IME. The IME is really simple and contains only one button: a microphone. By pressing the microphone, the user triggers voice recognition.
Here are the steps that you need to follow to integrate voice recognition into your IME.
Download this library and add it to your IME APK.
The library contains the VoiceRecognitionTrigger helper class. Create an instance of it inside the InputMethodService#onCreate method in your IME.
public void onCreate() {
    super.onCreate();
    ...
    mVoiceRecognitionTrigger = new VoiceRecognitionTrigger(this);
}
You need to modify the UI of your IME, add a microphone icon, and register an OnClickListener to trigger voice recognition. You can find the assets inside the sample IME. The microphone icon should be displayed only if voice recognition is installed; use VoiceRecognitionTrigger#isInstalled().
public View onCreateInputView() {
    LayoutInflater inflater = (LayoutInflater) getSystemService(
            Service.LAYOUT_INFLATER_SERVICE);
    mView = inflater.inflate(R.layout.ime, null);
    ...
    mButton = (ImageButton) mView.findViewById(R.id.mic_button);
    if (mVoiceRecognitionTrigger.isInstalled()) {
        mButton.setOnClickListener(new OnClickListener() {
            @Override
            public void onClick(View v) {
                mVoiceRecognitionTrigger.startVoiceRecognition();
            }
        });
        mButton.setVisibility(View.VISIBLE);
    } else {
        mButton.setVisibility(View.GONE);
    }
    return mView;
}
If your IME supports multiple languages, you can specify in which language recognition should be done as a parameter of startVoiceRecognition().
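For example, assuming the overload accepts an IETF language tag (check the downloaded library for the exact parameter format):

// Assumed parameter format; consult the library's VoiceRecognitionTrigger source.
mVoiceRecognitionTrigger.startVoiceRecognition("it-IT");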
When your IME starts, you need to notify the trigger, so it can insert into the text view any pending recognition results.
@Override
public void onStartInputView(EditorInfo info, boolean restarting) {
    super.onStartInputView(info, restarting);
    if (mVoiceRecognitionTrigger != null) {
        mVoiceRecognitionTrigger.onStartInputView();
    }
}
In order to start voice recognition through the Intent API, the library uses a service and an activity, which you need to add to your manifest:
<manifest ... >
  <application ...>
    ...
    <service android:name="com.google.android.voiceime.ServiceHelper" />
    <activity android:name="com.google.android.voiceime.ActivityHelper"
        android:theme="@android:style/Theme.Translucent.NoTitleBar"
        android:excludeFromRecents="true"
        android:windowSoftInputMode="stateAlwaysHidden"
        android:finishOnTaskLaunch="true"
        android:configChanges="keyboard|keyboardHidden|navigation|orientation" />
  </application>
</manifest>
This step is optional, but you should implement it if possible as it will improve the user experience. Voice recognition requires network access, and if there is no network, your IME should notify the user that voice recognition is currently disabled. To achieve this, you need to register the VoiceRecognitionTrigger.Listener and enable/disable the microphone accordingly.
The listener is registered in InputMethodService#onCreate, and you have to unregister it in InputMethodService#onDestroy; otherwise you will leak the listener.
@Override
public void onCreate() {
    super.onCreate();
    ...
    mVoiceRecognitionTrigger = new VoiceRecognitionTrigger(this);
    mVoiceRecognitionTrigger.register(new VoiceRecognitionTrigger.Listener() {
        @Override
        public void onVoiceImeEnabledStatusChange() {
            updateVoiceImeStatus();
        }
    });
}

...

@Override
public void onDestroy() {
    ...
    if (mVoiceRecognitionTrigger != null) {
        mVoiceRecognitionTrigger.unregister(this);
    }
    super.onDestroy();
}

private void updateVoiceImeStatus() {
    if (mVoiceRecognitionTrigger.isInstalled()) {
        mButton.setVisibility(View.VISIBLE);
        if (mVoiceRecognitionTrigger.isEnabled()) {
            mButton.setEnabled(true);
        } else {
            mButton.setEnabled(false);
        }
    } else {
        mButton.setVisibility(View.GONE);
    }
    mView.invalidate();
}
And add this permission into your manifest:
<manifest ... >
  ...
  <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
  ...
</manifest>
Voice recognition makes it easy for users to do more with their Android devices, so we appreciate your support in adding it to your IMEs.