Started a 30 days with Android experiment

After revisiting the results of my Switching to Android experiment and finding most (like 99.5%) of the items in order now, I decided on Tuesday to conduct a serious 30 days with Android endeavor. I have handed my iPhone over to my wife, and she’s keeping (confiscating) it for me.

I will also keep a diary of my experiences. However, since that is very specific and not strictly Mozilla-related, and since this blog is syndicated on Planet Mozilla, I have decided to create a new blog especially for the occasion. You’re welcome to follow along there if you’d like, and do feel free to comment! [Update August 30, 2014]: The experiment was over after 18 of the 30 days, so the blog linked to above is the archive I am keeping on my new site. The Blogger links that were here previously will disappear, and I encourage you to visit the new blog and spread the word!

Also, if you follow me on Twitter, I will use the hashtag #30DaysWithAndroid to mark things around this topic and to announce the new posts as I write them, hopefully every day.

It’s exciting, and I still feel totally crazy about it! 🙂 Well, here goes!

Revisiting the “Switch to Android full-time” experiment

Just over a year ago, I conducted an experiment to see whether it would be possible for me to switch to an Android device full-time for my productive smartphone needs. The conclusion back then was that there were still too many things missing for me to switch to Android productively without losing key parts of my day-to-day usage.

However, several changes over the past 15 or so months prompted me to revisit this topic. For one, Android itself has been updated to 4.4 KitKat, and Android L is just around the corner. Google Nexus devices from the 2012 Nexus 7 onward get at least 4.4, and the newer Nexus 7 and Nexus 5 models will most likely even get L.

Second, after an eight-month hiatus since the last release, TalkBack development has resumed, with distribution of the beta versions put on a much more professional level than before. No more installing an APK from a Google Code web site!

Also, through my regular work testing Firefox for Android, I needed to stay current on a couple of Android devices anyway, and I noticed that gesture recognition across the board has improved dramatically in recent versions of Android and TalkBack 3.5.1.

So let’s revisit my points of criticism and see where these stand today! Note that since posting this originally on August 3, 2014, there have been some great tips and hints shared both on here as well as on the eyes-free mailing list, and I’ve now updated the post to reflect those findings.

The keyboard

Two of my points of criticism were problems with the keyboard. Since TalkBack users are more or less stuck with the Google Keyboard that comes with stock Android, there were a few issues I could not resolve by simply using another keyboard. Both had to do with languages:

  1. With multiple languages enabled, switching between them would not announce which language one was switching to.
  2. It was not possible to enter German umlauts or other non-English accented characters such as the French accented e (é).

Well, what can I say? The Google Keyboard recently got an update, and it solves the bigger of the two problems, namely the ability to enter umlauts and accented characters. The button to switch languages still does not say which language one would switch to, but after switching, the new language is clearly announced.

To enter accented characters, you touch and keep your finger on the base character, for example the e, and wait for the TalkBack announcement that special characters are available. Now, lift your finger and touch slightly above where you just were. If you hit the field where the new characters popped up, you can explore around like on the normal keys and lift to enter. If you move outside, however, the popup goes away, and you have to start over. This is also a bit of a point of criticism: the learning curve at the beginning is a bit steep, and one can dismiss the popup by accident very easily. But hey, I can enter umlauts now! 🙂

Another point is that the umlaut characters are sometimes announced with funny labels. The German a umlaut (ä), for example, is said to have a label of “a with threema”. The same goes for the o and u umlauts. But once one gets used to these, entering umlauts is not really slower than on iOS, unless you use the much narrower iOS keyboard that has the umlauts already present.

This new version of the keyboard also has some other nice additions: if Android is going to auto-correct what you typed, the space key and many relevant punctuation keys will now announce that your text will be changed to the auto-corrected text once you enter that particular symbol. This is really nice, since you can immediately explore to the top of the keyboard and maybe find another auto-correct suggestion, or decide to keep your original instead.

The new version of the keyboard was not offered to me as a regular software update in the Play Store for some reason. Maybe because I had never installed a Play Store version before. I had to explicitly bring it up in Google Play and tell the device to update it. If you need it, you can browse to the link above on the desktop and tell Google Play to send it to your relevant devices. 😉

Two points gone!

Editing text

The next point of criticism I had was the lack of ability to control the cursor reliably and to use text editing commands. Guess what? TalkBack has improved this a lot. When in a text field, the local context menu (swipe up then right in one motion) now includes a menu for cursor control. It allows you to move the cursor to the beginning or the end, select all, and enter selection mode. Once in selection mode, swiping left and right will actually select and deselect text. TalkBack is very terse here, so you have to remember what you selected and what you deselected, but with a bit of practice, it is possible. Afterwards, once some text is selected, the cursor control menu contains not only an item to end selection mode, but also ones to copy or cut the text you selected. Once something is in the clipboard, a Paste option becomes available, too. Yay! That’s another point taken off the list of criticism!

Continuous reading of lists

TalkBack added the ability to read lists continuously by scrolling them one screen forward or back once you swipe past the last or first currently visible item. In fact, this was added not long after my original post. This is another point that can be taken off my list.

The German railway navigator app

This app has become accessible. A new version released late last year unified the travel planning and ticketing apps, and one can now find connections, book tickets, and legitimize oneself on the train, from start to finish on Android, just as has been possible on iOS. Another point off my list!

eBooks

While the Google Play Books situation hasn’t changed much, the Amazon Kindle app became fully accessible on Android, closely following the lead of its iOS counterpart. This is not really a surprise, since FireOS, Amazon’s fork of Android, became much more accessible in its 3.0 release in late 2013, and that included the eBook reader.

Calendaring

This was originally one of the problematic points left; however, comments here and on the eyes-free list led to a solution that is more than satisfactory. I originally wrote:

The stock Calendar app is still confusing as hell. While it talks somewhat better, I still can’t make heads or tails of what this app wants to tell me. I see dates and appointments intermixed in a fashion that is not intuitive to me. Oh yes, it talks much better than it used to, but the way this app is laid out confuses the hell out of me. I’m open to suggestions for a really accessible calendar app that integrates nicely with the system calendars coming from my Google account.

As Ana points out in a reply on eyes-free, the solution here is to switch the Calendar to Agenda View. One has to find the drop-down list at the top left of the screen, which irritatingly says “August 2014” (or whatever month and year one is in), and which led me to believe it would open a date picker of some sort. But it actually opens a small menu, and the bottom-most item is the Agenda View. Once switched to it, appointments are nicely sorted by day, their lengths are given, and all other relevant info is there, too. Very nice! And actually quite consistent with the way Google Calendar works on the web, where Agenda View is also the preferred view.

And if one’s calendaring needs aren’t covered by the Google Calendar app, Business Calendar was recommended to me as an alternative.

Currency recognition

Originally, I wrote:

The Ideal Currency Identifier still only recognizes U.S. currency, and I believe that’ll stay this way, since it hasn’t seen an update since September 2012. Any hints for any app that recognizes Canadian or British currencies would be appreciated!

I received three suggestions: Darwin Wallet, Blind-Droid Wallet, and Goggles. All three work absolutely great for my purposes!

Navigation and travel

I wrote originally:

Maps are still not as accessible as on iOS, but the Maps app got better over time. I dearly miss an app like BlindSquare that allows me to identify junctions ahead and other landmarks in my surroundings to get properly oriented. I’m open to suggestions for apps that work internationally and offer a similar feature set on Android!

I received several suggestions, among them DotWalker and OsmAnd. I haven’t yet had a chance to test either, but will do so over the course of the next couple of days. The feature sets definitely sound promising!

One thing I definitely miss when comparing Google Maps to Apple Maps accessibility is the ability to explore the map with my finger. iOS has had this feature since version 6, released in 2012: you can explore the map on the touch screen and not only get spoken feedback, but also trace streets and junctions with your finger, getting an exact picture of whichever area you’re looking at. I haven’t found anything on Android yet that matches this functionality, so I’ll probably keep an iPad around and tether it if I need to look at maps, in case my next main phone is indeed an Android one.

Ana also recommended The vOICe for Android as an additional aid. However, this thing is so much more powerful in other respects, too, that it probably deserves an article of its own. I will definitely have a look (or listen) at this one as well!

Other improvements not covered in the original post

Android has seen some other improvements that I did not specifically cover in my original post, but which are worth mentioning. For example, the newer WebViews became more accessible in some apps, one being the German railway company app I mentioned above. Some apps like Gmail even incorporate something that sounds a lot like ChromeVox. The trick is to swipe to that container rather than touch it; then ChromeVox gets invoked properly. With that, the Gmail app is as usable on Android now as the iOS version has been since a major update earlier this year. It is also no longer necessary to enable script improvements for web views in the Accessibility settings; this is done transparently now.

Other apps like Google+ actually work better on Android even than on iOS. 😉

Other third-party apps like Facebook and Twitter have also seen major improvements to their Android accessibility stories over the past year, in large part due to the official formation of dedicated accessibility teams at those companies that train the engineers to deliver a good accessible experience in the new features they introduce.

And one thing should also be mentioned positively: Chrome is following the fox’s lead and now supports proper Explore By Touch and other truly native Android accessibility integration. This is available in Chrome 35 and later.

Things that are still bummer points

There is really only one item from the blockquoted items of my original post that is still problematic, and that is the Gallery. The built-in Camera on my Nexus 5 indeed talks much better than it used to; I can now find out whether I’m taking a picture, shooting a video, etc. The problematic part comes when I want to go back later and share the media I took. The Gallery doesn’t give me any info on the different items I have there. This also affects apps like ABBYY TextGrabber: if I wanted to choose a photo from the gallery, I’d be just as lost.

However, as Piotr comments below (on the original post content), other device manufacturers use different skins or apps for the same purpose, and it seems that the Camera and Gallery on the Samsung Galaxy S5 are much more powerful in the accessibility sense than the stock Google-provided ones on the Nexus 5.

In conclusion

Over the past year, Android has become a much more viable alternative for me and possibly other users. The gap is closing, and that is a good thing for user choice and market diversity! Show stoppers? No, there aren’t really any anymore. The stuff with the Camera, or rather the Gallery, is an annoyance, but taking pictures is something I do occasionally, not regularly. So yes, I can now see myself using an Android device, be it a Nexus 5 or maybe a modern Samsung device, as my primary phone.

Welcoming your comments!

Accessibility in Firefox for Android: Some more technical background, Part II

A long while back, I wrote a post explaining some of the more technical details of the implementation of the accessibility in Firefox for Android. If you want to read the whole post, feel free to do so and then come back here, but for those of you who don’t, here is a short recap of the most important points:

  1. What made accessibility possible at all in the first place was the fact that Firefox for Android switched to a native Android UI instead of a custom XUL one.
  2. The only thing that needed to be made accessible was the custom web view we’re using; all the rest of the browser UI gained accessibility from using native Android widgets.
  3. The switch to a native UI also gave us the possibility to talk directly to TalkBack and other assistive technology apps.
  4. At the core is the well-known accessibility API also used on the desktop, written in C++. On top of that sits a JavaScript layer, code-named AccessFu, which pulls information from that core and generates TalkBack events to make everything speak (a rough sketch of such a bridge follows this list). It also receives the keyboard commands from the Android side and, as we’ll see below, has been substantially extended to also handle touch gestures.
  
  5. There is now also an extended layer of accessibility code on the native Android layer, which I’ll come to below.
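
To make item 4 a bit more concrete, here is a minimal, hypothetical sketch of the Java side of such a bridge. The class name AccessFuBridge and the announce method are invented for illustration; only the AccessibilityEvent and AccessibilityManager APIs are real Android, and the actual Firefox code is structured quite differently.

    import android.content.Context;
    import android.view.View;
    import android.view.accessibility.AccessibilityEvent;
    import android.view.accessibility.AccessibilityManager;

    // Hypothetical bridge: the JavaScript layer (AccessFu) hands us an
    // utterance, and we wrap it in an AccessibilityEvent so TalkBack speaks it.
    class AccessFuBridge {
        private final View webView; // the custom web view hosting Gecko content
        private final AccessibilityManager manager;

        AccessFuBridge(Context context, View webView) {
            this.webView = webView;
            this.manager = (AccessibilityManager)
                    context.getSystemService(Context.ACCESSIBILITY_SERVICE);
        }

        // Called whenever the JavaScript layer wants something spoken.
        void announce(String utterance) {
            if (!manager.isEnabled())
                return; // no assistive technology running, nothing to do
            AccessibilityEvent event = AccessibilityEvent.obtain(
                    AccessibilityEvent.TYPE_ANNOUNCEMENT); // API 16+
            event.setPackageName(webView.getContext().getPackageName());
            event.getText().add(utterance);
            // Route the event through the view hierarchy to the system.
            webView.getParent().requestSendAccessibilityEvent(webView, event);
        }
    }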

Making Explore By Touch work

After that last post, we had to make Explore By Touch work, both for Ice Cream Sandwich and, as it was released shortly after, for the all-new Jelly Bean accessibility enhancements and beyond. For that, the JavaScript layer receives touch events from the Android side of things and queries the core engine for the element at the touch point. Gestures like tapping, swiping, double-tapping, and two-finger scrolling are also received and processed accordingly.
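
In Android terms, exploring works via hover events. The following sketch shows the principle; the hitTest placeholder stands in for the query down into the JavaScript layer and is invented for illustration, so this is not Firefox’s actual code.

    import android.content.Context;
    import android.view.MotionEvent;
    import android.view.View;
    import android.view.accessibility.AccessibilityEvent;

    // Sketch: as the finger moves across the view, ask the content layer
    // which element is underneath and let TalkBack speak it on changes.
    class ExplorableWebView extends View {
        private int currentId = -1; // virtual id of the element under the finger

        ExplorableWebView(Context context) {
            super(context);
        }

        @Override
        public boolean onHoverEvent(MotionEvent event) {
            if (event.getAction() == MotionEvent.ACTION_HOVER_MOVE) {
                int id = hitTest(event.getX(), event.getY());
                if (id != -1 && id != currentId) {
                    currentId = id;
                    // TalkBack speaks the element when it sees hover-enter.
                    sendAccessibilityEvent(AccessibilityEvent.TYPE_VIEW_HOVER_ENTER);
                }
                return true;
            }
            return super.onHoverEvent(event);
        }

        // Placeholder: would ask the JavaScript layer for the accessible
        // element at (x, y) and return its virtual id, or -1 for none.
        private int hitTest(float x, float y) {
            return -1;
        }
    }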

For Jelly Bean and beyond, we had to do a special trick to make left and right swipes work. Because we implement everything ourselves, we have to fake previous and next accessible elements, making TalkBack and others believe they have true native accessible elements to the left and right of the current one. In effect, TalkBack only ever knows about three accessibles on the page, and the currently spoken one is always the one in the middle. When the user swipes right to go to the next element, the element on the right becomes the new middle one and gets spoken; at the same time, the previous middle one becomes the one on the left, and the next regular page element is fetched to become the new right element. This particular logic sits in our Android accessibility code and queries the JavaScript portion for the relevant accessibles.
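
The standard Android mechanism for a virtual hierarchy like this is an AccessibilityNodeProvider, so here is a rough sketch of the three-node illusion under that assumption. The ids and the fetchTextFromJs helper are made up for illustration and do not reflect the real Firefox implementation.

    import android.view.View;
    import android.view.accessibility.AccessibilityNodeInfo;
    import android.view.accessibility.AccessibilityNodeProvider;

    // Sketch: TalkBack only ever sees three virtual children. On a swipe,
    // the roles shift: NEXT becomes CURRENT and is spoken, the old CURRENT
    // becomes PREVIOUS, and a fresh element is fetched into NEXT.
    class ThreeNodeProvider extends AccessibilityNodeProvider {
        static final int PREVIOUS = 1, CURRENT = 2, NEXT = 3;
        private final View host; // the web view owning this virtual tree

        ThreeNodeProvider(View host) {
            this.host = host;
        }

        @Override
        public AccessibilityNodeInfo createAccessibilityNodeInfo(int virtualViewId) {
            if (virtualViewId == View.NO_ID) {
                // The root node reports our three virtual children.
                AccessibilityNodeInfo root = AccessibilityNodeInfo.obtain(host);
                root.addChild(host, PREVIOUS);
                root.addChild(host, CURRENT);
                root.addChild(host, NEXT);
                return root;
            }
            AccessibilityNodeInfo node = AccessibilityNodeInfo.obtain(host, virtualViewId);
            node.setParent(host);
            node.setVisibleToUser(true);
            node.setText(fetchTextFromJs(virtualViewId)); // content from the JS layer
            return node;
        }

        // Placeholder: would query the JavaScript layer for the page element
        // currently playing this role (previous, current, or next).
        private String fetchTextFromJs(int virtualViewId) {
            return "";
        }
    }

The host view would then return this provider from getAccessibilityNodeProvider(), so TalkBack queries it instead of the real view hierarchy.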

The fact that we have so much control over this (in fact, we have to do it this way or it wouldn’t work) allowed us to port the swiping back to Ice Cream Sandwich, a.k.a. Android 4.0, which doesn’t natively support it, and even to Gingerbread, a.k.a. Android 2.3, which doesn’t have Explore By Touch support at all. But in the Firefox web views, it works, including swiping, double-tapping, two-finger scrolling, etc. Unfortunately, there was no way to also make the rest of the browser UI accessible by touch on Gingerbread, so you’ll still have to use the keyboard for that.

On Jelly Bean and above, we are restricted a bit by which gestures Android supports. In effect, we cannot change the swiping, double-tapping, exploring, or two-finger scrolling behavior. We also cannot change what happens when the user swipes up and down. In those instances, Android expects a certain behavior, and we have to provide it. This is why, despite popular request, we currently cannot change the three-finger left and right swipes for quick navigation to something more comfortable to execute. Single-finger swipes left and right are strictly reserved for single-object navigation. We’d love it to be different, but we’re bound in this case.

BrailleBack support

Some of the above techniques were used, in a slightly different fashion, to also implement BrailleBack support. As with TalkBack and friends, we have to implement everything ourselves. You have to implement two classes: com.googlecode.eyes-free.selfbraille.SelfBrailleClient and com.googlecode.eyes-free.selfbraille.WriteData. This isn’t documented anywhere. Our summer intern Max Li did a terrific job dissecting the BrailleBack code and grabbing the relevant pieces from the Google Code project, and the result can be seen in Firefox for Android 25 and later. Max also added separate braille utterances, so the output isn’t as verbose as it is for speech and follows a logic better suited to braille readers. A few weeks ago, a review of using Android with braille was posted on Chris Hofstader’s blog by a deaf-blind user, highlighting how well he could work with Firefox for Android, in large part because of its excellent braille support. To reiterate: Max was a summer intern at Mozilla last year. He is sighted and, as far as I know, had never been in contact with braille before. He implemented all of this by himself, only occasionally asking me for feedback. Imagine that, and then getting this review! I’m proud of you, Max!
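
To give an impression of what talking to the self-braille service looks like, here is a small sketch. Since these classes are undocumented, the package name and signatures below are assumptions reconstructed from the BrailleBack sources, so treat this as an approximation rather than a reference.

    import android.content.Context;
    import android.view.View;

    // Assumed package and signatures, reconstructed from the BrailleBack
    // sources on Google Code; they may differ in current releases.
    import com.googlecode.eyesfree.braille.selfbraille.SelfBrailleClient;
    import com.googlecode.eyesfree.braille.selfbraille.WriteData;

    class BrailleOutput {
        private final SelfBrailleClient client;

        BrailleOutput(Context context) {
            // "false": only talk to a release-signed BrailleBack service.
            client = new SelfBrailleClient(context, false);
        }

        // Push a braille-specific utterance for the given view to the display.
        void show(View view, String brailleText) {
            WriteData data = WriteData.forView(view);
            data.setText(brailleText);
            client.write(data);
        }
    }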

And Max didn’t stop there: Firefox 29 and above include improvements to the way unordered and numbered lists are presented in braille.

Much of all of this is good for Firefox OS, too

Because of the layered nature of our code, much of what has been implemented for Firefox for Android can be re-used in Firefox OS as well. The differences are mainly in what comes on top of the JavaScript/AccessFu layer: talking to a speech synthesizer directly instead of generating TalkBack events, of course using the new WebSpeech API, and in the future we’ll also “only” need to implement support for BrlTTY or something similar, plus connectivity for braille displays. The abstraction allows us to then put the utterances to good use there as well. The main problem we’re having with Firefox OS right now is the actual UI, written in HTML, JavaScript, and CSS, code-named Gaia. Getting all the screens right so you cannot swipe to where you’re not supposed to at any given moment, making everything react to the proper activation events, and teaching the Gaia team a lot about accessibility along the way are the major tasks we’re working on right now. But thanks to the layering of the accessibility APIs and implementations, the transition from Firefox for Android, though not trivial, was not the biggest of our problems on Firefox OS. 🙂

In summary

The Android API documentation was not much help with any of this. As mentioned, the portion about SelfBrailleClient and friends isn’t documented anywhere at all; at least, a Google search turned up nothing but source-code references, among them those of Firefox for Android. And the exact expectations of TalkBack aren’t retrievable just by studying the docs, either. Eitan had to dive into TalkBack’s source code to actually understand what it expected us to deliver to make it all work.

Here’s hoping that Google, despite its quest to close-source more and more of Android, will keep BrailleBack and TalkBack open source so we can continue to read their source code in future updates and keep providing the good support our users have come to expect from us.