Accessibility in Firefox for Android: Some more technical background, Part II

A long while back, I wrote a post explaining some of the more technical details of the accessibility implementation in Firefox for Android. If you want to read the whole post, feel free to do so and then come back here; for those of you who don’t, here is a short recap of the most important points:

  1. What made accessibility possible in the first place was the fact that Firefox for Android switched to a native Android UI instead of a custom XUL one.
  2. The only thing that needed to be made accessible was the custom web view we’re using; all the rest of the browser UI gained accessibility from using native Android widgets.
  3. The switch to a native UI also gave us the ability to talk directly to TalkBack and other assistive technology apps.
  4. At the core is the well-known accessibility API also used on the desktop, written in C++. On top of that sits a JavaScript layer, code-named AccessFu, which pulls information from the core and generates TalkBack events to make everything speak (see the sketch right after this list). It also receives keyboard commands from the Android side and, as we’ll see below, has been substantially extended to handle touch gestures as well.
  5. There is now also a sizable amount of accessibility code on the native Android side, which I’ll come to below.
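
To make item 4 a little more concrete: on the Android side, an utterance produced by AccessFu ultimately has to surface as an accessibility event that TalkBack will speak. Below is a minimal sketch of that idea using only the stock Android APIs; the class and method names are my own illustration under that assumption, not the actual Firefox for Android code.

```java
import android.content.Context;
import android.view.View;
import android.view.accessibility.AccessibilityEvent;
import android.view.accessibility.AccessibilityManager;

// Hypothetical helper, for illustration only: turns a JavaScript-generated
// utterance into an event that TalkBack (or another screen reader) speaks.
public class SpeechBridge {
    public static void announce(View webView, String utterance) {
        Context context = webView.getContext();
        AccessibilityManager manager = (AccessibilityManager)
                context.getSystemService(Context.ACCESSIBILITY_SERVICE);
        if (manager == null || !manager.isEnabled()) {
            return; // No assistive technology running, nothing to announce.
        }

        // TYPE_ANNOUNCEMENT (API 16+) simply speaks the supplied text.
        AccessibilityEvent event =
                AccessibilityEvent.obtain(AccessibilityEvent.TYPE_ANNOUNCEMENT);
        event.setPackageName(context.getPackageName());
        event.setClassName(SpeechBridge.class.getName());
        event.getText().add(utterance);
        manager.sendAccessibilityEvent(event);
    }
}
```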

Making Explore By Touch work

After that last post, we had to make Explore By Touch work, both for Ice Cream Sandwich and, since it was released shortly afterwards, for the all-new accessibility enhancements in Jelly Bean and beyond. For that, the JavaScript layer receives touch events from the Android side and queries the core engine for the element at the touch point. Gestures like tapping, swiping, double-tapping, and two-finger scrolling are also received and processed accordingly.
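
To illustrate the plumbing: while Explore By Touch is active, Android delivers the user’s finger movements to our web view as hover events, and those are what get forwarded to the JavaScript layer for hit-testing. The sketch below shows the shape of that hand-off; the class and the bridge method are hypothetical, not the actual Firefox source.

```java
import android.content.Context;
import android.view.MotionEvent;
import android.view.View;

// Hypothetical stand-in for the custom web view, for illustration only.
// While Explore By Touch is active, Android delivers the user's finger
// movements to the view as hover events.
public class ExplorableWebView extends View {
    public ExplorableWebView(Context context) {
        super(context);
    }

    @Override
    public boolean onHoverEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_HOVER_ENTER:
            case MotionEvent.ACTION_HOVER_MOVE:
                // Hand the touch point to the JavaScript layer (AccessFu),
                // which asks the C++ core which accessible sits under the
                // finger and then presents it. This method is a made-up
                // placeholder for the Java-to-JavaScript bridge.
                dispatchExploreByTouch(event.getX(), event.getY());
                return true;
            case MotionEvent.ACTION_HOVER_EXIT:
                return true;
            default:
                return super.onHoverEvent(event);
        }
    }

    private void dispatchExploreByTouch(float x, float y) {
        // In the real browser, this would send a message across the bridge.
    }
}
```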

For Jelly Bean and beyond, we had to do a special trick to make left and right swipes work. Because we’re implementing everything ourselves, we have to fake previous and next accessible elements, making TalkBack and others believe they have true native accessible elements to the left and right of the current one. In effect, TalkBack only knows about three accessibles on the page at any given time. The currently spoken one is always the middle of these three. When the user swipes right to go to the next element, the element on the right becomes the new middle one and gets spoken; at the same time, the previous middle one becomes the one on the left, and the next regular page element is fetched and becomes the new right element. This particular logic sits in our Android accessibility code and queries the JavaScript portion for the relevant accessibles.
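
Expressed in terms of the stock Android APIs, the trick looks roughly like the following sketch: an AccessibilityNodeProvider that only ever exposes three virtual nodes and shifts that three-element window whenever accessibility focus moves forward. This is a simplified illustration with made-up helper names, not the actual Firefox for Android implementation.

```java
import android.os.Bundle;
import android.view.View;
import android.view.accessibility.AccessibilityNodeInfo;
import android.view.accessibility.AccessibilityNodeProvider;

// Rough sketch of the "three accessibles" trick: TalkBack only ever sees a
// previous, a current, and a next virtual node; which page elements sit in
// those three slots is decided by the JavaScript layer.
public class VirtualCursorProvider extends AccessibilityNodeProvider {
    private static final int ID_PREVIOUS = 1;
    private static final int ID_CURRENT = 2;
    private static final int ID_NEXT = 3;

    private final View mHost;

    public VirtualCursorProvider(View host) {
        mHost = host;
    }

    @Override
    public AccessibilityNodeInfo createAccessibilityNodeInfo(int virtualViewId) {
        if (virtualViewId == View.NO_ID) {
            // The host web view itself, parent of the three virtual children.
            AccessibilityNodeInfo info = AccessibilityNodeInfo.obtain(mHost);
            mHost.onInitializeAccessibilityNodeInfo(info);
            info.addChild(mHost, ID_PREVIOUS);
            info.addChild(mHost, ID_CURRENT);
            info.addChild(mHost, ID_NEXT);
            return info;
        }

        AccessibilityNodeInfo info = AccessibilityNodeInfo.obtain(mHost, virtualViewId);
        info.setPackageName(mHost.getContext().getPackageName());
        info.setParent(mHost);
        info.setVisibleToUser(true);
        // textForSlot() is a made-up placeholder for the call into the
        // JavaScript layer that returns the utterance for that slot.
        info.setText(textForSlot(virtualViewId));
        info.addAction(AccessibilityNodeInfo.ACTION_ACCESSIBILITY_FOCUS);
        return info;
    }

    @Override
    public boolean performAction(int virtualViewId, int action, Bundle arguments) {
        if (action == AccessibilityNodeInfo.ACTION_ACCESSIBILITY_FOCUS
                && virtualViewId == ID_NEXT) {
            // The user swiped right: the virtual cursor moves forward, the old
            // "current" becomes "previous", and a fresh "next" element is
            // fetched from the page by the JavaScript layer.
            moveVirtualCursorForward();
            return true;
        }
        return false;
    }

    private CharSequence textForSlot(int virtualViewId) {
        return ""; // Placeholder; the real text comes from AccessFu.
    }

    private void moveVirtualCursorForward() {
        // Placeholder for the message across the Java-to-JavaScript bridge.
    }
}
```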

The fact that we have so much control over this (in fact, we have to do it this way or it wouldn’t work at all) allowed us to port swiping back to Ice Cream Sandwich, a.k.a. Android 4.0, which doesn’t natively support it, and even to Gingerbread, a.k.a. Android 2.3, which doesn’t have Explore By Touch support at all. But in the Firefox web views, it works, including swiping, double-tapping, two-finger scrolling, etc. Unfortunately, there was no way to make the rest of the browser UI accessible by touch on Gingerbread as well, so you’ll still have to use the keyboard for that.

On Jelly Bean and above, we are restricted a bit by which gestures Android supports. In effect, we cannot change the swiping, double-tapping, exploring, or two-finger scrolling behavior. We also cannot change what happens when the user swipes up and down. In those instances, Android expects a certain behavior, and we have to provide it. This is why, despite popular request, we currently cannot change the three-finger swipes left and right for quick navigation to something more comfortable to execute. Single-finger swipes left and right are strictly reserved for single-object navigation. We’d love it to be different, but our hands are tied in this case.

BrailleBack support

Some of the above techniques were used, in a slightly different fashion, to implement BrailleBack support as well. As with TalkBack and friends, we have to implement everything ourselves. You have to implement two protocols: com.googlecode.eyes-free.selfbraille.SelfBrailleClient and com.googlecode.eyes-free.selfbraille.WriteData. This isn’t documented anywhere. Our summer intern Max Li did a terrific job dissecting the BrailleBack code and grabbing the relevant pieces from the Google Code project, and the result can be seen in Firefox for Android 25 and later. Max also added separate braille utterances, so the output isn’t as verbose as for speech and follows a logic better suited to braille readers.

A few weeks ago, a review of using Android with braille was posted on Chris Hofstader’s blog by a deaf-blind user, highlighting how well he could work with Firefox for Android, in large part because of its excellent braille support. To reiterate: Max was a summer intern at Mozilla last year. He is sighted and, as far as I know, had never been in contact with braille before this. He implemented this all by himself, only occasionally asking me for feedback. Imagine that, and then getting this review! I’m proud of you, Max!
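
For the curious, here is roughly what talking to BrailleBack looks like from the client side. The package and method names below follow my reading of the eyes-free sources and should be treated as assumptions rather than a documented API; this is an illustrative sketch, not the actual code Max wrote.

```java
import android.content.Context;
import android.view.View;

// Classes from the eyes-free project that BrailleBack is built on. The exact
// package and method names are assumptions based on that project's source;
// none of this is documented in the Android SDK.
import com.googlecode.eyesfree.braille.selfbraille.SelfBrailleClient;
import com.googlecode.eyesfree.braille.selfbraille.WriteData;

public class BrailleBridge {
    private final SelfBrailleClient mClient;
    private final View mWebView;

    public BrailleBridge(Context context, View webView) {
        // The second argument toggles the client's debugging mode.
        mClient = new SelfBrailleClient(context, false);
        mWebView = webView;
    }

    // Push a braille-specific representation of the current item to the
    // display. The text is a separate, terser utterance than what is spoken.
    public void brailleCurrentItem(String brailleText, int cursorPosition) {
        WriteData data = WriteData.forView(mWebView);
        data.setText(brailleText);
        data.setSelectionStart(cursorPosition);
        data.setSelectionEnd(cursorPosition);
        mClient.write(data);
    }
}
```

In Firefox, the text handed to the display comes from the separate braille utterances Max added, which is why the braille output can be terser and better structured than the speech output.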

And Max didn’t stop there: Firefox 29 and above include improvements to the way unordered and numbered lists are presented in braille.

Much of this is good for Firefox OS, too

Because of the layered nature of our code, much of what has been implemented for Firefox for Android can be re-used in Firefox OS as well. The differences are mainly in what sits on top of the JavaScript/AccessFu layer: talking to a speech synthesizer directly (using the new WebSpeech API) instead of generating TalkBack events, and, in the future, “only” having to implement support for BrlTTY or something similar, plus connectivity for braille displays. The abstraction allows us to put the utterances to good use there as well. The main problem we’re having with Firefox OS right now is the actual UI, written in HTML, JavaScript, and CSS and code-named Gaia. Getting all the screens right so you cannot swipe to where you’re not supposed to at any given moment, making everything react to the proper activation events, and teaching the Gaia team a lot about accessibility along the way are the major tasks we’re working on right now. But thanks to the layering of the accessibility APIs and implementations, the transition from Firefox for Android was, though not trivial, not the biggest of our problems on Firefox OS. 🙂

In summary

The Android API documentation was not much help with all of this. As mentioned, the portion about SelfBrailleClient and friends wasn’t documented anywhere at all; at least, a Google search turned up nothing but source-code references, among them that of Firefox for Android. The exact expectations of TalkBack also aren’t retrievable just by studying the docs; Eitan had to dive into TalkBack’s source code to actually understand what it expected us to deliver to make it all work.

Here’s to hoping that Google, despite its quest to close-source Android more and more, will keep BrailleBack and TalkBack open source so we can continue to read their source code in future updates and keep providing the good support our users have come to expect from us.

New features for TalkBack users in Firefox for Android 24

Today, I have some exciting news to share with you! Firefox for Android 24 will bring a lot of noticeable improvements for users of TalkBack and alternative Android screen readers. Here’s a round-up of those enhancements and new features:

BrailleBack support

Firefox 24 brings BrailleBack support. The braille output is not merely a mirror of the speech output, but rather a smart representation of the current and surrounding items. You can use your display’s navigational commands to navigate web content, and you can use the routing keys to activate the current item. This may activate a link, press a button, toggle a check box, or perform whatever other action an ordinary double-tap or a press of D-pad Enter would execute.

New default reading order

A new order of reading information has been introduced, which may soon become the default reading order for new installs. When it is activated and TalkBack speaks the current item, it now speaks the name, label, or text first, followed by semantic information such as the type (link, button, etc.) and the ancestry. This way, the most important information is spoken first.

If you want to try the new reading order, you can do this by following these steps:

  1. In the address bar, type about:config and press enter.
  2. In the search field, type access.
  3. Look for the entry named accessfu.utterance and double-tap it.
  4. Change its value from 0 to 1.
  5. Close the tab.

Now, your reading order will be the new one with the label of the current item first. To change back to the way it was before, follow the above steps and change the value back from 1 to 0.

Editing text

Editing text is finally possible in text inputs and text areas of web pages! You can navigate by character and word and, in text areas, by paragraph, too. With this, keeping control over what you’ve written and correcting what you’ve typed is finally possible! I know you were waiting a long time for this! 🙂

Text areas now activate

It was previously not possible to activate many text areas and get a keyboard to input text. This has been fixed. Examples of this bug could be seen on Facebook and mobile Twitter when trying to compose a status update or tweet.

Skipping of decorative images

Images which web authors clearly marked as decorative with an empty alt attribute (alt="") are now properly skipped when navigating through web content. This should eliminate a lot of unnecessary stops on graphics that carry no meaning for understanding the content.

Better speaking of rich labeled content

Firefox for Android now properly exposes explicitly provided labels of web content instead of always trying to derive the label from an element’s inner text. If an aria-label or another means of labelling an element is used, for example on Facebook or Twitter, it will now be spoken and brailled.

Popups are announced

If activating an element causes a popup to appear, this is now announced. Examples are the Friend requests, Messages, and Notifications items on Facebook.

Closing all tabs no longer breaks web content support

There was a bug that caused all web content to no longer be accessible once all tabs had been closed. You had to close and restart Firefox for Android to get TalkBack support working again. This has been fixed.

Double-tap and hold in web content

There was a bug where you could not use the double-tap-and-hold gesture to bring up the context menu for a given web element with TalkBack. This has been fixed, and you can now access all the features of those context menus.

Door hangers retain focus

There was a bug in the door hanger notifications, like the Save Password prompt, that caused focus to get lost frequently. This has been fixed.

More changes

There have been quite a few more changes under the hood that you will not notice immediately, but which all add up to a much better user experience.

A big thank you goes out to Max, who is a summer intern at Mozilla this year, and to Yura, who is a very active community contributor. And of course to Eitan, whose main work is concentrated on getting Firefox OS accessibility in shape, but who always finds the energy to keep the rest of us in check when we hack on Firefox for Android! 🙂

So when will you get it?

Firefox 24 will hit the Play Store in August as Firefox Beta, and in September as the final release. If you’re on Firefox Nightly builds, you already have all these great enhancements on your device. And if you’re using Aurora, you’ll get it in the week of June 24, 2013.

We hope that you’ll enjoy these enhancements, and we definitely value your feedback! Also, keep those bug reports coming; they are very helpful and will assist us in making Firefox for Android and Firefox OS even better!