Accessibility in Firefox for Android: Some more technical background, Part II

A long while back, I wrote a post explaining some of the more technical details of the accessibility implementation in Firefox for Android. If you want to read the whole post, feel free to do so and then come back here, but for those of you who don’t, here is a short recap of the most important points:

  1. What made accessibility possible in the first place was the fact that Firefox for Android moved to a native Android UI instead of a custom XUL one.
  2. The only thing that needed to be made accessible was the custom web view we’re using; the rest of the browser UI gained accessibility from using native Android widgets.
  3. The switch to a native UI also made it possible to talk directly to TalkBack and other assistive technology apps.
  4. At the core is the well-known accessibility API also used on the desktop, written in C++. On top of that sits a JavaScript layer, code-named AccessFu, which pulls information from the core and generates TalkBack events to make everything speak (see the sketch after this list). It also receives keyboard commands from the Android side and, as we’ll see below, has been substantially extended to handle touch gestures as well.
  5. There is now also an extended layer of accessibility code on the native Android side, which I’ll come to below.
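To make point 4 a bit more concrete, here is a minimal sketch, not the actual Firefox code, of what the very bottom of that pipeline can look like: the JavaScript layer hands the Android side a string, and the Android side turns it into an accessibility event for TalkBack to speak. The class and the handleAccessFuUtterance entry point are hypothetical stand-ins.

```java
import android.content.Context;
import android.view.View;
import android.view.accessibility.AccessibilityManager;

// Minimal sketch: relay an utterance from the JavaScript layer (AccessFu)
// to TalkBack. UtteranceRelay and handleAccessFuUtterance are hypothetical
// stand-ins; the real code builds much richer accessibility events.
class UtteranceRelay {
    static void handleAccessFuUtterance(View webView, String utterance) {
        AccessibilityManager manager = (AccessibilityManager) webView.getContext()
                .getSystemService(Context.ACCESSIBILITY_SERVICE);
        if (manager != null && manager.isEnabled()) {
            // announceForAccessibility (API 16+) wraps the event plumbing;
            // older versions need a hand-built AccessibilityEvent instead.
            webView.announceForAccessibility(utterance);
        }
    }
}
```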

Making Explore By Touch work

After that last post, we had to make Explore By Touch work, both for Ice Cream Sandwich and, since it was released shortly after that post, for the all-new accessibility enhancements of Jelly Bean and beyond. For that, the JavaScript layer receives touch events from the Android side of things and queries the core engine for the element at the touch point. Gestures such as tapping, swiping, double-tapping, and two-finger scrolling are received and processed in the same way.
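On the Android side, the starting point is the hover events that Explore By Touch generates. A minimal sketch, again not the actual Firefox code, of forwarding those to the JavaScript layer for hit testing might look like this; GeckoBridge is a hypothetical stand-in for the real message channel:

```java
import android.content.Context;
import android.view.MotionEvent;
import android.view.View;

// Sketch: forward Explore By Touch hover events to the script layer, which
// hit-tests the page and speaks whatever element is under the finger.
class AccessibleWebView extends View {
    AccessibleWebView(Context context) {
        super(context);
    }

    @Override
    public boolean onHoverEvent(MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_HOVER_ENTER:
            case MotionEvent.ACTION_HOVER_MOVE:
                // Ask the JavaScript layer which accessible sits at this point.
                GeckoBridge.postMessage("Accessibility:ExploreByTouch",
                        event.getX(), event.getY());
                return true;
            default:
                return super.onHoverEvent(event);
        }
    }
}

class GeckoBridge {
    // Hypothetical stand-in for the Android-to-JavaScript message channel.
    static void postMessage(String name, float x, float y) { /* ... */ }
}
```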

For Jelly Bean and beyond, we had to do a special trick to make left and right swipes work. Because we’re implementing everything ourselves, we have to fake previous and next accessible elements, making TalkBack and others believe they have true native accessible elements to the left and right of the current element. TalkBack, in effect, only knows about three accessibles on the page. The currently spoken one is always the one in the middle of these three. When the user swipes right to go to the next element, the element to the right becomes the new middle one, gets spoken, and at the same time, the previous middle one becomes the one to the left, and the next regular page element is fetched and becomes the new right element. This particular logic sits in our Android accessibility code and queries the JavaScript portion for the relevant accessibles.
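Here is a rough sketch of how such a three-element window can be exposed through Android’s AccessibilityNodeProvider. This is a simplified illustration under my own names, not the actual Firefox implementation; the two helpers at the bottom stand in for the queries into the JavaScript layer:

```java
import android.os.Bundle;
import android.view.View;
import android.view.accessibility.AccessibilityNodeInfo;
import android.view.accessibility.AccessibilityNodeProvider;

// Sketch of the "three virtual accessibles" trick: TalkBack only ever sees
// a previous, a current, and a next element. The helpers at the bottom are
// hypothetical stand-ins for queries into the AccessFu JavaScript layer.
class VirtualCursorProvider extends AccessibilityNodeProvider {
    private static final int VIRTUAL_PREVIOUS = 1;
    private static final int VIRTUAL_CURSOR = 2; // the element being spoken
    private static final int VIRTUAL_NEXT = 3;

    private final View mHost;

    VirtualCursorProvider(View host) {
        mHost = host;
    }

    @Override
    public AccessibilityNodeInfo createAccessibilityNodeInfo(int virtualViewId) {
        if (virtualViewId == View.NO_ID) {
            // The host view reports exactly three virtual children.
            AccessibilityNodeInfo info = AccessibilityNodeInfo.obtain(mHost);
            mHost.onInitializeAccessibilityNodeInfo(info);
            info.addChild(mHost, VIRTUAL_PREVIOUS);
            info.addChild(mHost, VIRTUAL_CURSOR);
            info.addChild(mHost, VIRTUAL_NEXT);
            return info;
        }
        // One of the three virtual accessibles, filled with page content.
        AccessibilityNodeInfo info = AccessibilityNodeInfo.obtain(mHost, virtualViewId);
        info.setParent(mHost);
        info.setText(queryJsLayerForText(virtualViewId));
        return info;
    }

    @Override
    public boolean performAction(int virtualViewId, int action, Bundle arguments) {
        if (action == AccessibilityNodeInfo.ACTION_ACCESSIBILITY_FOCUS
                && virtualViewId == VIRTUAL_NEXT) {
            // TalkBack moved right: the old middle element becomes the left
            // one, and a fresh element is fetched to become the new right.
            advanceVirtualCursor();
            return true;
        }
        return false;
    }

    private CharSequence queryJsLayerForText(int virtualViewId) {
        // Hypothetical: would query AccessFu for the element's utterance.
        return "";
    }

    private void advanceVirtualCursor() {
        // Hypothetical: would ask AccessFu to move its virtual cursor forward.
    }
}
```

The host view would return this provider from getAccessibilityNodeProvider(), so TalkBack only ever sees the host plus these three virtual children.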

The fact that we have so much control over this (in fact, we have to do it this way or it wouldn’t work) allowed us to port the swiping back to Ice Cream Sandwich, AKA Android 4.0, which doesn’t natively support it, and even to Gingerbread, AKA Android 2.3, which has no Explore By Touch support at all. But in the Firefox web views, it works, including swiping, double-tapping, two-finger scrolling and so on. Unfortunately, there was no way to also make the rest of the browser UI accessible by touch on Gingerbread, so you’ll still have to use the keyboard for that.
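On those older versions, the gesture detection itself also has to happen in our own code. A sketch of one way to do that with Android’s stock GestureDetector; AccessFuProxy is a hypothetical stand-in for the call into the JavaScript layer, and the velocity threshold is purely illustrative:

```java
import android.view.GestureDetector;
import android.view.MotionEvent;

// Sketch: detect left/right swipes ourselves on Android versions without
// Explore By Touch, and translate them into virtual cursor movement.
class SwipeGestures extends GestureDetector.SimpleOnGestureListener {
    private static final float SWIPE_VELOCITY = 500; // px/s, illustrative

    @Override
    public boolean onFling(MotionEvent e1, MotionEvent e2,
                           float velocityX, float velocityY) {
        if (Math.abs(velocityX) > Math.abs(velocityY)
                && Math.abs(velocityX) > SWIPE_VELOCITY) {
            // Horizontal fling: swipe right moves to the next element,
            // swipe left to the previous one.
            AccessFuProxy.moveCursor(velocityX > 0 ? "next" : "previous");
            return true;
        }
        return false;
    }
}

class AccessFuProxy {
    // Hypothetical stand-in for messaging the AccessFu JavaScript layer.
    static void moveCursor(String direction) { /* ... */ }
}
```

The web view would create a GestureDetector with this listener and feed it every MotionEvent from onTouchEvent().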

On Jelly Bean and above, we are restricted a bit by what gestures Android supports. In effect, we cannot change the swiping, double-tapping, exploring, or two-finger scrolling behavior. We also cannot change what happens when the user swipes up and down. In those instances, Android expects a certain behavior, and we have to provide it. This is why, despite popular request, we currently cannot change the three-finger swipes left and right to something more comfortable for executing quick navigation. Single-finger swipes left and right are strictly reserved for single-object navigation. We’d love it to be different, but we’re bound in this case.

BrailleBack support

Some of the above techniques were used, in a slightly different fashion, to implement BrailleBack support as well. As with TalkBack and friends, we have to implement everything ourselves. Two interfaces have to be implemented: com.googlecode.eyes-free.selfbraille.SelfBrailleClient and com.googlecode.eyes-free.selfbraille.WriteData. This isn’t documented anywhere. Our summer intern Max Li did a terrific job dissecting the BrailleBack code and grabbing the relevant pieces from the GoogleCode project, and the result can be seen in Firefox for Android 25 and later. Max also added separate braille utterances, so the output isn’t as verbose as for speech and follows a logic better suited to braille readers.

A few weeks ago, a review of using Android with braille was posted on Chris Hofstader’s blog by a deaf-blind user, highlighting how well he could work with Firefox for Android, in large part because of its excellent braille support. To reiterate: Max was a summer intern at Mozilla last year. He is sighted and, as far as I know, had never been in contact with braille before this. He implemented all of this by himself, only occasionally asking me for feedback. Imagine that, and then getting this review! I’m proud of you, Max!
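For the curious, here is a rough sketch of what driving BrailleBack through that undocumented interface boils down to. The package and method names below follow the eyes-free project sources and may well differ between versions, so treat this as an illustration rather than a reference:

```java
import android.content.Context;
import android.view.View;

import com.googlecode.eyesfree.braille.selfbraille.SelfBrailleClient;
import com.googlecode.eyesfree.braille.selfbraille.WriteData;

// Sketch: send a braille-specific utterance to BrailleBack, which routes it
// to the connected braille display. Names follow the eyes-free sources and
// may differ between library versions.
class BrailleOutput {
    private final SelfBrailleClient mClient;

    BrailleOutput(Context context) {
        // The flag controls whether the signature check against the
        // BrailleBack package is skipped.
        mClient = new SelfBrailleClient(context, /* skipServerCheck */ false);
    }

    void show(View view, CharSequence brailleUtterance) {
        // WriteData bundles the text (and optionally selection information)
        // that should appear on the display for this view.
        WriteData data = WriteData.forView(view);
        data.setText(brailleUtterance);
        mClient.write(data);
    }
}
```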

And Max didn’t stop there: in Firefox 29 and above, he improved the way unordered and numbered lists are presented in braille.

Much of this is good for Firefox OS, too

Because of the layered nature of our code, much of what was implemented for Firefox for Android can be re-used in Firefox OS as well. The differences are mainly in what sits on top of the JavaScript/AccessFu layer: instead of generating TalkBack events, we talk to a speech synthesizer directly, using the new WebSpeech API, and in the future we’ll also “only” need to implement support for BrlTTY or something similar, plus connectivity for braille displays. The abstraction then allows us to put the utterances to good use there as well. The main problem we’re having with Firefox OS right now is the actual UI, written in HTML, JS, and CSS and code-named Gaia. Getting all the screens right so you cannot swipe to where you’re not supposed to at any given moment, making everything react to the proper activation events, and teaching the Gaia team a lot about accessibility along the way are the major tasks we’re working on right now. But thanks to the layering of the accessibility APIs and implementations, the transition from Firefox for Android was, though not trivial, not the biggest of our problems on Firefox OS. :)

In summary

The Android API documentation was not much help with any of this. As mentioned, the portion about SelfBrailleClient and friends isn’t documented anywhere at all; at least, I found nothing but source-code references, among them those of Firefox for Android, in a Google search. The exact expectations of TalkBack aren’t retrievable just by studying the docs, either. Eitan had to dive into TalkBack’s source code to understand what it actually expected us to deliver to make it all work.

Here’s to hoping that Google, despite its quest to close-source Android more and more, will keep BrailleBack and TalkBack open source, so we can continue to read their source code in future updates and keep providing the good support our users have come to expect from us.

What are your thoughts?