Accessibility in Firefox for Android - Some more technical details

In my previous blog post, I focused on the user-facing aspects of the new accessibility features we’re currently building into Firefox for Android. This blog post is about the more technical details of this support.

First, the reason we’ve come as far as we have in such a short time is that, in November last year, the mobile team at Mozilla took the leap and moved Firefox away from a XUL-based user interface to one based on native Android widgets.

The first thing this gave us was the accessibility of the native browser UI for free. All that needs to be taken care of is that graphical buttons and other UI elements follow the Android accessibility guidelines. This includes Explore By Touch, a new feature of Ice Cream Sandwich, which simply works out of the box!

What this also gave us is better access from JavaScript to the TalkBack feature of Android accessibility, allowing us to easily generate so-called utterances, which are what TalkBack actually speaks.

So while the browser UI was already accessible for the most part, what we had to do was implement accessibility for the web content area. This is a custom view which is not accessible by default. Fortunately, our architecture allows us to easily interface with both the Java pieces of the Android framework as well as our internal APIs, which include accessibility.

To give blind users access to all web content, we had to implement a method of navigation that allows the user to move not only to focusable items such as links or form fields, but to any useful element on a page: a paragraph, a heading, a graphic with interesting alternative text, and so forth. Because Firefox for Android does not need to provide this by default, this special mode of navigation is active only when accessibility is enabled. Our engine detects whether TalkBack is running, or whether something else has turned on accessibility within Android, and we react by changing the navigational paradigm accordingly.

The magic behind the navigation is no magic at all really: When accessibility is enabled, our keyboard interceptor goes into a special mode where the module for accessibility, internally called AccessFu, is invoked and does the right thing for each navigational direction.
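To illustrate, the interception step might look something like the sketch below. All names here are made up for the example and are not the actual Fennec internals:

```javascript
// Sketch with hypothetical names: when accessibility is enabled,
// directional key events are diverted to the AccessFu module instead
// of the default content handler.
const KeyInterceptor = {
  accessibilityEnabled: false, // flipped when TalkBack etc. is detected

  handleKey(direction, defaultHandler, accessFu) {
    if (this.accessibilityEnabled) {
      accessFu.handleDirection(direction); // AccessFu does the right thing
      return true;  // event consumed by the accessibility layer
    }
    defaultHandler(direction); // normal browsing behavior
    return false;
  },
};

// Usage: with accessibility on, a "right" event goes to AccessFu.
KeyInterceptor.accessibilityEnabled = true;
KeyInterceptor.handleKey("right", () => {}, {
  handleDirection(d) { console.log("AccessFu handles:", d); },
});
```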

When the module receives a directional key press event, it makes a call into our internal accessibility APIs to navigate in the given direction. The current position is the basis for movement. The following rules apply:

  • If the directional controller is moved upwards, focus leaves the web content and is returned to the native browser UI. At that moment, handlers for the native UI take over and provide the focus movement logic.
  • If, from that UI, a downward directional event is received, focus is returned to the web content. If accessibility is on, we detect it, and AccessFu takes over again. The user is returned to the last position known before leaving the web content. In essence, the position is saved, and the user does not have to start over from the beginning of the document.
  • If a right or left directional event is received, a call is made into our internal API, nsIAccessiblePivot, asking for the next or previous element to move to. Since we’re in accessibility mode, this is any screen-reader-readable element. It may be a paragraph, a heading, a graphic, or something that would otherwise be considered focusable, too, such as a text field, another form field, a button, or a link.
  • If the user presses down on the directional controller, a click, jump, or other associated action is performed on the item. A link gets activated, a button clicked, a checkbox toggled etc.
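The rules above can be sketched in JavaScript. The names are illustrative, with `pivot` standing in for nsIAccessiblePivot and `chrome` for the native UI focus logic:

```javascript
// Sketch (hypothetical names) of the directional rules described above.
function handleDirection(direction, state, pivot, chrome) {
  switch (direction) {
    case "up":     // leave web content for the native browser UI
      state.savedPosition = state.position; // remember where we were
      chrome.takeFocus();                   // native handlers take over
      break;
    case "down":   // native UI sent focus back into content:
      state.position = state.savedPosition; // resume, don't restart
      break;
    case "left":
      state.position = pivot.movePrevious(state.position);
      break;
    case "right":
      state.position = pivot.moveNext(state.position);
      break;
    case "press":  // press on the directional controller
      state.position.doDefaultAction(); // click link, toggle checkbox, ...
      break;
  }
}
```

In the real module the "down" case is only reached via the native UI handlers, and the default action comes from the accessible object itself.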

nsIAccessiblePivot is a new interface introduced to our accessibility core APIs that performs the actual search for the next or previous element, based on the given traversal rule, and returns the object that is to be navigated to. This class walks the internal accessibility tree. It is therefore fast as hell and returns the result almost instantly.
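To give an idea of how such a pivot works, here is a toy version that walks a plain object tree. The real nsIAccessiblePivot operates on Gecko’s internal accessibility tree with pluggable traversal rules, so everything below is a simplified stand-in:

```javascript
// Toy pivot: find the next element in document order that matches a rule.
function flatten(node, out = []) {
  out.push(node); // pre-order walk: parent before children
  (node.children || []).forEach(child => flatten(child, out));
  return out;
}

function moveNext(root, current, rule) {
  const nodes = flatten(root);
  for (let i = nodes.indexOf(current) + 1; i < nodes.length; i++) {
    if (rule(nodes[i])) return nodes[i]; // first match after current
  }
  return null; // end of document
}

// A rule loosely resembling "screen-reader readable element".
const readable = node =>
  ["paragraph", "heading", "graphic", "link", "entry", "button"]
    .includes(node.role);

const tree = {
  role: "document",
  children: [
    { role: "section", children: [{ role: "heading" }, { role: "paragraph" }] },
    { role: "link" },
  ],
};

console.log(moveNext(tree, tree, readable).role); // "heading"
```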

The returned result is an accessible object, which AccessFu in turn queries for its name, role, states, action name etc. An utterance, which is nothing other than the phrase TalkBack should speak for this element, is put together and passed on to TalkBack for speaking. AccessFu itself is written in JavaScript; it communicates on the one hand with the core accessibility APIs, which are written in C++, and on the other with the Java-based TalkBack/Android accessibility interface.
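As a rough sketch, assuming a made-up shape for the accessible object, utterance construction might look like this:

```javascript
// Sketch: compose the phrase TalkBack should speak from an accessible
// object's properties. The object shape here is invented for illustration.
function buildUtterance(accessible) {
  const parts = [];
  if (accessible.name) parts.push(accessible.name);   // e.g. label text
  parts.push(accessible.role);                        // e.g. "check box"
  if (accessible.states) parts.push(...accessible.states); // e.g. "checked"
  return parts.join(", ");
}

const checkbox = { name: "Subscribe", role: "check box", states: ["checked"] };
console.log(buildUtterance(checkbox)); // "Subscribe, check box, checked"
```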

Just one more thing: What’s ticking at the very core of these accessibility APIs is the exact same engine that you are familiar with from the desktop. So if you are developing with accessibility in mind and use proper semantics in HTML, JavaScript and CSS, be assured that this will be interpreted by our engine for mobile in the same way as it is for the desktop. So all rules for semantically correct HTML apply in the same fashion as they do for the desktop: Provide alt texts for graphics, associate labels with form controls, use headings, use WAI-ARIA, etc., etc. This is also true if you develop web sites that will be displayed on iOS devices, by the way. Any accessibility engine on a mobile platform gains most from semantically correct HTML, so please use it!
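To make that concrete, here is an illustrative snippet of the kind of markup that any mobile accessibility engine benefits from:

```html
<!-- Alternative text lets the engine describe the graphic: -->
<img src="chart.png" alt="Sales doubled between 2010 and 2011">

<!-- An associated label is spoken when the field gets focus: -->
<label for="email">E-mail address</label>
<input type="email" id="email">

<!-- Headings give users anchors to navigate by: -->
<h2>Shipping options</h2>

<!-- WAI-ARIA landmarks identify regions of the page: -->
<div role="navigation" aria-label="Site menu">
  <a href="/">Home</a>
</div>
```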

So what else is to come? We still have a number of plans we want to realize. First and foremost, we want to get Explore By Touch working inside web content as well. Next up are convenience features that make browsing more efficient: heading and form field navigation, navigation by landmarks, and other features you might be familiar with from desktop screen readers or VoiceOver on iOS. As we iron out the initial kinks of our accessibility support, you will see these appear in Fennec (the code name for Firefox for Android) nightly builds over the next couple of weeks.

You might now think: “Hey wait, these sound a lot like screen reader features!” And guess what? You’re right! The way the Android accessibility APIs are designed, especially when it comes to custom views, you often have no choice but to implement at least partial screen reader capabilities yourself. From our standpoint within the web content area, TalkBack is actually not much more than our bridge to the speech synthesizer. Outside the web content, however, TalkBack is more of a screen reader itself.

I would like to conclude this blog post with a big thank you to Eitan, who has designed the pivot interface and put all this clever architecture together in about half a year. Considering where he started from when Fennec was still fully XUL-based, I think it’s safe to say that we’ve come a long way since then. The whole accessibility team at Mozilla has been providing valuable input to this effort, but Eitan is the driving force behind this. The overwhelmingly positive feedback we received on my blog and on the Eyes-Free mailing list is rewarding and motivating to make accessible Firefox for Android even better, making it the most accessible browser on Android out of the box!
