Some do’s and don’ts we gathered from making the Firefox OS UI more accessible

In my last post, I mentioned that we learned a few things from making the Firefox OS UI, code-named Gaia, more accessible over the past few months. This produced quite a few questions about what these things were. So rather than adding them to that blog post, here’s a dedicated one to just that topic.

First, a big thank you to Yura and Eitan for their input on this. It is mostly their patches that I learned these things from; I only played with some of the code myself, and gave up when I was told how much of the CSS my changes had broken. 🙂

Use buttons or links, not clickable divs and spans

Should be pretty obvious, but you would be surprised at how many clickable divs and spans there can be inside a single web app. Why that is escapes me, really. And despite what Google’s accessibility evangelists say in their videos, the rest of the accessibility community agrees with me that clickable spans and divs instead of buttons or links are not good practice!

Buttons especially are no longer hard to style to one’s liking nowadays. Moreover, buttons are focusable by default, they respond to the disabled attribute (which also takes them out of the focus order), and you get a click mechanism for free that usually responds to mouse, keyboard and touch. And if you just add the type="button" attribute to a button element inside a form, it no longer acts as a submit button.

The second preferred choice is links. However, links don’t respond to the disabled attribute and may be harder to style. But there are places where they are definitely the appropriate element to choose and should not be replaced by anything else.
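To illustrate, here is a minimal sketch of the difference; the class name and the sendMessage handler are made up for this example:

    <!-- Avoid: a clickable div exposes no role, is not focusable,
         has no keyboard handling and no disabled state of its own. -->
    <div class="send-button" onclick="sendMessage()">Send</div>

    <!-- Prefer: a real button. It is focusable, exposes the correct role
         to screen readers, honors the disabled attribute, and type="button"
         keeps it from acting as a submit button inside a form. -->
    <button type="button" class="send-button" onclick="sendMessage()">Send</button>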

Don’t use WAI-ARIA roles in CSS selectors

Although technically possible, this is extremely bad practice in our experience. First, WAI-ARIA markup has a very specific purpose: it is there to make HTML elements accessible to screen readers when these elements are used in a way they were not originally designed for. An example is tabs. HTML has no tabs widget, so compound HTML blocks must be used to simulate them. There is a very specific path to follow for these. If CSS selectors are based on that markup and that markup needs to change, suddenly a whole bunch of CSS needs to be touched as well, or the layout will break.

In our case, WAI-ARIA roles had been inserted without actual knowledge of what they were meant for, and CSS selectors were based on that. Moreover, other parts of markup then used those CSS selectors by making simple rows of buttons inherit some of the tab styling, which made those buttons expose totally inappropriate semantics to screen readers.

So whatever you do, do not use WAI-ARIA roles or attributes as selectors for your CSS. Keep the visuals clearly separated from the screen reader-intended markup!
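A hypothetical sketch of the problem, with made-up selectors; the role stays in the markup purely for screen readers, while the styling hangs off a dedicated class:

    /* Fragile: the layout now depends on screen-reader-intended markup.
       If the role has to change, the styling breaks, and anything else
       reusing this selector inherits tab semantics it should not have. */
    #toolbar [role="tab"] {
      display: inline-block;
      border-bottom: 0.2rem solid transparent;
    }

    /* Better: style a dedicated class and leave role="tab" purely
       as information for assistive technologies. */
    #toolbar .tab-button {
      display: inline-block;
      border-bottom: 0.2rem solid transparent;
    }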

Hide inactive content properly

If your app is made in a way where every screen gets its own HTML page, you’re out of the woods. If, on the other hand, several screens are put within one HTML file, you absolutely need to make sure that content is hidden properly. Just as a sighted user only sees one screen at a time, you want to keep the screen reader user on one particular screen at a time. All touch-screen-supporting screen readers provide a sequential method of going through the available items on a screen. If you don’t hide the inactive/invisible parts of your app from the screen reader user, they are suddenly able to move to parts you don’t intend them to reach. They might even try to activate items in parts that are currently not appropriate, potentially breaking your app’s logic. We found the following methods to work best when hiding content:

  1. Use styling like display: none; or visibility: hidden; for those parts you don’t want shown. That is the most reliable method to hide content, since screen readers never expose such content. A more thorough explanation of the mechanisms and things to consider can be found in another post on this blog.
  2. If the above does not work for you for some reason, for example because you run into performance or reflow issues, resort to using aria-hidden and hide everything except the active screen, as shown in the sketch after this list.
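Here is a minimal sketch of both approaches; the class names, IDs and the showScreen helper are made up for illustration:

    <!-- Several "screens" living in one HTML document. -->
    <section id="home" class="screen active">Home screen content</section>
    <section id="settings" class="screen">Settings content</section>

    <style>
      /* Method 1: inactive screens are fully hidden. Screen readers never
         expose content that is display: none or visibility: hidden. */
      .screen:not(.active) { display: none; }
    </style>

    <script>
      // Method 2 (an alternative to the CSS rule above): if display: none
      // causes reflow or performance problems, keep inactive screens rendered
      // but hide them from assistive technology with aria-hidden.
      function showScreen(id) {
        var screens = document.querySelectorAll('.screen');
        for (var i = 0; i < screens.length; i++) {
          var active = screens[i].id === id;
          screens[i].classList.toggle('active', active);
          screens[i].setAttribute('aria-hidden', String(!active));
        }
      }
      showScreen('home');
    </script>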

Use common accessibility practices

Most accessibility practices you’ve learned for the desktop also apply on mobile. Use alt text for images, or, if that doesn’t work out for some reason, aria-label; use form labeling techniques as usual; and make sure your HTML has a logical order, because on mobile, too, screen readers infer the reading order from the order of the elements in the document object model.
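A few hedged examples of what this looks like in markup; the file names and labels are made up:

    <!-- Meaningful image: describe it with alt text. -->
    <img src="contact-avatar.png" alt="Photo of Jane Doe">

    <!-- Icon-only button where alt text is not an option: use aria-label,
         and hide the purely decorative icon from screen readers. -->
    <button type="button" aria-label="Delete contact">
      <span class="icon icon-delete" aria-hidden="true"></span>
    </button>

    <!-- Form controls get real labels, just like on the desktop. -->
    <label for="phone">Phone number</label>
    <input type="tel" id="phone" name="phone">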

As if on cue, the BBC accessibility team have published their mobile accessibility guidelines today. And although tailored primarily towards the BBC, all of the techniques and examples found there are valid for any mobile web app and its accessibility. I strongly recommend you check them out! If I were your teacher, I’d say: Mandatory read! 🙂

Accessibility in Firefox OS: An update

After my introductory video, quite a few things have happened in the realm of Firefox OS accessibility, and although we’re not quite ready yet to release builds to beta testers, we’re getting closer to a state where this will be the case.

In the interim, I’d like to share a few things that have happened over the past few months that may get you a bit excited! We certainly are! 😉

A screen reader toggle

It recently became possible to toggle the screen reader and thus gain access to a Firefox OS device running a Feb 12 or later nightly build of Firefox OS 1.4/Master. The way this works is as follows:

  1. Press the Power button three times, with a pause of close to one second between presses. So unlike iOS, which some of my readers may be familiar with, here it’s slowness that matters. The reason is that currently, the Power button doesn’t register properly when pressed in rapid succession.
  2. You’ll then hear an announcement instructing you to press the Power button three more times in the same manner to turn the screen reader on.
  3. Do so, and you’re ready to go.

This even works in the Welcome wizard/First Time User experience screen. In essence, I can now set up a Firefox OS phone as a blind user.

Turning the screen reader off works in a similar fashion: Press the Power button three times in slow succession, confirm that you want to turn it off, and off it goes.

Speech property controls

Speech controls were also recently added to an Accessibility panel in the Settings app. You can control the screen reader’s on/off state, the speech rate and the volume.

The Accessibility panel is enabled when the screen reader is turned on, or it can be enabled from Developer settings, which in turn can be enabled from Device Info/More information. This, of course, is only interesting for sighted folks who want to try things out; a blind user will most likely use the toggle. The important thing to note is that in 1.4, the screen reader setting has been moved out of the developer settings and into the Accessibility panel. So the incidents we had over the past couple of weeks, where people turned on the screen reader accidentally and then didn’t know how to work their devices, should happen less often. 😉

Automatic focusing on new elements

When switching pages, or bringing up an application or a website, the first element is now automatically focused after load. This is something Firefox for Android 30 and later will also do when a new web page loads. With that, the speech response is much more intuitive.

Accessible Settings

The Settings app in Firefox OS is the first to become fully accessible. It was the app where we laid the groundwork, and it turned out pretty well. We learned a lot about how to style, or not style, stuff, how to properly use techniques so that only items that are actually relevant can be swiped to, and other details.

Problems, problems, problems

Oh yes, there are still a lot of problems, primarily in the user interface itself. The whole user interface of Firefox OS is written in HTML5, JavaScript and CSS, and when we started making it accessible, some of the code was pretty rough in terms of semantics. But Eitan and Yura sank their teeth into it, and despite several refactorings and such, we’re getting to a point where important work is being done that will make a lot of apps at once much more accessible. Along the way, we’re setting some rules for everyone on the Gaia team to follow, and implementing tests to make sure breakage occurs less often in the future. Some of these rules include things like “don’t use WAI-ARIA roles as parts of CSS selectors”, for example.

Everyone who has made a previously mostly inaccessible web app accessible knows how much tedious work this can be. And even more so with a dynamic project like Gaia, the Firefox OS user interface. But we’re getting a lot better, and we’re seeing other team members pick up knowledge of, and attention to, proper accessibility semantics. 🙂 Gaia’s code is also improving a lot overall, and that definitely helps, too!

In summary, we’re working hard, with me testing and other team members coding their butts off to give everyone the best possible user experience in Firefox OS in the near future! Stay tuned for more updates!

Accessibility in Firefox for Android: Some more technical background, Part II

A long while back, I wrote a post explaining some of the more technical details of the implementation of the accessibility in Firefox for Android. If you want to read the whole post, feel free to do so and then come back here, but for those of you who don’t, here is a short recap of the most important points:

  1. What made the accessibility possible at all in the first place was the fact that Firefox for Android started to have a native Android UI instead of a custom XUL one.
  2. The only thing that needed to be made accessible was the custom web view we’re using, all the rest of the browser UI gained accessibility from using native Android widgets.
  3. The switch to a native UI also gave us the possibility to talk directly to TalkBack and other assistive technology apps.
  4. At the core is the well-known accessibility API also used on the desktop, written in C++. On top of that sits a JavaScript layer, code-named AccessFu, which pulls information from that layer and generates TalkBack events to make everything speak. It also receives the keyboard commands from the Android side, and as we’ll see below, has been substantially extended to also include touch gestures.
  5. There is now also an extended layer of accessibility code on the native Android layer, which I’ll come to below.

Making Explore By Touch work

After that last post, we had to make Explore By Touch work, both for Ice Cream Sandwich and, since it was released shortly after, for the all-new Jelly Bean accessibility enhancements and beyond. For that, the JavaScript layer receives touch events from the Android side of things and queries the core engine for the element at the touch point. Gestures like tapping, swiping, double-tapping and two-finger scrolling are also received and processed accordingly.

For Jelly Bean and beyond, we had to do a special trick to make left and right swipes work. Because we’re implementing everything ourselves, we have to fake previous and next accessible elements, making TalkBack and others believe they have true native accessible elements to the left and right of the current element. TalkBack, in effect, only knows about three accessibles on the page. The currently spoken one is always the one in the middle of these three. When the user swipes right to go to the next element, the element to the right becomes the new middle one, gets spoken, and at the same time, the previous middle one becomes the one to the left, and the next regular page element is fetched and becomes the new right element. This particular logic sits in our Android accessibility code and queries the JavaScript portion for the relevant accessibles.
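To make the idea more concrete, here is a rough, hypothetical JavaScript sketch of that sliding window of three accessibles. None of these names come from the actual AccessFu or Android code; fetchNextAccessible and speak stand in for the real queries to the JavaScript portion and the generated TalkBack events:

    // Hypothetical sketch: TalkBack only ever sees three accessibles,
    // and the one in the middle is the one currently being spoken.
    var virtualWindow = {
      previous: null,
      current: null,
      next: null
    };

    function speak(accessible) {
      // Stand-in for generating the TalkBack events for the new middle element.
      console.log('speaking:', accessible && accessible.name);
    }

    function moveRight(fetchNextAccessible) {
      // The element to the right becomes the new middle (spoken) one...
      virtualWindow.previous = virtualWindow.current;
      virtualWindow.current = virtualWindow.next;
      // ...and the next regular page element is fetched to refill the right slot.
      virtualWindow.next = fetchNextAccessible(virtualWindow.current);
      speak(virtualWindow.current);
    }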

The fact that we have so much control over this stuff (in fact, we have to do it this way or it wouldn’t work) allowed us to port the swiping back to Ice Cream Sandwich, a.k.a. Android 4.0, which doesn’t natively support it, and even Gingerbread, a.k.a. Android 2.3, which doesn’t have Explore By Touch support at all. But in the Firefox web views, it works! Including swiping, double-tapping, two-finger scrolling etc. Unfortunately, there was no way to make the rest of the browser UI accessible by touch on Gingerbread, too, so you’ll still have to use the keyboard for that.

On Jelly Bean and above, we are restricted a bit by which gestures Android supports. In effect, we cannot change the swiping, double-tapping, exploring, or two-finger scrolling behavior. We also cannot change what happens when the user swipes up and down. In those instances, Android expects a certain behavior, and we have to provide it. This is why, despite popular request, we currently cannot change the three-finger swipes left and right used for quick navigation to something more comfortable to execute. Single-finger swipes left and right are strictly reserved for single-object navigation. We’d love it to be different, but we’re bound in this case.

BrailleBack support

Some of the above techniques were used, in a slightly different fashion, to also implement BrailleBack support. As with TalkBack and friends, we have to implement everything ourselves. You have to implement two protocols: com.googlecode.eyes-free.selfbraille.SelfBrailleClient and com.googlecode.eyes-free.selfbraille.WriteData. This isn’t documented anywhere. Our summer intern Max Li did a terrific job dissecting the BrailleBack code and grabbing the relevant pieces from the GoogleCode project, and the result can be seen in Firefox for Android 25 and later. Max also added separate braille utterances, so the output isn’t as verbose as for speech and follows better logic for braille readers. A few weeks ago, a review of using Android with braille was posted on Chris Hofstader’s blog by a deaf-blind user, highlighting how well he could work with Firefox for Android, in large part because of its excellent braille support. To reiterate: Max was a summer intern at Mozilla last year. He is sighted and, as far as I know, had never been in contact with braille before this. He implemented this all by himself, only occasionally asking me for feedback. Imagine that, and then getting this review! I’m proud of you, Max!

And Max didn’t stop there: Firefox 29 and above include improvements to the way unordered and numbered lists are presented in braille.

Much of all of this is good for Firefox OS, too

Because of the layered nature of our code, much of what has been implemented for Firefox for Android can be re-used in Firefox OS as well. The differences are mainly in what comes on top of the JavaScript/AccessFu layer: talking to a speech synthesizer directly, using the new WebSpeech API, instead of generating TalkBack events, and in the future we’ll also “only” need to implement support for BrlTTY or something similar, plus connectivity for braille displays. The abstraction allows us to then put the utterances to good use there as well. The main problem we’re having with Firefox OS right now is the actual UI written in HTML, JS, and CSS, code-named Gaia. Getting all the screens right so you cannot swipe to where you’re not supposed to at any given moment, making everything react to the proper activation events etc., and teaching the Gaia team a lot about accessibility along the way are the major tasks we’re working on right now. But thanks to the layering of the accessibility APIs and implementations, the transition from Firefox for Android was, though not trivial, not the biggest of our problems on Firefox OS. 🙂
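As an aside, here is a minimal sketch of what driving a synthesizer directly through the WebSpeech API looks like, as opposed to generating TalkBack events; the utterance text, rate and volume values are just examples:

    // Speak an utterance directly via the Web Speech API.
    var utterance = new SpeechSynthesisUtterance('Settings, selected');
    utterance.rate = 1.0;   // speech rate, e.g. as set in an accessibility panel
    utterance.volume = 1.0; // speech volume
    window.speechSynthesis.speak(utterance);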

In summary

The Android API documentation was not much help with all of this. As mentioned, the portion about SelfBrailleClient and friends wasn’t documented anywhere at all; at least I didn’t find anything but source-code references, among them that of Firefox for Android, in a Google search. The exact expectations of TalkBack also aren’t retrievable just by studying the docs. Eitan had to dive into TalkBack’s source code to actually understand what it expected us to deliver to make it all work.

Here’s to hoping that Google, despite its quest to close-source Android more and more, will keep BrailleBack and TalkBack open source, so that we can continue to read their source code in future updates and keep providing the good support our users have come to expect from us.