An update on office solutions in the browser and on mobile

Regular readers of my blog may remember my January 2014 shout-out to Microsoft for implementing great accessibility in their Office Online offering. Later that year, I also gave an overview of the accessibility in Google apps. Now, in late April 2015, it is time for an update, since both have made progress. We will also take a look at what has changed in Apple’s iCloud on the web suite, and I’ll introduce an open-source alternative that is ramping up to become more accessible.

Google apps

The Google apps suite for both free Gmail and paid Google Apps for Business accounts has grown enormously in functionality, and so have the accessibility features. This holds no matter which area you look at, be it Docs with its wide variety of document navigation and collaboration features, Sheets with its ever more comprehensive spreadsheet editing and navigation abilities, or Slides, which allows a blind person to create full-featured slide shows. Gmail itself and Drive are also running very smoothly nowadays, and creating a Google Form to conduct surveys works very well, too.

One of the most remarkable facts about this is the enhancements to documentation the Google apps have received. Docs now has dedicated help pages for navigation, formatting, and collaboration, plus a huge list of keyboard shortcuts for all supported platforms. Take the guide on collaborating on a document with a screen reader alone, and just try a few of the things described in there with a friend, co-worker, or family member. Each time I use these features, I am totally blown away by the experience.

Docs also introduced braille support and has by now expanded this to Firefox and screen readers as well. If you use it, you’ll get braille output (of course), but may lose some announcements that are available when braille support is not enabled. I have found that a combination of both modes works well for me: Non-braille mode for editing and collaboration, and braille mode for the final proof-reading.

The iOS apps have also made huge leaps forward. If you use an external keyboard with your iPhone or iPad, you have a similarly rich set of keystrokes available to the one you have on the web. Compared to where these apps were a year ago, … uh … forget it, there is no comparison. It’s like night and day!

On Android, the situation looks good as well, within, of course, the limitations that TalkBack still imposes on the users in general. Future versions may vastly improve this situation, let’s keep our fingers crossed! Until then, I suggest you look at the help documentation for Docs with Android.

Microsoft Office Online
Microsoft has also enhanced its accessibility features. Word Online, Excel Online, and PowerPoint Online work even better than when I wrote my first article. I found that the collaboration features don’t work as smoothly for me as they do in Google Docs, but thanks to the OneDrive and Dropbox integrations, many tasks can be accomplished in the Office for Windows suite with all its features if the browser version falls short. The start page for accessibility in Office Online gives good pointers to articles with further information.

I also found that the web mailer behaves more like a web page than a real application in many instances. But of course, it has tight integration with Microsoft Outlook and Windows Mail on Windows, so again, if the web version falls short for you and you use these services, you can switch to the desktop versions.

The iOS versions have also seen great improvements for VoiceOver. The new kid on the block, Outlook for iOS, is getting frequent updates which usually also contain VoiceOver fixes.

And some good news for all the Mac enthusiasts out there: the Microsoft Office 2016 for Mac preview received an update on April 14, 2015, which, according to this support article, also contains VoiceOver improvements for Word, Excel, and PowerPoint. Outlook is also said to be accessible, according to a different support article.

I can’t say much about the Android versions of the Microsoft Office applications, and the official documentation channels haven’t revealed anything. If you have any experience, please let me know in the comments! MS Word for Android tablets and its siblings are the especially interesting ones, I think, since they are more feature-rich than the Office for Android phone app.

Apple iCloud
As great as Apple is when it comes to accessibility in their hardware devices, including the latest new device category, Apple Watch, the situation with their iCloud web offering is equally dismal. It just doesn’t have the fit and finish that the other products have. Yes, many buttons are now labeled, and yes, in Pages on the web, lines are read when you navigate them, along with some other information. But the overall experience is not good. The keyboard focus gets lost every so often, and unpredictably; the interface is confusing and keeps controls around that might, in a certain situation, not even be actionable. Even after the upgrades it has received over the past year, this is nothing I can recommend any screen reader user use productively.

If you want to, or have to, use iWork for iCloud, use the Mac or iOS versions. They work quite well and get the job done.

Open-Xchange App Suite
And here’s the new kid on the block that I promised you! It’s called Open-Xchange App Suite, and it is actually not quite that new in the field of collaborative tools. But its accessibility efforts are fairly new, and they look promising. Open-Xchange is mostly found at web mail providers such as the Germany-based 1&1, but it is also used internationally. Furthermore, anyone who runs certain types of Linux distributions on a server can run their own instance, with mail and cloud storage services. It also supports standards like IMAP and SMTP, CalDAV for calendar sync, CardDAV for contact sync, and WebDAV for access to the files. It works with the MS Office file formats, so it is compatible with most, if not all, other office suites.
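Because everything here is exposed through open standards, any generic client can talk to an Open-Xchange installation without suite-specific code. As a rough illustration, here is a small Python sketch that assembles the typical service endpoints for such a standards-based groupware host; the host name and the CalDAV/CardDAV/WebDAV paths are illustrative assumptions, not documented Open-Xchange defaults, and real deployments publish their own paths.

```python
# Sketch: typical service endpoints of a standards-based groupware host.
# NOTE: host name and URL paths below are illustrative assumptions only.

def service_endpoints(host: str) -> dict:
    """Assemble the usual endpoint URLs for mail, calendar,
    contact, and file access on a given host."""
    return {
        "imap":    f"imaps://{host}:993",       # incoming mail over TLS
        "smtp":    f"smtp://{host}:587",        # outgoing mail (submission port)
        "caldav":  f"https://{host}/caldav/",   # calendar sync
        "carddav": f"https://{host}/carddav/",  # contact sync
        "webdav":  f"https://{host}/webdav/",   # file access
    }

if __name__ == "__main__":
    for name, url in service_endpoints("groupware.example.com").items():
        print(f"{name:8s}{url}")
```

The practical upshot is that stock client libraries, such as Python’s imaplib or smtplib, or any CalDAV/CardDAV-capable calendar and contacts app, can be pointed at endpoints like these directly.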

Its accessibility features on the web are on their way to becoming really good. They’ve still got some way to go, primarily in how keyboard focus handling works and in getting some tasks done really efficiently, but Mail, some parts of Calendar, Contacts, Drive, Text, and the dashboard already work quite well. It does not yet compare to what Google is offering, but it comes close to the quality of what Microsoft offers on the web, and it definitely surpasses that in some areas.

This is definitely something to keep an eye on. I certainly will be watching its progress closely.

In summary

As of this writing, Google definitely comes out strongest in terms of accessibility and fit and finish when it comes to working efficiently with their apps. Granted, it takes some getting used to, and it requires that a screen reader user knows their assistive technology, is willing to learn some keyboard shortcuts, and familiarizes themselves with certain usability concepts. But once that is mastered, Google Apps is definitely something I can whole-heartedly recommend for online collaboration. Furthermore, if you look at new features in commercial screen readers such as JAWS, you can see that they’re paying attention and improving their support for Google apps bit by bit where it is still lacking.

Microsoft is close behind, with some areas that are better accomplished in their desktop apps or on tablets rather than on the web.

Open-Xchange still has its bumps in the road, but it is well on its way to becoming a viable alternative for those who can rely on their own infrastructure and want to go with a full open-source stack.

And for Apple, I recommend staying away from the web offering and doing all the office work in the iWork apps for Mac or iOS. The web stuff is still just too much of a hassle.

Apple are losing their edge in accessibility quality, too

This post was originally published in January 2015 and was last updated on April 10, 2015, with the latest information on the mentioned problems in light of the OS X 10.10.3 and iOS 8.3 releases from April 8, 2015.

Over the past couple of days, a number of well-known members in the Apple community raised their voices in concern about Apple’s general decline in software quality. Marco Arment (former “Mr. Instapaper” and now “Mr. Overcast”) started out by saying that Apple has lost the functional high ground. John Gruber of Daring Fireball slightly disagrees, but says that Apple have created a perception that “Other people’s stuff doesn’t work, but Apple’s stuff doesn’t work, either”. And finally, Dr. Drang looks at the power of leverage in this context. And now, well, here is my take on the subject.

Some long-standing readers of this blog may recall this post I wrote in June of 2009 about my first experience using an iPhone. It was the first time I interacted with a touch screen device that was accessible to me as a blind user.

For several years to come, Apple would lead in terms of including accessibility features in both its mobile and desktop operating systems. Zoom had already been there when VoiceOver was introduced in iOS 3.0, and what followed were features for people with varying disabilities and special needs: AssistiveTouch, which allows gestures to be performed differently; mono audio and integration with hearing aids; subtitling, audio description, and other media accessibility enhancements; Guided Access for people with attention deficiencies; Siri; and most recently, braille input directly on the touch screen in various languages and grades. Especially on iOS, VoiceOver and the other accessibility features received updates every year with every major release, and new features were added.

In the beginning, especially in Snow Leopard and Lion, Apple did the same for OS X, gradually adding many of the features it had introduced on iOS to keep the two in sync. But ever since Mountain Lion, VoiceOver has not seen much improvement. In fact, the lack of newly introduced features could lead one to the perception that Apple thinks VoiceOver is done, and no new features need to be added.

And this is not the first time I have said this on this blog: the quality of existing features is steadily declining, too. In fact, with the release of both OS X 10.10 “Yosemite” and iOS 8, the quality of many accessibility features has reached a new all-time low. AppleVis has a great summary of current problems in iOS 8. But let me give you two examples.

The first problem was so obvious and easily reproducible that it is hard to imagine Apple’s quality assurance engineers didn’t catch it: on the iPhone in Safari, going back from one page to the previous one with the Back button. With VoiceOver running, I did not find a single page where this simple action did not trigger a freeze in Safari and VoiceOver. The bug appeared in early betas of iOS 8, and it was only fixed in the 8.3 release in April 2015, roughly 10 months after the first iOS 8 beta was seeded to developers on the day of WWDC 2014.

The second example concerned using Safari (again) with VoiceOver, but this time on the iPad. Using Safari itself, or any application that uses one of the two WebView components, I was reliably able to trigger a full restart of the iPad at least twice a day, most days even more often. That caused all apps to quit, sometimes without being able to save their data, it interrupted work, and it left the iPad in such a semi-unstable state that it was better to fully shut it down and restart it fresh. Update: this, too, was finally fixed in iOS 8.3, but only after I had sent quite a number of logs to Apple engineers. After I wrote the initial version of this post, a concerted effort finally nailed the problem down.

“Wait”, you might say, “this sounds like a problem from iOS 7 days, and wasn’t it fixed?” Yes, I would reply, it was, but it returned in full force in iOS 8. But mostly on the iPad. I think I’ve only seen one or two restarts on my iPhone since iOS 8 came out.

The first of these two examples is such low-hanging fruit that, if I were working at Apple, I would be deeply ashamed it was around for so long. The second one was harder, but not so hard that an engineer sitting down for a day and using the device with VoiceOver enabled wouldn’t have run into it.

But accessibility problems are not limited to VoiceOver alone. Web Axe posted a great summary of problems in other areas, such as the default settings for parallax effects, button shapes, and more. Go check it out for more information on those topics!

And now back to Yosemite. I will again concentrate on Safari plus VoiceOver, since this is where I spend a lot of my time. Support had regressed so badly in the initial 10.10 release and the first 10.10.1 update, especially on dynamic pages, that it was barely possible to use Facebook on Yosemite with VoiceOver. VoiceOver skipped over whole stories, lost focus, and did all sorts of other funky stuff. It took until the 10.10.2 update on January 27, 2015, three months after the initial release, to at least largely address these problems. Moreover, editing in any form field on the web was so slow, and speech so prone to doubling, that productive work was not really possible. And if you had a braille display connected, you could expect it to drop out every few seconds when moving the cursor; the sounds VoiceOver made were the equivalent of plugging and unplugging a USB braille display every three to four seconds. These problems were also corrected in the 10.10.2 release. In fact, this update is being written in the MarsEdit blog authoring software, and using that hadn’t been possible with VoiceOver until 10.10.2.

All of these problems have been reported to Apple, some by multiple users. They were tweeted about publicly, and now I am reiterating them to show my support for Marco, John, and others who rightly assert that Apple has a real quality problem on their hands, one that higher management seems to be quite thick-skinned about. Blinded by their own brilliant marketing or something? 😉

Apple does have a fantastic accessibility story. No other operating system I know has so many features built in for such a big variety of people (speaking mostly of iOS now). But they’re on the verge of badly betraying the trust many people with disabilities put in them by delivering updates of such poor quality that it becomes virtually impossible to take full advantage of these features. Especially when such basic functionality as I describe in Safari, and as AppleVis summarizes on their blog, gets in the way of use every minute of every day. And Apple really need to be careful: others may catch up sooner rather than later. On the web, the most robust accessibility is already being delivered by a different desktop browser/screen reader combination on a different operating system. As for mobile: Android is the weaker competitor, even in its latest update, in my opinion. But Microsoft’s foundation in Windows Phone 8.1 is really solid. They just need to execute on it much better, and they could really kick ass and become a viable alternative to Apple on mobile.

Fortunately, with the release of iOS 8.3, Apple finally came to their senses and included a large number of accessibility fixes and polish, which AppleVis summarizes in this blog post. Here’s to hoping the rumors come true that Apple will focus a lot more on stability and polish in the iOS 9 update, expected to be in beta around their developer conference later this year. I think the fact that iOS now has a public beta program, too, is a good sign, since it gives more people the opportunity to test and report problems early.

So here is my appeal to Tim Cook, CEO of Apple: put action behind these words again! Go to those extraordinary lengths you speak of, not by just cranking out half-baked new features, but by making sure your engineers work on the overall quality in a way that does not make your most faithful users feel like they’re being let down by your company! Because that is exactly how it feels. This trend started in earnest in iOS 7 and worsened in iOS 8. And it has been with OS X even longer, starting in Mountain Lion and worsening ever since. Please show us, our friends and family who started using your products because of our recommendations, and those of us who took a leap of faith early on and put our trust, our productivity, and our daily lives in your products, that these are not just empty words, and that that award you received actually means something beyond the minutes you spent giving that speech!


a caring, but deeply concerned, user

Switching back to Windows

This post is no longer current. For an update on the reasons why, read this post. The post below is kept for historic reasons, since it reflects my thoughts and reasoning from that time.

Yes, you read correctly! After five years on a Mac as my private machine, I am switching back to a Windows machine in a week or so, depending on when Lenovo’s shipment arrives.

You are probably asking yourself why I am switching back. In this post, I’ll try to give some answers to that question, explain my very personal views on the matters that prompted this switch, and give you a bit of insight into how I work and why OS X and VoiceOver no longer really fit the bill for me.

A bit of history

When I started playing with a Mac in 2008, I immediately realised the potential of the approach Apple was taking. Bundling a screen reader with the operating system had been done before, on the GNOME desktop, for example, but Apple’s advantage is that they control the hardware and software back to front and always know what’s inside their boxes. So a blind user is always guaranteed to get a talking Mac when they buy one.

On Windows and Linux, the problem is that the hardware is unknown to the operating system. On pre-installed systems, this is usually taken care of, but on custom-built machines with standard OEM versions of Windows, or with your Linux distro downloaded from the web, things are different. There may be this shiny new sound card that just came out, which your dealer put in the box, but which neither operating system knows about, because there are no drivers. And gone is the dream of a talking installation! So even though Windows 8 now allows Narrator to be turned on very early in the installation process, in multiple languages even, and Orca can be activated early in a GNOME installation, all of this is of no use if the sound card cannot be detected and the speech synthesizer cannot output its data through connected speakers.

And VoiceOver already had quite a few features when I tried it in OS X 10.5 Leopard: it had web support, e-mail was working, braille displays, too, and the Calendar was one of the most accessible I had ever seen on any desktop computer, including Outlook’s calendar with the various screen readers on Windows, one of which I had even worked on myself in earlier years. Some third-party apps were working, too. In fact, my very first Twitter client ran on the Mac, and it was a mainstream app.

There was a bit of a learning curve, though. VoiceOver’s model of interacting with things is at times quite different from what one might be used to on Windows. Especially interacting with container items such as tables, text areas, web pages, and other high-level elements can be confusing at first. If you are not familiar with VoiceOver, interacting means zooming into an element: a table suddenly gets rows and columns, a table row gets multiple cells, and each cell reveals details of the contained text as you interact with each of these items consecutively.

In 2009, Apple advanced things even further with the release of Snow Leopard (OS X 10.6). VoiceOver now had support for the trackpads of modern MacBooks, and when the Magic Trackpad came out later, it also just worked. The Item Chooser, VoiceOver’s equivalent of a list of links or headings, included more item types to list by, and there was now support for so-called web spots, both user-defined and automatic. A feature VoiceOver calls Commanders allowed the assignment of commands to various types of keystrokes, gestures, and more. And remember: Snow Leopard cost 29 US dollars, and aside from a ton of new VoiceOver features, it obviously brought all the great new features that Snow Leopard had in store for everyone. A common saying was that other screen readers would have needed three versions for this many features and would have charged several hundred dollars in update fees. And it was a correct assessment!

In 2011, OS X 10.7 Lion came out, bringing a ton of features for international users. Voices well known from iOS were made available on the desktop in over 40 languages, international braille tables were added, and it was no longer necessary to purchase international voices separately from vendors such as AssistiveWare. This meant that users in more countries could just try out VoiceOver on any Mac in an Apple retail store or at a reseller. There were more features, too, such as support for WAI-ARIA landmarks on the web, activities, which are application- or situation-specific sets of VoiceOver settings, and better support for the Calendar, which got a redesign in this update.

First signs of trouble

But this was also the time when the first signs of problems came up. Some things just felt unfinished. For example: the international braille support included grade 2 for several languages, including my mother tongue, German. In German grade 2, nothing is capitalized by default. German capitalizes many more words than English, for example, and it was agreed a long time ago that only special abbreviations and expressions should be capitalized; only in learning material should general orthographic capitalization rules be used. In other screen readers, capitalization can be turned on or off for German and other languages’ grade 2 (or even grade 1). Not so in VoiceOver, on both OS X and iOS: one is forced to use capitalization. This makes reading quite awkward. And yes, “makes”, present tense, because this remains an issue in both products to this date. I even filed a bug in Apple’s bug tracker for this, but it was shelved at some point without me being notified.

Some other problems with braille also started to surface. For some inexplicable reason, I often have to press a routing button twice before the cursor appears at the spot I want when editing documents. And while braille verbosity settings let you define which pieces of information are shown for a given control type, you cannot edit what gets displayed as the control type text. A “closed disclosure triangle” always gets shown as such, same as an opened one. On a 14-cell display, this takes two full-length displays; on a 40-cell one, it wastes most of the real estate and barely leaves room for anything else.

Other things also gave a feeling of unfinished business. The WAI-ARIA landmark announcements, which work so well on iOS, were very cumbersome to listen to on OS X. The Vocalizer voices used for international versions had a chipmunk effect that was never corrected and, while funny at first, turned out to be very annoying in day-to-day use.

OK, thought the enthusiastic Mac fanboy that I was: let’s report these issues and wait for the updates to trickle in. But none of the 10.7 updates really fixed the issues I was having.

Then a year later, Mountain Lion, a.k.a. OS X 10.8, came out, bringing a few more features, but far fewer than the versions before. Granted, there was only a year between these two releases, whereas the two cycles before had been two years each, but the features that did come weren’t too exciting. There was a bit of polish here and there: drag and drop, sortable table columns, press-and-hold buttons, and a few little things more. Safari learned a lot of new HTML5 and more WAI-ARIA and was less busy, but that was about it. Oh yes, and one could now access all items in the upper right corner of the screen. But again, not many of the previously reported problems were solved, except for the chipmunk effect.

There were also signs of real problems. I have a Handy Tech Evolution as a desktop braille display, and it developed serious problems from one Mountain Lion update to the next, making it unusable with the software. It took two or three updates, distributed over four or five months, before that was solved; until then, the display was basically a useless waste of space.

And so it went on

And 10.9, a.k.a. Mavericks, again brought only a bit of polish, but also introduced some serious new bugs. My Handy Tech Braille Star 40, a laptop braille display, no longer works at all: it simply isn’t recognized when plugged into the USB port. Handy Tech are aware of the problem, so I read, but since Apple is in control of the Mac braille display drivers, who knows when a fix will come, if at all, in a 10.9 update. And again, old bugs have not been fixed, and new ones have been introduced, too.

Mail, for example, is quite cumbersome in conversation view now. While 10.7 and 10.8 were at least consistent in displaying multiple items in a table-like structure, 10.9 simply puts the whole mail in as an embedded character you have to interact with to get at the details. It also never keeps its place, always jumping to the left-most item, the newest message in the thread.

The Calendar has taken quite a turn for the worse, being much more cumbersome to use than in previous versions. The Calendar UI seems to be subject to constant change anyway, according to comments from sighted people, and although it is technically accessible, it is no longer really usable, because there are so many layers, and sometimes unpredictable focus jumps and interaction oddities.

However, having said that, an accessible calendar is one thing I am truly going to miss when I switch back to Windows. I know various screen readers take stabs at making the Outlook calendar accessible, and it gets broken very frequently, too. At least the one on OS X is accessible. I will primarily be doing calendaring from my iOS devices in the future. There, I have full control over things in a hassle-free manner.

iBooks, a new addition in Mavericks, is a total accessibility disaster, with almost all buttons unlabeled and an interface that is slow as anything. Even the update issued shortly after the initial Mavericks release didn’t solve any of those problems, and neither did the 10.9.1 update that came out a few days before Christmas 2013.

From what I hear, Activities seem to be pretty broken in this release, too. I don’t use them myself, but I heard that a friend’s activities all stopped working: triggers didn’t fire, and even setting them up fresh didn’t help.

Here comes the meat

And here is the first of my reasons for switching back to Windows: all of the above simply added up to the point where I lost confidence that Apple is still as dedicated to VoiceOver on the Mac as they were a few years ago. Old bugs aren’t being fixed, new ones are introduced, and despite reports from beta testers, of whom I was one, they were often not addressed (like the Mail and Calendar problems, or iBooks). Oh yes, Pages finally became more accessible recently, after four years, and Keynote can now run presentations with VoiceOver, but these points still don’t negate the fact that VoiceOver itself is no longer receiving the attention it needs as an integrated part of the operating system.

The next point is one that has already been debated quite passionately on various forums and blogs: VoiceOver is much less efficient when browsing the web than screen readers on Windows are. Going from element to element is not really snappy, jumping to headings or form fields often has a delay, depending on the size and complexity of a page, and the way Apple chose to design their modes requires too much thinking on the user’s part. There is single-letter quick navigation, but you have to turn on quick navigation with the cursor keys first, and enable single-letter quick navigation separately in the VoiceOver Utility. When cursor-key quick navigation is on, you navigate sequentially only via the left and right arrow keys, not top to bottom as web content, which is still document-based for the most part, would suggest. The last-used quick navigation key also influences the Item Chooser menu. So if I last moved to a form field via quick navigation, but then want to choose a link from the Item Chooser, the Item Chooser opens on the form fields first, and I have to press left arrow to get to the links. Same with headings. For me, that is a real slow-down.

Also, VoiceOver is not good at keeping its place within a web page. As with all elements, once interaction stops and then starts again, VoiceOver begins interaction at the very first element. Conversations in Adium or Skype, and even Apple’s own Messages app, all suffer from this: one cannot jump into and out of the HTML area without losing one’s place. Virtual cursors in various screen readers on Windows are very good at remembering the spot they were at when focus left the area. And even Apple’s VoiceOver keystroke for jumping to related elements, which is supposed to jump between the input and HTML areas in such conversation windows, is subject to constant breakage, re-fixing, and other unpredictability. It does not even work right in Apple’s own Messages app in most cases.

Overall, there are lots of other little things that add up to make me feel much less productive when browsing the web on a Mac than on Windows.

Next is VoiceOver’s paradigm of having to interact with many elements. One place where this also comes into play is text. If I want to read something in detail, be it on the web, a file name, or basically anything, I have to interact with the element, or elements, to get down to the text level, read word by word or character by character, and then stop interacting as many times as I started in order to get back to where I was. Oh yes, there are commands to read and spell by character, word, and sentence, but because VoiceOver uses Control+Option as its modifiers, and the letters for those actions are all located on the left-hand side of the keyboard, I have to take my right hand off its usual position to press those keys while the left hand holds Control and Option. MacBooks, as well as the Apple Wireless Keyboard, don’t have Control and Option keys on both sides, and my hand cannot be bent in a fashion that lets me grab all these keys with one hand. Turning the VoiceOver key-lock mechanism on and off for this would make things even more cumbersome.

And this paradigm of interaction also applies to exploring screen content with the trackpad. You have to start and stop interacting with items constantly to get a feel for the whole screen, and even then, I often feel I never get a complete picture, unlike on iOS, where I always have a full view of a screen. Granted, a desktop screen displays far more information than could possibly fit on a trackpad without becoming useless millimeter-sized touch targets, but the hassle of interaction still led to me not using the trackpad at all, except for a few very specific use cases. We’re talking about a handful of instances per year.

The next problem I see quite often is the way braille interaction behaves. In some cases, the output is just a dump of what speech is telling me. In other cases, it is a semi-spatial representation of the screen content. In yet another instance, it may be a label with some text chopped off to the right, or to the left, with the cursor not always in predictable positions. I already mentioned the useless grade 2 in German, and the fact that I often have to press a routing button at least twice before the cursor gets where I want it to go. The braille implementation in VoiceOver gives a very inconsistent impression and feels unfinished, or done by someone who is not a braille reader and doesn’t really know a braille reader’s needs.

Next problem: word processing. Yes, Pages can do tables in documents now, and other features have also become more accessible, but again, because of the paradigms VoiceOver uses, getting actual work done is far more cumbersome than on Windows. One has, for example, to remember to decouple the VoiceOver cursor from the system focus, and leave the latter inside the document area, when one wants to execute something on a toolbar. Otherwise, focus shifts too, and any selection one has made is removed, rendering the whole endeavor pointless. And one has to remember to turn the coupling back on afterwards, or habit will produce unpredictable results because the system focus didn’t move where one expected it to. Then there is VoiceOver’s horizontally centered navigation paradigm again: pages of a document in either Pages or Nisus Writer Pro appear side by side, even though visually they probably appear below one another, and each page is its own container element. All of this leaves me with the impression that I don’t have as much control over my word processing as I have in MS Word, or even the current snapshot builds of OpenOffice or LibreOffice on Windows. There, I also get much more information without having to look for it explicitly, for example the number of the current page. NVDA, and probably others too, supports multilingual documents in Word: I immediately hear which spell-checking language is being used in a particular paragraph or even sentence.

There are more issues that have not been addressed to this day. There is no PDF reader I know of on OS X that can deal with tagged (accessible) PDFs. Even when tags are present, Preview doesn’t do anything with them, giving only the more or less accurate text extraction one gets from untagged PDFs. As a result, there is no heading navigation, no table semantics, and none of the other features that accessible PDFs support.

The fact that there is no accessible Flash plug-in for web browsers on OS X has also caused me to switch to a Windows VM quite often, just to view videos embedded in blogs or articles. Yes, HTML5 video is slowly making its way onto more sites, but realistically, Flash will probably still be around for a couple of years. This is not Apple’s fault; the blame lies solely with Adobe for not providing an accessible Flash plug-in. But it is one more thing that keeps me from being as productive on a Mac as I want to be on a desktop computer.

In summary: by all of the above, I do not mean to say that Apple did a bad job with VoiceOver to begin with. On the contrary: especially with iOS, they have done incredibly good accessibility work in the past few years. And the fact that you can nowadays buy a Mac and install and configure it fully in your language is highly commendable, too! I will definitely miss the ability to set up my system alone, without sighted assistance, should I ever need to reinstall Windows; as I said above, that is still not fully possible without help. It is the accumulation of things I found over the years that made me realize that some of the design decisions Apple has made for OS X, the bugs that were never addressed or things that broke and were not fixed, and the fact that apps are either accessible or they aren’t, with hardly any in-between, are not compatible with my way of working on a desktop computer or laptop in the longer term. For iOS, I have a feeling Apple is still going full steam ahead with accessibility, introducing great new features with each release, and hopefully also fixing the braille problems Jonathan Mosen discussed in his great blog post. For OS X, I am no longer as convinced their heart is in it. I have a feeling OS X itself may soon become a second-class citizen behind iOS, but that, again, is only my personal opinion.

So there you have it. This is why I am going to use a Lenovo notebook with NVDA as my primary screen reader for my private use from now on. I will still be using a Mac for work, of course, but for my personal use, the Mac is being replaced. I want to be fast, effective, and productive, and be sure my assistive technology doesn’t suddenly stop working with my braille display or fall victim to business decisions that place less emphasis on it. Screen readers on Windows are made by independent companies or organizations with their own business models. And there is choice: if one no longer fits particular needs, another will most likely do the trick. I do not have that choice on OS X.