Review: Dell Latitude 10 inch tablet with Windows 8

At the 2013 CSUN conference, I was invited by Accessibility Partners to participate in an interview about tablet computer accessibility. Having used iPads personally for years, and having also ventured into Android tablets such as the Google Nexus 7 in my work on Firefox for Android, I was very interested to see what this would be about.

During the interview, the team showed me a Dell Latitude 10, a 10-inch touch tablet running Windows 8. We talked about various aspects of general tablet accessibility, but also went into specifics of what the platform should offer.

It was one of those situations where, at the time of the interview, I said that built-in solutions would be sufficient if they offered all the accessibility features, and that the ability to install additional assistive technologies would not be of the highest importance. I said this under the impression that, for the price quoted to me, this had to be a device equipped similarly to the Microsoft Surface RT.

However, as I found out later, the Dell Latitude comes with an Intel processor and doesn’t run Windows RT, but rather a full version of Windows 8. Windows 8 Pro could even be added for an additional price when ordering.

A full Windows 8 gives one several possibilities over the RT variant:

  • the ability to run Windows 7 desktop applications.
  • the ability to install additional or alternative screen readers or other assistive technology.

Both NVDA and Window-Eyes have versions out supporting the touch screen on Windows 8 devices. Freedom Scientific announced at CSUN that JAWS 15, coming out in the fall, will also support touch on Windows 8.

What makes the Dell tablet even more interesting is that it is priced at the same level as the Surface RT while offering the advantages mentioned above. Unlike the Surface Pro, it also weighs only about as much as a Surface RT and, as far as I know, has no moving fan inside.

After these facts had settled with me for a while, and after I also had a chance to look at the Microsoft Surface during the same conference, I decided, contrary to my initial statement during the interview, to purchase one of the Dell tablet models. The keyboards in particular, and the stand that folds out of the back of the Surface units, felt quite fragile to me and didn’t leave a good impression.

I went for the 64 GB Touch Essentials model, not needing any of the additional enterprise-centric features, but wanting a little more breathing room storage-wise.

Disclaimer: This review reflects purely my personal view of both Dell’s and Microsoft’s product offerings. I’m not being paid to write this. It serves merely as information for me and interested readers.

Unboxing and first start

The tablet arrived on Monday. The accessories I had also ordered, a docking station and a wireless keyboard and mouse, had already arrived the week before.

The Latitude’s natural orientation is landscape, with a physical Start button as the anchor at the bottom center. On the left side, there’s a Kensington lock slot and the volume up and down buttons. At the top edge, towards the right-hand side, there are two more buttons: the one farthest to the right turns orientation locking on and off, and the one next to it locks and unlocks the device. Next to these buttons is an SD card slot; you have to press down on it for the tray to come out of the casing. On the right side, there’s a standard 3.5 mm headphone jack and a standard USB port. On the back, there’s an 8-megapixel camera in the top center and loudspeakers in the bottom right and left corners. The tablet also has a front-facing camera, integrated into the glass surface in the top center. Below the Start button, on the bottom edge, is the connector for AC power or the docking station.

Photo: front view of the tablet with the glass surface and the Start button
Photo: back side with the camera at the top and speakers in the bottom corners
Photo: side view of the tablet sitting in its docking station, slightly tilted backwards

The device feels solid to the touch, with no squishy parts, and the glass surface feels high quality, too. The tablet’s thickness and weight are about those of the Apple iPad 3.

After turning it on and waiting for a minute or two, the Windows Setup wizard came up, asking for the region and keyboard layout. At this point, I could press Start button + Volume Up to start Narrator. It came up instantly, offering me immediate access to the setup wizard. It also gave instructions on what to do if one wasn’t familiar with the gestures. The basic gestures are the same as those known from iOS or Android 4.1+: touching anywhere gives you the object under your finger, swiping left and right moves by objects, and double-tapping activates the current object. This also applies to the on-screen keyboard. As on iOS, split tap is also available: hold one finger on the screen and tap with a second to activate the current object. For keyboard input, one can also change the behavior so that a key is entered when the finger is lifted, but unlike on iOS, this cannot be changed with a gesture; one has to go into the Navigation section of the Narrator settings dialog to change it.

Setup went OK except for the fact that creating a Microsoft account requires solving a CAPTCHA. Since the audio version didn’t work for me at all, I had to get sighted assistance to bypass it.

And after that, Windows did what it sometimes does: it gave me a dialog saying that the installation couldn’t be completed. After I clicked the OK button, the machine rebooted and presented the same Setup wizard again. No choices were remembered, except for my Microsoft account, which I did not have to recreate, and thus did not have to solve another CAPTCHA for. On this second attempt, installation went through, and I was brought to the Start screen.

A post-PC device

One might think that this is just a notebook with a touch screen, given that it has an Intel processor and runs a full version of Windows 8. Using the wireless keyboard and mouse, one could arguably get that impression. But for all intents and purposes, this is a tablet: what Steve Jobs once called a post-PC device. Yes, the Windows 7 desktop is available, and yes, you can interact with the computer using a mouse and keyboard. However, the Start screen, AKA Start menu, and all modern UI applications are very touch-screen-centric. If a Windows 8 modern UI app comes up, the best way to interact with it, even as a blind user, is by touch screen gestures. The keyboard often cannot reach all parts of the app, and using the mouse does not make sense for someone who cannot see the pointer. Using the touch screen, however, gives one direct access, an immediate feel, so to speak, for the application.

Here’s one general problem with Windows 8, though, and this affects sighted users probably more than blind ones: You never know what kind of UI will hit you next. To quote an example from this ZDNet article by Matt Baxter-Reynolds, you may be in a Windows 7 style e-mail client, double-click a photo and find yourself in a photo viewer that is Windows 8 modern, AKA Metro, style. There is no Close button, no task bar. To close it, on a touch screen, you swipe your finger down from the top, literally swiping the window off the screen. Using the mouse, you have to do the same. As a blind user, Alt+F4 will close Metro apps just like any other, and the Windows key will open the Start screen, so the learning curve for us is probably a bit lower than for sighted folks! 🙂

But this general usability problem can hit us, too: Metro apps may or may not be fully keyboard accessible. So to be on the safe side, if you’re seriously going to run Windows 8, I strongly recommend purchasing either a tablet or a convertible laptop, meaning one with a touch screen. Using parts of Windows 8 without one is not going to be much fun. Trust me, I tried it in a virtual machine on a MacBook for a couple of months.

The state of screen reader accessibility

For Windows 7 apps, there’s really not much difference from what you’re probably used to. For Metro apps, the experience can be quite different. Even among screen readers, using built-in or downloaded Metro apps will give you different experiences. The Store works quite reasonably with Narrator, but currently not so well with NVDA. Some apps may have labeled buttons, but their rich list entries may not be coded properly, so their contents don’t make sense. Other apps, such as the Amazon one, have touch-sensitive areas that cannot be activated by a screen reader; it is currently not possible to log into one’s Amazon account, for example.

The accessibility programming interface used for Metro apps is called UI Automation, an API that Microsoft has been shipping since the Windows Vista days and that has evolved over time. However, as I hear from various sources, implementing it in apps, or supporting it in assistive technologies, is not trivial, and even inside Microsoft, not all apps do it consistently or correctly. And this shows in the user experience. For example, one can swipe to elements in IE that may be off-screen; double-tapping them will then activate something totally different from what was intended. The reason is that IE doesn’t scroll the screen in accordance with what Narrator is reading. On Android, if something is not on the screen, it cannot be accessed and thus cannot be clicked; it has to be scrolled into view. On iOS, the operating system usually makes sure the screen follows what VoiceOver tries to access. This uncertainty in some apps really leaves one with a shaky feeling sometimes.
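The off-screen activation problem can be illustrated with a toy model. This is a hypothetical Python sketch, not actual UIA, Narrator, or IE code; all names (`ListView`, `activate`, `follow_cursor`) are made up for illustration. The idea: if the app doesn’t scroll to follow the screen reader cursor, a synthesized tap lands inside the visible viewport and hits whatever happens to be there, not the element the user reached by swiping.

```python
# Toy model of the off-screen activation bug (hypothetical names,
# not real UIA code). A "viewport" shows 3 rows of a longer list.

VIEWPORT_HEIGHT = 3  # rows visible at once

class ListView:
    def __init__(self, items):
        self.items = items      # element names, one per row
        self.scroll = 0         # index of the first visible row

    def element_at_row(self, row):
        """Hit-test: which element currently occupies a viewport row?"""
        index = self.scroll + row
        return self.items[index] if index < len(self.items) else None

    def scroll_into_view(self, index):
        """What iOS/Android enforce: the screen follows the cursor."""
        if index < self.scroll:
            self.scroll = index
        elif index >= self.scroll + VIEWPORT_HEIGHT:
            self.scroll = index - VIEWPORT_HEIGHT + 1

def activate(view, cursor_index, follow_cursor):
    """Simulate double-tapping the element the cursor has reached."""
    if follow_cursor:
        view.scroll_into_view(cursor_index)
    row = cursor_index - view.scroll
    # A tap can only physically land inside the visible viewport.
    row = max(0, min(row, VIEWPORT_HEIGHT - 1))
    return view.element_at_row(row)

view = ListView(["Home", "News", "Mail", "Music", "Settings"])
# The screen reader cursor swipes to element 4 ("Settings"), which is
# off-screen. Without following the cursor, the tap hits "Mail".
print(activate(view, 4, follow_cursor=False))

view = ListView(["Home", "News", "Mail", "Music", "Settings"])
# With the screen scrolled to follow the cursor, the right element is hit.
print(activate(view, 4, follow_cursor=True))
```

The `follow_cursor=True` branch models what iOS and Android do; `follow_cursor=False` models the IE-with-Narrator behavior described above.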

I also occasionally felt reminded of the old Windows screen reader days, when the whole application would suddenly go blank because something apparently went wrong on the API side. Remember your “refresh the screen” key combinations? They probably won’t do much good here, but the experience is quite similar on occasion.

I also found that, with both NVDA and Narrator, the two screen reading solutions I have tried so far, the operating system sometimes interprets dragging a finger around the touch screen as one of its standard “swipe in” gestures. More than once, I accidentally brought the Windows 7 desktop window to the front by swiping in from the left when all I intended was a swipe to the next object, or I closed the current window by dragging my finger downwards for too long. The fact that Windows doesn’t give assistive technologies full control over the gestures, but reserves some for itself regardless, can make for some unpleasant surprises.
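Why these collisions happen can be sketched in a few lines. This is a hypothetical simplification, not Windows’ actual gesture recognizer; the margin value and function names are made up. The key point is that the OS claims any drag that starts within a reserved edge zone before the assistive technology ever sees it, so a flick that merely begins too close to an edge gets reinterpreted.

```python
# Hypothetical sketch of edge-gesture reservation (not Windows' real
# recognizer). The OS claims drags starting inside an edge margin;
# only the rest reach the screen reader.

EDGE_MARGIN = 20                      # pixels reserved at each screen edge
SCREEN_WIDTH, SCREEN_HEIGHT = 1366, 768

def classify_drag(start_x, start_y, end_x, end_y):
    """Decide who handles a drag: the OS or the screen reader."""
    if start_x < EDGE_MARGIN:
        return "OS: swipe in from left (bring another app to front)"
    if start_y < EDGE_MARGIN and end_y > SCREEN_HEIGHT * 0.8:
        return "OS: swipe down from top (close the app)"
    return "screen reader: explore / flick to next object"

# A flick meant as "next object" that happens to start too close to
# the left edge is stolen by the OS and switches apps instead.
print(classify_drag(10, 400, 200, 400))
print(classify_drag(100, 400, 300, 400))
```

In this model the screen reader never gets a chance to veto the edge gestures, which matches the experience described above.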

One problem I found with NVDA was that it sometimes applied its hierarchical object navigation thinking too strictly, requiring me to use the swipe up and down gestures to go to parent or child objects. For a medium designed for random access, this should be more transparent: I should be touching what’s under my finger, not a container that encompasses a much bigger rectangle. While this approach may seem sane on OS X, for example, where the screen contents have to be broken down for the much smaller trackpad, interacting with a touch screen gives one a real-time, one-to-one view of things, and one should not have to think about different object hierarchy levels.
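The container-versus-leaf distinction can be made concrete with a small hit-testing sketch. This is a hypothetical object tree, not NVDA’s actual code; the class and function names are invented for illustration. Touch exploration, as argued above, should return the deepest accessible object whose rectangle contains the touch point, rather than stopping at an enclosing container:

```python
# Hypothetical accessibility-tree hit test (not NVDA's real code):
# descend to the deepest object containing the touch point.

class AccessibleObject:
    def __init__(self, name, rect, children=()):
        self.name = name
        self.rect = rect            # (left, top, right, bottom)
        self.children = list(children)

    def contains(self, x, y):
        left, top, right, bottom = self.rect
        return left <= x < right and top <= y < bottom

def deepest_hit(obj, x, y):
    """Return the deepest descendant whose rectangle contains (x, y)."""
    if not obj.contains(x, y):
        return None
    for child in obj.children:
        hit = deepest_hit(child, x, y)
        if hit is not None:
            return hit
    return obj  # no child contains the point: this object is the leaf hit

# A window containing a toolbar with two buttons.
tree = AccessibleObject("window", (0, 0, 800, 600), [
    AccessibleObject("toolbar", (0, 0, 800, 50), [
        AccessibleObject("back button", (0, 0, 50, 50)),
        AccessibleObject("forward button", (50, 0, 100, 50)),
    ]),
])

# Touching inside the forward button announces the button itself,
# not the toolbar or window that also contain that point.
print(deepest_hit(tree, 60, 25).name)
# Touching the empty client area falls back to the window.
print(deepest_hit(tree, 400, 300).name)
```

Requiring extra swipe-up/swipe-down gestures to move between these levels, instead of reporting the leaf directly, is exactly the friction described above.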

In conclusion

This is not meant to be an extensive app-by-app test. I simply haven’t had the tablet long enough yet to open and try every app that’s on there, or to download many apps from the Windows Store. But it has certainly given me a first impression, and that’s what I wanted to share with you.

There is definitely room for improvement on many fronts, both on Microsoft’s side and on that of the assistive technologies. I believe the UIA API must solidify and become less ambiguous, good, concise documentation must be available so developers can make their applications easily accessible, and accidental gesture activation must be guarded against better, so blind users don’t unintentionally open or rearrange things.

Like Android with its gradually closing accessibility gaps, the accessibility of Microsoft’s Windows 8 migration onto post-PC devices definitely deserves commendation and is developing into a real alternative to the predominant iOS devices. In terms of accuracy while dragging one’s finger around the screen, and in responsiveness, I’d say Windows 8 is even slightly ahead of Android.

So should you, as a blind user, go with Windows 8? I’d say it depends. If your assistive technology supports it and you’re not afraid of using a touch screen, then by all means, it’s worth it. If you do not intend to use a device with a touch screen, or have reservations, then better stick with Windows 7 for as long as possible. As soon as Windows 8 throws you into a Metro app, things may not be much fun if keyboard accessibility isn’t implemented.
