Six Colors


By Shelly Brisbin

Vision Pro Accessibility in the Real(ish) World

[Image: An open Vision Pro settings menu for pointer control, with options for pointer size and animations visible, overlaying a blurred snowy landscape with trees and mountains.]

When it comes to accessibility, Apple is reliable. A person with disabilities who wants to use the company’s tech can count on assistive features with familiar names and functions being there across an array of platforms.

Though Macs got the VoiceOver screen reader first, the modern era of Apple access really began in 2009, when basic accessibility features came to the iPhone. Through the introductions of the iPad, the Apple Watch, and Apple TV models running software based on iOS, the accessibility suite has advanced, always building on that baseline, with just a few hiccups along the way. And once a feature debuts on one platform, it generally finds its way to all of them, with tweaks included to account for differences in the way you use a watch, a tablet, or a computer.

So it isn’t surprising that Vision Pro accessibility builds on what’s gone before. What is surprising is the mix of real innovation here, along with some decidedly version 1.0 explorations of what’s possible.

Accessible Impressions: The Short Version

I don’t own a Vision Pro — my experience so far has only come at an Apple Store demo, so I have an incomplete picture. At my demo, I focused on whether I, with my particular flavor of low vision, could use this thing, and what my experience could say about how Apple has approached Vision Pro accessibility generally. This is my first chance to look into the future. I’ll need more time with the device before I can assess the usability of a challenging technology for everyone who wants that chance.

Going in, I was pretty sure that eye tracking would not be the way I interacted with the headset. I’m extremely near-sighted and sensitive to light. So I was prepared and warned Kevin, the retail employee setting up my demo, that I would most likely need to use VoiceOver or one of several pointer control methods that don’t require a well-aimed eye gaze. I didn’t opt to use VoiceOver beyond an initial run-through because I wanted to use my vision as much as I could, and there was more to learn from working around eye gaze than from using a screen reader. Since the Vision Pro is an Apple product with its roots in iOS, the VoiceOver experience promised to be routine.

The accessibility experience began with device setup. Scanning my face to fit the light seal went smoothly, with Kevin guiding me as I turned my face to show the iPhone all angles. Without that help, I would have required VoiceOver, as I do when I set up Face ID on my own devices.

Next, I showed my high-powered reading glasses to Kevin. I don’t wear them when I use a computer or my phone, but they’re helpful when reading printed documents. They have thick prescription lenses, and I had guessed that there would be no Zeiss insert available at the store that matched them. But after filling out the Zeiss prescription form for Vision Pro, I learned it is possible to get an insert for my prescription. I was shocked by that, since those lenses are so heavy. Based on what I learned later in the demo, I feel relatively certain I could have benefited from an insert matching my lenses – at least for using the Vision Pro as a computer, if not for consuming entertainment.

Finally, because I wouldn’t be using eye tracking, I didn’t calibrate with the series of dots most users do, but simply matched my palms to onscreen prints to support the hand-based pointer options. The process was simple and quick, which was helpful since Kevin later had to reboot my device and redo the setup process when some of my gestures weren’t working.

Inside the Goggles

With accessibility at the forefront of what I wanted to accomplish, Kevin, whose iPad showed him what I was seeing and allowed him to guide the experience if need be, directed me to accessibility settings, where we enabled VoiceOver just long enough to show me how I could use it to move through the many other options. You can also use the screen reader to set up a Vision Pro independently, with a triple-click of the digital crown during startup. This option mirrors what’s available on an iOS device or Mac. You can use Siri to turn VoiceOver and most other accessibility features on and off quickly once you’ve got the device up and running.

Kevin showed me Zoom, which applies magnification to a frame onscreen by default. I found out later that you can zoom the full screen and that a turn of the digital crown will increase the zoom level in either mode. Sadly, the default-level zooming I did during my demo, without knowing how to increase magnification, didn’t make the text large enough for me to read, though I could see it and could select highlighted items with some guidance. There are also text size adjustments, which I wasn’t able to try at the store.

Next, Kevin showed me how to use pointer control to pick an alternative to eye tracking. Pointer control is the first instance in which the Vision Pro’s accessibility innovation really comes through. At different times during the demo, I used my wrist, index finger, or head to move a pointer that I could see onscreen. Like an iPad pointer controlled by a trackpad or mouse, the Vision Pro pointer is round and, to my eyes, visible but not very big. (It is apparently possible to adjust the size and color of the pointer.)

On a Mac, you can make the pointer larger and locate it with a gesture if you lose it onscreen. Those would be great features for a future Vision Pro update.

Each pointer control method was effective and a little mind-blowing at first. Pointing with my head was surprisingly effective, and during the half-hour of the demo at least, I never experienced motion sickness while I wore the headset.

A number of reviewers, including Jason, have written that a keyboard and trackpad are important accessories for Vision Pro productivity. I agree, and would especially urge those with disabilities that allow them to use input devices to get them. Though I didn’t have the chance to type on the virtual keyboard, I’ve seen it, and I’m certain that hardware is a far more accessible option. A trackpad and appropriately modified onscreen pointer would certainly make working on the device possible for me.

Sizzle and Substance

With a few accessibility settings enabled, I was ready to begin an abbreviated version of the canned Apple demo. I had difficulty making scrolling gestures work reliably, which meant less time moving through the selection of photos and Safari pages I was shown. I think a large part of the problem had to do with learning the rhythm of using my fingers or wrist to locate and select items and then the standard Vision Pro pinch and scroll gestures to act on them. Using VoiceOver instead of pointer control or having more time to practice all of these new gestures would almost certainly make me a more competent navigator, but in the demo environment, I was less than successful.

Knowing that the immersive video section was designed to create a mind-blowing finale to the demo, I let Kevin know that I was ready to stop struggling with Web pages and photo scrolling. We went back to the main app window, where I was told to open Apple TV+. Floating on top of a beautiful lake environment, the app window didn’t have the contrast I needed, and I couldn’t see app icons well enough to identify them. My guide patiently directed me to the app icon I wanted. Same for opening the demo video inside the Apple TV app. There was too much going on here, on a screen that was too far from my eyeballs, for me to select the item I needed to view. Kevin finally got me to the right place on screen, and I sat back for a much more relaxing few minutes in the company of Alicia Keys, birthday party kids, and a rhino.

The immersive video was great! I was able to see the 3D effects, experience the impossible viewing angles on a sports field, and feel as if I were truly inside the experience. Immersive video, and the Vision Pro as a consumption device, I get!

Looking into Vision Pro was, I decided later, like looking through a window or watching TV from across the room. To view a phone or a Mac, my eyes need to be a mere inch from the screen. So, interacting with settings and then the app screen was a hugely different experience than writing this article on my Mac. A high-power lens insert or full-screen zoom might make it possible to read, write, and navigate Vision Pro, but I’m not really sure about that. During my demo, I felt as if I could never get close enough to the screen. It’s one of the biggest variables in low vision: where do you need to be, relative to the content you want to read or see, to get the best view? And whatever you do to find that perfect balance, how does the way you interact with a computer screen differ from the way you view a TV or movie screen? I don’t really want to wear reading glasses when I’m watching Avatar.

A Bigger Picture

As I wrote here back in June, a device that relies on eye tracking and fairly simple hand gestures as default modes of input offers great accessibility benefits to some people who can’t use keyboards, mice, and trackpads in the usual way. Evaluating how well the Vision Pro works as a computer alternative for someone with physical and motor needs is beyond the scope of my 30-minute, vision-focused demo, as well as my own life experience.

And it’s worth pointing out that features like Voice Control and AssistiveTouch, which are designed to make the iOS interface more usable by folks with physical and motor challenges, are also all there on Vision Pro. In terms of sheer number of features, it’s an impressive collection of tools, made more so by the fact that each has been a part of Apple’s other platforms for multiple releases, giving users with a variety of needs time to learn their strengths and weaknesses.

Aspirational Accessibility

I know a number of blind or visually impaired folks who are excited about the Vision Pro. They’re Apple enthusiasts and embracers of new tech like most of us hanging out at Six Colors. But when I’ve asked blind colleagues what they want from Vision Pro, the conversation almost always moves to the future—not the opportunity to do computing tasks on a head-mounted device or even to watch a movie in the headset. Many blind people want Vision Pro to be an eyesight alternative or assistant, a way to see the world, identifying both the wondrous and the mundane.

Today, AI-powered iOS apps can describe a person, find a doorway, or read just the important bits of a food label aloud. A combination of a device camera, AI, and human interaction can assist a blind person in navigating to an appointment on foot or in a vehicle. Eventually, my blind friends assume, Vision Pro will be able to do all of these things, essentially allowing the wearer to “look” at the world around them and extract knowledge from it in meaningful ways. These users expect that, in time, Vision Pro’s cameras will be able to describe what they see, that developers will be able to harness camera data to interpret images beyond what’s literally visible, and that the headset, or a smaller, lighter reincarnation of it, will be suitable for walking around outdoors.

For the blind tech enthusiasts I know, that’s the ultimate promise of Vision Pro. For now, it’s an expensive way to do what many are already doing with devices they carry. But one that offers both a solid start on the road to full accessibility and an enticing treasure map to the future.

Shelly discusses this issue more on the latest episode of the Parallel podcast.

[Shelly Brisbin is a radio producer, host of the Parallel podcast, and author of the book iOS Access for All. She's the host of Lions, Towers & Shields, a podcast about classic movies, on The Incomparable network.]


