By Shelly Brisbin

Apple’s Accessibility feature preview gets #GAAD going

For the second year running, Apple has offered a preview of updated accessibility features coming to its platforms later this year. The announcements come just ahead of Thursday’s Global Accessibility Awareness Day, which goes by #GAAD online.

The preview is notable for spotlighting features that most people won’t use, but that matter a lot to those with disabilities. It’s also notable because it’s a departure from the company’s typical close-to-the-vest approach to what’s coming on the roadmap.

Here’s a look at what Apple announced, and why it matters:

Door Detection. LIDAR-equipped iPhones and iPads have had a feature called People Detection since iOS 14.2. Using the Magnifier app and the VoiceOver screen reader, a blind user can learn whether the device camera sees a person, and where that person is in relation to the device. That was handy for social distancing. Door Detection will use the same mechanism to alert you when the device identifies the presence of a door. That’s a more practical use of LIDAR for many blind and low-vision users than even People Detection, both indoors and out. Door Detection can tell you about the door’s attributes, including whether it’s open or closed and if there’s a doorknob. Apple says you’ll also be able to read signage, like a room number or a notice on the door. I presume that’s just an application of Live Text, but it’s a great companion for Door Detection.

The use of LIDAR in accessibility has always felt like a bit of a preview of what we might see in future Apple glasses or a headset, and it’s encouraging for users of accessibility features that the company is potentially taking their needs into account as it develops future wearables. My hope is that LIDAR sensors, currently available only in the highest-end iPhones and iPads, will come to more of the iOS product line. For a blind user who doesn’t buy a phone based on high-end camera features, doing so just to get access to LIDAR-based accessibility features is a tough sell.

Live Captions. Apple joins Google, Microsoft and Zoom, among others, in offering live captions, but Apple’s are system-wide on iOS and macOS, so you can use them in any app with audio output. That’s the superpower here. Just pick up your device and enable captions, whatever you’re doing. Deaf and hard-of-hearing people often bemoan the state of auto-generated captions, so some testing will be warranted.

Watch accessibility improvements. Last year’s May accessibility preview, which I covered on my podcast, Parallel, brought AssistiveTouch to the Apple Watch. It’s a longstanding iOS feature that provides a simplified way to perform gestures for those with motor disabilities. This year, there are more watch gestures, called Quick Actions, and a new feature called Apple Watch Mirroring.

If you have a motor disability, Quick Actions gives you the choice to make hand gestures instead of manipulating the watch screen. An all-purpose “double-pinch” gesture will answer a call, control media playback, or take a photo. Mirroring is like AirPlay for the Apple Watch, sending the watch screen to your phone. That’s also useful for people with motor disabilities who can more easily use the phone screen than the smaller, less conveniently located watch face.

I’m intrigued by the possibilities for low-vision users, too, because the phone screen is sometimes far easier to use at close range and in zoomed mode than the watch. And you can use AssistiveTouch or Switch Control, if that’s how you interact with your phone.

Buddy Controller. This feature turns two game controllers into one, so two people can play a game together: a second player can assist someone with a physical disability who has difficulty manipulating some or all of a controller’s features.

Siri Pause Time adjustment. If your speech is slow or impaired, a little extra time to complete your thought before Siri tries to act on it could make the assistant a more useful tool.

Customization for Sound Recognition. Introduced in iOS 14, Sound Recognition allows your device to listen for sounds in your environment, like water running, a baby crying or a siren, and then notify you with a visual alert. It’s a useful tool for getting the attention of someone who’s deaf or hard of hearing. But you’re currently limited to a preset list of 15 sounds. It’s a good list, but what if a sound you need to know about isn’t on it? Apple says that later this year, you’ll be able to record and save sounds you’d like to use with Sound Recognition. (Perhaps you have a unique doorbell or an appliance with a special trill?) Customization probably should have been part of Sound Recognition to begin with, but it’s common for Apple to roll out a totally new accessibility feature, then build its capabilities over time.

Apple detailed a few other new features on Tuesday, including 20 more languages for the VoiceOver screen reader and Voice Control Spelling Mode, which you can use to dictate custom word spellings.

Big Deal or No Big Deal?

This is a nice grab bag of features, with Door Detection and the Apple Watch updates offering the most intriguing possibilities. It’s also possible there are more where these came from, as occasionally happens when the late betas start to become available.

[Shelly Brisbin is a radio producer, host of the Parallel podcast, and author of the book iOS Access for All. She's the host of Lions, Towers & Shields, a podcast about classic movies, on The Incomparable network.]
