By Shelly Brisbin
October 25, 2023 8:52 AM PT
The iPhone 15 Pro brings tangible accessibility benefits
What makes an iPhone accessible? Mostly, it’s the software: the operating system and apps that follow guidelines Apple has set for that purpose. If new hardware plays a role, it’s often in the opposite direction. For some with hearing or vision disabilities, the loss of the Home button has made hanging onto an iPhone SE feel worth the struggle.
But this year, there’s another hardware story (and I’m mighty surprised to be writing this): What’s inside Apple’s Pro iPhones is giving an important boost to usability for people with disabilities.
The Action Button
Turning the tactile, easy-to-use Ring/Silent switch into a multifunction button you can use to launch a shortcut, open the camera, or fire up Voice Memos is a fair way of giving back where something’s been taken away. There’s also a whole screen full of accessibility features you can choose to assign to the Action Button: everything from turning on the VoiceOver screen reader or Live Captions to adding a color filter or starting AssistiveTouch.
Quick access to most of these options isn’t new: you can use Back Tap or Accessibility Shortcut (a triple-click of the Side button) to summon a lot of them. But particularly if you have a motor disability, the choice among invoking your preferred feature by tapping, triple-clicking, or pressing and holding the Action Button is one more level of flexibility, not to mention the chance to program quick access to several tools at the same time.
Since you can launch a shortcut with the Action Button (also available via Back Tap), there’s no end to the ways you can customize your own accessibility by doing things more quickly. I have a blind friend who’s using the Action Button to quickly toggle the speed of podcast playback between two favorite settings. Using a shortcut means she need not open Overcast using VoiceOver and then swipe to the speed slider every time she wants to make a change. Sometimes, accessibility means saving steps.
The accessibility options available for the Action Button also come with brief descriptions that could introduce or explain these features to people who have never dived several levels deep into Accessibility settings.
Ultra Wideband and Precision Finding
Precision Finding is great for anyone who’s looking for their phone or for a friend in a crowd. But how great is it when one or more of the parties doing the looking is blind?
Find My was never precise enough to lead anyone to the exact restaurant table or funnel cake booth. But Precision Finding, powered by the second-generation Ultra Wideband chip, gives guidance with sound and VoiceOver speech.
Intelligent Portrait Mode and Other Photo Magic
If taking photos isn’t a big part of the way you use your phone, an iPhone 15 Pro or Pro Max could seem like overkill. For accessibility-related uses of the camera, like magnification, scanning text, and using AI or human assistants to describe or analyze a scene, an older camera system works just as well as this year’s best. But as cameras get better, so do the machine-learning tools these phones offer to improve your photos.
A visually impaired photographer can use any of the iPhone 15 cameras with more confidence, knowing that after-the-fact portrait mode and other enhancements to an image he or she has already taken are available. That’s a kind of accessibility of opportunity masquerading as a mainstream camera feature.
Pro phones have included a LiDAR sensor since the iPhone 12 Pro. It’s a camera system thing. But like the Ultra Wideband chip, LiDAR can help detect what’s in your environment.
In each generation of the Pro phones since the 12, LiDAR has given Pro users access to new detection features. First, you could have what the camera saw described to you: a red car on a dirt road, a wooden table with keys and a pair of glasses. Then came People Detection, just in time for mid-pandemic social distancing: LiDAR and the Magnifier app let you identify the presence of a person and how far away that person is. Door Detection was next. Aim your phone at a building or down a hallway and find out not only where doors are located but whether they’re open or closed, wood or glass, whether they have signage, and what that signage says.
This year, we got the unfortunately named Point and Speak. It’s not a child’s toy but a tool for reading text labels. Hold your phone up to a microwave, washing machine, or other gadget, and point to where you think a label is located. The phone helps you aim and reads text it finds over or under (you choose) your pointing finger. Great for any appliance or device with buttons to press.
None of the detection features is perfect. Point and Speak, in particular, could use some seasoning. But these features give some insight into how Apple teams working on accessibility have been able to weave interesting features with relatively small potential user bases into the way iPhone hardware evolves over time.
Is it also proof of concept for greater things to come? You bet. But for at least a few people for whom accessibility is a main requirement, it’s also reason enough to splurge on an iPhone 15 Pro or Pro Max right now.
[Shelly Brisbin is a radio producer, host of the Parallel podcast, and author of the book iOS Access for All. She's the host of Lions, Towers & Shields, a podcast about classic movies, on The Incomparable network.]