Apple previews new accessibility features, including AI-generated voice clone
By Shelly Brisbin
May 16, 2023 5:01 AM PT
For the third year running, Apple has offered a preview of accessibility features coming to its platforms later this year. The announcements were timed to mark Global Accessibility Awareness Day on May 18. The preview featured several completely new offerings aimed at people with cognitive and speech disabilities, plus updates to existing macOS and iOS features.
These preview announcements don’t come with beta software or release dates, but it’s understood that the new features will appear in forthcoming releases of macOS and iOS. In past years, accessibility updates for watchOS and tvOS have been previewed, but this time the focus was on the Mac, iPhone and iPad. It should be pointed out that these early announcements are often not the only accessibility updates in a given release cycle.
For those with cognitive disabilities, navigating the complex iOS interface can be a challenge. Assistive Access is a simplified, customized UI for the Home Screen and some essential apps, including Phone, Messages, Photos, Camera, Music, and TV. Under Assistive Access, the Home Screen is limited to extra-large app icons for supported apps. The app interfaces are simplified, too, with larger text and bolder icons. A user or a caregiver can further set an Assistive Access app to display just the desired information, such as a select group of contacts.
Calls is an Assistive Access app that combines Phone and FaceTime. Messages can work with text, inline video, or an emoji-only keyboard that gives users who are not readers, or who can better communicate with symbols, an alternative to standard typing.
Photos and Music each display their contents in a grid that’s “flatter” in structure than the hierarchical interfaces the standard versions of those apps offer.
Assistive Access is the closest Apple has come to an interface designed specifically for people with disabilities or older users—an option Android has offered via its support for alternative launchers. It will be interesting to see whether it’s full-featured enough not only to support users with cognitive disabilities, but also to offer a “grandparent-friendly” experience for those trying to choose between an iPhone and an Android phone.
Apple organizes its accessibility features and settings by functional categories: Vision, Hearing, Physical and Motor. Now there’s Speech, too. New features under the Speech heading support those who are partially or fully nonverbal. Personal Voice is an intriguing feature that might seem familiar to anyone who has experienced AI-based text-to-speech that’s been trained on an actual human voice.
Those diagnosed with ALS are at great risk for losing their ability to speak, but often have advance warning. Using Personal Voice, an individual will be able to use an Apple Silicon-equipped Mac, iPhone or iPad to create a voice that resembles their own. If the ability to speak is lost, text the user generates on the device can then be converted to voice, for use in a variety of ways. It will work with augmented communication apps that are often used to make it easier for people with limited speech to be understood. And no, you can’t create a new Siri voice this way. All Personal Voice training is done on-device.
Live Speech can use an existing Siri voice to give people with speech disabilities a quick way to express themselves aloud. Type and save a statement, like a food order or a greeting, then tap the text to have it spoken. It works during Phone and FaceTime calls, as well as in person, and users can save frequently used phrases for quick access.
The latest detection feature added to the Magnifier app–joining People Detection and Door Detection–is called Point and Speak. It’s designed to identify and read incidental text, like button labels you’d find on a vending machine or a kitchen appliance display. Like the other detectors, Point and Speak is aimed at blind and low-vision users, and requires a LiDAR-equipped device.
Based on the preview announcement, using Point and Speak will feel similar to using Live Text combined with VoiceOver. What’s new here is that you can drag a finger around a display with multiple text labels and have each read aloud as you encounter it. That makes it a lot easier to correctly choose Coke, rather than accidentally pushing the Sprite button.
Hearing aids on Mac
Last year’s accessibility preview featured a handful of enhancements for hearing aid owners who use an iPhone. This year, Apple says support for Made for iPhone Hearing Aids is coming to the Mac. That’s been a long time coming. You’ll need an M1 or better Mac to make the connection, though.
Clever and targeted
This year’s preview also includes a grab bag of nice updates to existing accessibility features, including updated text size adjustments in macOS and tweaks to Siri voices for VoiceOver users who want to listen at extremely high speaking rates. Voice Control will add phonetic suggestions when editing text.
Several of the preview features clearly benefit from machine learning, and Personal Voice might be touted as an AI-based tool if it came from another company. It feels like an application of the technology that’s both entirely positive for the community it’s meant to serve, and reflective of a pretty nuanced understanding of what that community might want.
[Shelly Brisbin is a radio producer, host of the Parallel podcast, and author of the book iOS Access for All. She's the host of Lions, Towers & Shields, a podcast about classic movies, on The Incomparable network.]