Six Colors

by Jason Snell & Dan Moren


By Shelly Brisbin

Apple previews new accessibility features, including AI-generated voice clone

Live Speech

For the third year running, Apple has offered a preview of accessibility features coming to its platforms later this year. The announcements were timed to mark Global Accessibility Awareness Day on May 18. The preview featured several completely new offerings aimed at people with cognitive and speech disabilities, plus updates to existing macOS and iOS features.

These preview announcements don’t come with beta software or release dates, but it’s understood that the new features will appear in forthcoming releases of macOS and iOS. In past years, accessibility updates for watchOS and tvOS have been previewed, but this time the focus was on the Mac, iPhone and iPad. It should be pointed out that these early announcements are often not the only accessibility updates in a given release cycle.

Assistive Access

Assistive Access for Messages

For those with cognitive disabilities, navigating the complex iOS interface can be a challenge. Assistive Access is a simplified, customized UI for the Home screen and some essential apps, including Phone, Messages, Photos, Camera, Music, and TV. Under Assistive Access, the Home Screen is limited to extra-large app icons for supported apps. The app interfaces are simplified, too, with larger text and bolder icons. A user or a caregiver can further set an Assistive Access app to display just the desired information, such as a select group of contacts.

Calls is an Assistive Access app that combines Phone and FaceTime. Messages can work with text, inline video, or an emoji-only keyboard that gives users who are not readers, or who can better communicate with symbols, an alternative to standard typing.

Assistive Access in Photos

Photos and Music each display their contents in a grid that’s “flatter” in structure than the hierarchical interfaces the standard versions of those apps offer.

Assistive Access is the closest Apple has come to an interface designed specifically for people with disabilities or elders—an option that Android has offered via its support for alternative launchers. It will be interesting to see if it’s full-featured enough to not only support users with cognitive disabilities, but also offer a “grandparent-friendly” experience for those trying to choose between an iPhone and an Android phone.

Speech accessibility

Personal Voice

Apple organizes its accessibility features and settings by functional categories: Vision, Hearing, Physical and Motor. Now there’s Speech, too. New features under the Speech heading support those who are partially or fully nonverbal. Personal Voice is an intriguing feature that might seem familiar to anyone who has experienced AI-based text-to-speech that’s been trained on an actual human voice.

Those diagnosed with ALS are at great risk for losing their ability to speak, but often have advance warning. Using Personal Voice, an individual will be able to use an Apple Silicon-equipped Mac, iPhone or iPad to create a voice that resembles their own. If the ability to speak is lost, text the user generates on the device can then be converted to voice, for use in a variety of ways. It will work with augmented communication apps that are often used to make it easier for people with limited speech to be understood. And no, you can’t create a new Siri voice this way. All Personal Voice training is done on-device.

Live Speech can use an existing Siri voice to give people with speech disabilities a quick way to express common phrases or sentences aloud. Type and save a statement, like a food order or a greeting, then tap the text to have it spoken. It works inside Phone and FaceTime calls, as well as for in-person conversations.

More detection

The latest detection feature added to the Magnifier app, joining People Detection and Door Detection, is called Point and Speak. It’s designed to identify and read incidental text, like button labels you’d find on a vending machine or a kitchen appliance display. Like the other detectors, Point and Speak is aimed at blind and low-vision users, and requires a LIDAR-equipped device.

Based on the preview announcement, using Point and Speak will feel similar to using Live Text combined with VoiceOver. What’s new here is that you can drag a finger around a display with multiple text labels and have each read aloud as you encounter it. That makes it a lot easier to correctly choose Coke, rather than accidentally pushing the Sprite button.

Hearing aids on Mac

Last year’s accessibility preview featured a handful of enhancements for hearing aid owners who use an iPhone. This year, Apple says support for Made for iPhone hearing aids is coming to the Mac. That’s been a long time coming. You’ll need an M1 or later Mac to make the connection, though.

Clever and targeted

This year’s preview also includes a grab bag of nice updates to existing accessibility features, including updated text size adjustments in macOS and tweaks to Siri voices for VoiceOver users who want to listen at extremely high speaking rates. Voice Control will add phonetic suggestions when editing text.

Several of the preview features clearly benefit from machine learning, and Personal Voice might be touted as an AI-based tool if it came from another company. It feels like an application of the technology that’s both entirely positive for the community it’s meant to serve, and reflective of a pretty nuanced understanding of what that community might want.

[Shelly Brisbin is a radio producer, host of the Parallel podcast, and author of the book iOS Access for All. She's the host of Lions, Towers & Shields, a podcast about classic movies, on The Incomparable network.]


By Shelly Brisbin

Tracking the many social-media migrations

Twitter (left), Mastodon (center), and Bluesky (right).

Last week, I joined Bluesky, the new hotness in social platforms. It’s the latest refuge for those who have beef with the way Elon Musk runs Twitter. Last November, I queued up to rejoin Mastodon, the existing-but-revitalized platform that was the first beneficiary of agita over Musk’s Twitter takeover.

Despite boarding the outbound train relatively early, I still maintain Twitter accounts for myself and the things I make. All that is to say, I’m experiencing these three platforms all at once and finding them very different from one another in more than the obvious ways.

Being on Mastodon feels different from being on Bluesky, which is not like today’s Twitter. This despite the fact that a lot of people besides me appear to maintain accounts and even continue posting on all three. In my feed, the multiplatformers tend to be journalists of the tech and general-interest varieties, along with an array of other content creators.…

This is a post limited to Six Colors members.


By Shelly Brisbin

Video: Using VoiceOver with the Weather app

Hey there, I’m Shelly Brisbin and I’m here to demo a new feature in iOS 16.4. Specifically, it’s in the Weather app, and even more specifically than that, it is for people who use the VoiceOver screen reader that’s part of iOS.

VoiceOver is a tool used by blind and visually impaired folks to hear the contents of the screen rather than viewing them. So using a combination of speech and touch, you can find out what’s under your fingers. That’s pretty easy if the content is text or even something that can be described pretty easily like a button, something that you can label. But if you have something like an overlay from inside the Weather app, it’s a little harder to use VoiceOver to describe it because it’s visual and it’s color coded. So what I’m going to show you is what I call “sonic overlay.” It’s an overlay that uses pitch to indicate the level of rain that an area is having right now.

[Shelly Brisbin is a radio producer, host of the Parallel podcast, and author of the book iOS Access for All. She's the host of Lions, Towers & Shields, a podcast about classic movies, on The Incomparable network.]


By Shelly Brisbin

Checking in on the accessibility of the Dynamic Island

The Dynamic Island selected by VoiceOver
With the Dynamic Island expanded as a song plays, you can select the scrubber with VoiceOver, and flick up or down to move through the track.

Every hardware innovation from Apple brings some version of this question to my mind: “Yeah, but how does it work with VoiceOver?” Or any other relevant accessibility tool, for that matter. Before I got my hands on them, I asked this question about the Apple Watch, Apple TV, and even the MacBook Pro’s Touch Bar. And as part of my ongoing adventure in documenting everything the iOS platform offers for accessibility, I needed to pay a visit to the iPhone Pro’s Dynamic Island.

The island is an interface that also hides Face ID and the front-facing camera. You’re meant to interact with the pill-shaped space visually, glancing at it to unlock the phone, or to gather tiny bits of information about what’s going on in a supported app.

phone info in Dynamic Island
During a phone call, the Dynamic Island shows its duration. The heavy border indicates the island is selected by VoiceOver.

Now, being visual doesn’t make an iOS feature inaccessible. Far from it. But it raises questions for the accessibility nerd about how information will be delivered to a non-visual user, and what gestures are needed to get and control it. Most times, it’s extremely straightforward — notifications can be read by the VoiceOver screen reader, and they can be interpreted by other accessibility tools, too. You can have VoiceOver speak notifications as they arrive if you want. (I do not want, by the way.)

But my curiosity about the Dynamic Island centered on the seemingly incidental nature of the data offered – the status of a phone call, duration of a timer, or what song Music is playing. It’s not necessarily a notification, meant to capture your attention. And even if Dynamic Island could give up its secrets to VoiceOver, would the user get anything from it they couldn’t find elsewhere? Can a VoiceOver user save taps and swipes with Dynamic Island the way a non-VoiceOver user can? Is Dynamic Island a selling point for the Pro phones if you’re blind?

Is it, or isn’t it?

First of all, the Dynamic Island is accessible. If an app puts content there, VoiceOver will be able to read/speak it. The screen reader does not announce that there’s currently information on the Dynamic Island, the way it does for notifications. But you can flick through the status bar or explore by touch – essentially, drag your finger in the general vicinity of the island until you hear its contents spoken. When you encounter the Dynamic Island with VoiceOver on, it’s selected, just like any other screen element. That means you can double-tap (the VoiceOver equivalent of a single tap) to open the current host app, or use a double-tap-and-hold or a rotor action to expand the island’s display. Either is the equivalent of a standard long press. The rotor also includes Activate and Dismiss actions, to open the app or empty the island’s contents.

Expanding the Dynamic Island gives VoiceOver access to whatever control the current app provides. Adjust volume in Music, mute your mic during a phone call, switch to the Remote app while AirPlaying to an Apple TV. Just as in any app, the controls are accessible to VoiceOver with a double-tap, and you can collapse the display again by double-tapping outside it.

Flexible little pill

Dynamic Island showing a timer and music.
A timer is active as a song plays. The timer is selected with VoiceOver.

I set a lot of timers: sometimes it’s my sandwich in the toaster oven. Or maybe I’m reading a script aloud and need to know how long it took. An active timer’s countdown appears in the island. Just pass your finger over the pill to have VoiceOver read the display.

If something else, like Music, is already displaying data there, a timer button appears in a space of its own, to the right of the main Dynamic Island. The timer button updates visually to show how much time remains. To access the countdown with VoiceOver, you’ll need to expand the timer. It would be great to have VoiceOver read time remaining when I flick to the unexpanded timer button.

Audio selected in the Dynamic Island
As an audio file plays, album artwork, if available, and a waveform are visible. This one is selected by VoiceOver, which reads the track’s name and artist aloud.

There’s one way in which the Dynamic Island experience with VoiceOver is superior: If you’re listening to audio, whether from Music, Spotify, Overcast, Audible, or another supported app, swiping into the island causes VoiceOver to read the track and artist. Visually, there’s only a tiny album art thumbnail at one end and a pulsing waveform indicator at the other to tell you something’s playing.

Island on the go

Maps has a few Dynamic Island tricks. Just as Maps displays a navigation card onscreen when you’re using another app mid-route, the Dynamic Island delivers tiny status updates, including arrows indicating turn direction and distance. VoiceOver speaks these, just as it does when cards appear onscreen. If you expand the island’s contents, you can even end a route immediately. That’s a real time-saver over returning to Maps, pulling up the card, and choosing the End Route command.

More and more third-party apps offer Dynamic Island support. For a busy traveler using VoiceOver, FlightAware’s updates save both time and lots of flicks and taps. If I were a baseball fan, I’d want my scores flashed atop my phone screen, once again saving me the trouble of digging into the app that provides them.

Though Dynamic Island might not be enough of a justification on its own for some people to splurge on an iPhone Pro, it’s not only well-implemented for VoiceOver, but occasionally provides info more quickly than a host app can.

[Shelly Brisbin is a radio producer, host of the Parallel podcast, and author of the book iOS Access for All. She's the host of Lions, Towers & Shields, a podcast about classic movies, on The Incomparable network.]


By Shelly Brisbin

The real secrets of iOS and accessibility

To change text size for notifications, open Control Center’s Text Size option on the Home screen or Lock screen, and choose Home Screen Only.

There’s a joke I tell a lot: if you encounter an article whose headline includes the words “secret features” and “iOS,” chances are you’re about to be taken on a whirlwind tour of your phone’s accessibility settings. “Did you know you could….?” Or, “Buried deep in iOS settings, you’ll find…”

Truth is, these aren’t secret features at all; they’re just unfamiliar to people whose eyes, ears and hands operate in a typical way. And these “secrets” are rarely written about, even in comprehensive coverage of iOS. “Invisible” might be a more honest way to describe these tools.

I can make a better case for the secret feature moniker when it comes to little-known ways you can use the accessibility suite to do typical iOS tasks, whether you have a disability or not. iOS accessibility has layers, is what I’m sayin’. So let us peel some back.

Continue reading “The real secrets of iOS and accessibility”…


By Shelly Brisbin

SearchLink removes the drudgery from making web links

My classic movie podcast, Lions, Towers & Shields over at The Incomparable, uses a standard show-notes format. On each episode, we talk about one old movie, and I always link to places where you can stream it, buy physical media, and learn more about the film we’re watching. Even after I came up with a standard Markdown template I liked, I found myself doing a whole lot of tinkering, mostly Web searches, for each episode’s notes. So I decided to automate the process. I now have a macOS workflow that saves me lots of time.

My TextExpander snippet contains Markdown for my show notes, including SearchLink codes and fill-in fields for the movies.

I initially dropped my Markdown template for the show notes into a TextExpander snippet with a fill-in field for the movie title. I put in placeholders for all my links and their labels. I’d expand the snippet into a Drafts note, launch a browser, and open a folder full of the bookmarks I needed to perform my searches – IMDB, YouTube, Amazon, the past LTS catalog.…

This is a post limited to Six Colors members.


By Shelly Brisbin

Apple Maps’ missing transit link

Apple Maps triptych
Here’s the same location, shown in Explore, Driving, and Transit modes in Maps. Each screen is a bit zoomed in, and they’re all shown in Dark Mode. Note the low-contrast gray-on-gray of the Transit view.

I am a frequent transit user — or I was before the pandemic. That distinction is important because, on a pair of recent trips, I came back into contact with transit and Apple Maps in a way that left me scratching my head. Was it like this before?

Maps image of walking on a dotted line.
Here’s what you see when you plot a transit trip and get off the bus a few blocks from your destination. Not terrible, but a turn-by-turn option with higher contrast text would be better.

When I’m in Austin, Texas, where I live, I use the excellent Transit app to find bus and train connections. Mostly, I’m looking for departure times and connections. I don’t really need turn-by-turn directions, because I know where the Republic Square station is, thank you very much.…

This is a post limited to Six Colors members.


By Shelly Brisbin

OmniFocus gains cross-platform voice automation

The latest update to the Omni Group’s OmniFocus task manager has a mild-mannered version number – 3.13 – but it includes one pretty big new feature. You can now control the app (on iPhone, iPad, or Mac) using your voice, with an assist from JavaScript-based Omni Automation features and the Voice Control accessibility feature that’s built into macOS and iOS.

Voice Control is an accessibility feature designed for those with physical disabilities, like motor delays that make it difficult to use gestures or a keyboard. Once Voice Control is enabled, you can control the Mac or iOS interface and dictate and correct text. It goes well beyond what you can do with Siri. Voice Control works inside apps, too, but OmniFocus has beefed up that support with hooks to its own automation scheme.

With Voice Control enabled, and a set of scripts provided by Omni, you can speak commands to create OmniFocus tasks, update or defer them, or hear which tasks are due, among many other options.

You can also create custom commands, or use Voice Control commands to trigger shortcuts.

Omni says Voice Control support is coming to its other apps, beginning later this year. Besides OmniFocus, the company’s productivity tools include OmniOutliner, OmniPlan and OmniGraffle.

The OmniFocus 3.13 update is free to subscribers. Voice Control requires macOS Monterey or later, or iOS 15 or later. Omni has posted a set of Voice Control automation demo videos.

[Shelly Brisbin is a radio producer, host of the Parallel podcast, and author of the book iOS Access for All. She's the host of Lions, Towers & Shields, a podcast about classic movies, on The Incomparable network.]


By Shelly Brisbin

Apple’s Accessibility feature preview gets #GAAD going

For the second year running, Apple has offered a preview of updated accessibility features coming to its platforms later this year. The announcements come just ahead of Thursday’s Global Accessibility Awareness Day, which goes by #GAAD online.

The preview is notable for spotlighting features that most people won’t use, but that matter a lot to those with disabilities. It’s also notable because it’s a departure from the company’s typical close-to-the-vest approach to what’s coming on the roadmap.

Here’s a look at what Apple announced, and why it matters:

Door Detection. LIDAR-equipped iPhones and iPads have had a feature called People Detection since iOS 14.2. Using the Magnifier app and the VoiceOver screen reader, a blind user can learn whether the device camera sees a person, and where that person is in relation to the device. That was handy for social distancing. Door Detection will use the same mechanism to alert you when the device identifies the presence of a door. That’s a more practical use of LIDAR for many blind and low-vision users than even People Detection, both indoors and out. Door Detection can tell you about the door’s attributes, including whether it’s open or closed and if there’s a doorknob. Apple says you’ll also be able to read signage, like a room number or a notice on the door. I presume that’s just an application of Live Text, but it’s a great companion for Door Detection.

The use of LIDAR in accessibility has always felt like a bit of a preview of what we might see in future Apple glasses or headset, and it’s encouraging for users of accessibility features that the company is potentially taking their needs into account as it develops future wearables. My hope is that LIDAR sensors, only available in the highest-end phones and iPads, will come to more of the iOS product line. For a blind user who doesn’t buy a phone based on high-end camera features, doing so just to get access to LIDAR-based accessibility features is a tough sell.

Live Captions. Apple joins Google, Microsoft and Zoom, among others, in offering live captions, but they’re global on iOS and macOS, so you can use them in any app with audio output. That’s the superpower here. Just pick up your device and enable captions, whatever you’re doing. Deaf and hard-of-hearing people often bemoan the state of auto-generated captions, so some testing will be warranted.

Watch accessibility improvements. Last year’s May accessibility preview, which I covered on my podcast, Parallel, brought AssistiveTouch to the Apple Watch. It’s a longstanding iOS feature that provides a simplified way to perform gestures for those with motor disabilities. This year, there are more watch gestures, called Quick Actions, and a new feature called Apple Watch Mirroring.

If you have a motor disability, Quick Actions gives you the choice to make hand gestures instead of manipulating the watch screen. An all-purpose “double-pinch” gesture will answer a call, control media playback, or take a photo. Mirroring is like AirPlay for the Apple Watch, sending the watch screen to your phone. That’s also useful for people with motor disabilities who can more easily use the phone screen than the smaller, less conveniently located watch face.

I’m intrigued by the possibilities for low-vision users, too, because the phone screen is sometimes far easier to use at close range and in zoomed mode than the watch. And you can use AssistiveTouch or Switch Control, if that’s how you interact with your phone.

Buddy Controller. Turn two game controllers into one, so two people can play a game together, with one able to assist someone with a physical disability who has difficulty manipulating some or all of the controller’s features.

Siri Pause Time adjustment. If your speech is slow or impaired, having a little extra time to complete your thought before Siri tries to act on it could make it a more useful tool.

Customization for Sound Recognition. Introduced in iOS 14, Sound Recognition allows your device to listen for sounds in your environment, like water running, a baby crying or a siren, and then notify you with a visual alert. It’s a useful tool for getting the attention of someone who’s deaf or hard of hearing. But you’re currently limited to one of 15 sounds. It’s a good list, but what if a sound you needed to know about isn’t on the list? Apple says that later this year, you’ll be able to record and save sounds you’d like to use with Sound Recognition. (Perhaps you have a unique doorbell or an appliance with a special trill?) Customization probably should have been part of Sound Recognition to begin with, but it’s common for Apple to roll out a totally new accessibility feature, then build its capabilities over time.

Apple detailed a few other new features on Tuesday, including 20 more languages for the VoiceOver screen reader and Voice Control Spotlight mode, which you can use to dictate custom word spellings.

Big Deal or No Big Deal?

This is a nice grab bag of features, with Door Detection and the Apple Watch updates offering the most intriguing possibilities. It’s also possible there are more where these came from, as occasionally happens when the late betas start to become available.

[Shelly Brisbin is a radio producer, host of the Parallel podcast, and author of the book iOS Access for All. She's the host of Lions, Towers & Shields, a podcast about classic movies, on The Incomparable network.]


By Shelly Brisbin

When in doubt, push the button

Selecting a channel in Audition. It’s ready for EQ and limiting.

Among a group of my friends and fellow podcasters, a certain kitchen gadget is known simply as the “cult device.” We enjoy the Instant Pot, but are far too cool to openly admire a mere hunk of metal and circuitry… that was not made by Apple.

In my corner of the tech-creative world, there is another cult device – Elgato’s line of Stream Deck products. Perhaps you’ve heard of them!

You may also have heard that fans of the Stream Deck tend to start small with the six-button Stream Deck Mini, later palming that little fellow off on a new convert and moving up to a model with 15 or 32 buttons. For a while, there was a brisk trade in these things among my extended circle. I purchased a Mini.

Continue reading “When in doubt, push the button”…


By Shelly Brisbin

How (and why) I publish a book every summer


Each year when Apple’s WWDC wraps up, I find myself doing what a lot of app developers do: planning my response to the upcoming version of iOS. But my summers aren’t consumed by Xcode or SwiftUI. My annual contribution to the Apple economy is a book called iOS Access for All: Your Comprehensive Guide to Accessibility for iPad, iPhone and iPod Touch. You think that title’s long? The iOS 15 edition weighs in at 215,000 words, 11 chapters, and four appendices—all written, assembled and sold by me.

Realizing I’d just published the book’s ninth edition got me thinking about how this all happened, and the extent to which I’ve managed to streamline many parts of the publishing process, while hopefully living up to the expectations of a group of readers who don’t get much attention from other iOS books and overviews.

Beginnings

Before I focused on iOS or accessibility, I wrote books about the Mac, web development, and wireless networking. I worked with book publishers whose end products were heavy, paper volumes you could find on a bookstore shelf. Even in 2012, when I first had an idea for a book about accessibility on Apple platforms, I knew the publishing model I had worked under for 10 years or so had changed dramatically. Who needs Mac Answers when you have Google?

I learned soon after iOS became accessible in 2009 that good search craft can’t find what isn’t online. Lots of knowledge about how to use Apple’s accessibility tools was ephemeral. And what was available quickly went out of date with no one having the incentives or a mandate to fix it.

I decided there should be a book about iOS accessibility, written by an experienced tech author. So, as you do, I pitched some publishers. But despite my track record and an opportunity to own a corner of the market, the answer was no. These publishers didn’t think they could sell enough copies of a book focused solely on accessibility.

It’s possible they were entirely right. But as I started embedding myself into the accessibility community, I became aware that publishers really didn’t know who might want the book I planned to write, or how to sell it to them. At least I had the advantage of being a daily user of some of what I planned to write about. I learned a lot by listening – over coffee and on Twitter – to people who make and use accessible tech every day. They reinforced in me the need to make the book, even if I couldn’t yet prove a market existed.

Self-publishing

The other thing you need to know about my plan to publish a book on my own is that I was on a budget. I would need to spend as little money as possible while still doing a professional job. I was a freelancer planning to devote months of full-time work to this endeavor. And I knew I would need to pay for things like a cover design and a copy editor. I also gave myself a travel budget, but not one for software.

At the last couple of Macworld Expos, I sat in on ebook publishing sessions, where I learned the gospel of ePub as a flexible online format that Apple was already using in what was then the iBooks Store. And from accessibility experts, I learned that it was an important part of special-purpose devices that turn text into audio for blind users.

Coming into the 2013 iOS release cycle, I had a topic, a format and publishing strategy, and a foothold in a community I would need to sell the book.

Maximum Accessibility

A major advantage of ebooks over paper ones is that they’re accessible on devices, where screen readers and text visibility options break down the barriers between readers and content. And you’ll find free ePub readers on all platforms.

As a book builder, ePub turned out to be a great choice. There are lots of ways you can build a good-looking ePub. Structurally, an ePub is just a bunch of XHTML files, images, CSS, and a manifest, all zipped together.
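That zip layout is simple enough to sketch. Here’s a minimal, hypothetical Python helper (not the author’s actual tooling; the function name and file paths are illustrative) that packages documents into an ePub container. The one spec quirk worth knowing: the `mimetype` entry must be the archive’s first file and must be stored uncompressed.

```python
import zipfile

def build_epub(path, documents):
    """Package files (archive path -> bytes) into a minimal ePub container.

    `documents` should include the OPF package file itself, e.g.
    "OEBPS/content.opf", along with the XHTML, CSS, and images.
    """
    # META-INF/container.xml tells readers where the OPF package file lives.
    container_xml = (
        '<?xml version="1.0"?>\n'
        '<container version="1.0" '
        'xmlns="urn:oasis:names:tc:opendocument:xmlns:container">\n'
        '  <rootfiles>\n'
        '    <rootfile full-path="OEBPS/content.opf" '
        'media-type="application/oebps-package+xml"/>\n'
        '  </rootfiles>\n'
        '</container>\n'
    )
    with zipfile.ZipFile(path, "w") as z:
        # The ePub spec requires "mimetype" to be the first entry, uncompressed.
        z.writestr("mimetype", "application/epub+zip",
                   compress_type=zipfile.ZIP_STORED)
        z.writestr("META-INF/container.xml", container_xml,
                   compress_type=zipfile.ZIP_DEFLATED)
        for name, data in documents.items():
            z.writestr(name, data, compress_type=zipfile.ZIP_DEFLATED)
```

Because the result is an ordinary zip archive, any ePub reader — or a screen-reader-equipped device — can unpack and render it.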

A brief flirtation with Pages taught me that using an app that was easy to write in would not yield the book I wanted. (Pages’ support for ePub improved considerably once Apple discontinued iBooks Author, but it’s still not beefy enough for my needs.) Nor did I choose the popular Scrivener, or the expensive InDesign, both of which will export fine ePubs, and in which many of my writer friends have boundless faith. Ditto Calibre and Sigil, which at least appealed to my desire to think of the book as a giant ball of text.

BBEdit in action, displaying the book’s source code in a project window.

The way I published the iOS 7 version of the book and the way I do it today are remarkably similar. I work in BBEdit on the Mac, and Textastic on the iPad, then I run ancient AppleScripts that verify the book against the ePub specification, and finally put the files together as a book. Building my book this way is a little like compiling a program. A big part of the editing process is debugging it.
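Her verification AppleScripts aren’t shown, but the “compile and debug” idea can be sketched in Python. This hypothetical checker (the function name and default OPF path are assumptions) performs one of the checks an ePub validator would: confirming that every file the OPF manifest references is actually present in the archive.

```python
import posixpath
import zipfile
import xml.etree.ElementTree as ET

OPF_NS = "{http://www.idpf.org/2007/opf}"

def missing_manifest_files(epub_path, opf_path="OEBPS/content.opf"):
    """Return manifest hrefs that point at files missing from the archive.

    `opf_path` is assumed here; a real ePub records its location in
    META-INF/container.xml.
    """
    with zipfile.ZipFile(epub_path) as z:
        names = set(z.namelist())
        root = ET.fromstring(z.read(opf_path))
        base = posixpath.dirname(opf_path)  # hrefs are relative to the OPF
        missing = []
        for item in root.iter(f"{OPF_NS}item"):
            href = item.get("href")
            target = posixpath.normpath(posixpath.join(base, href))
            if target not in names:
                missing.append(href)
        return missing
```

An empty return list means the manifest and the archive agree; anything else is the book-as-program failing to “compile.”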

Learning to Ask for the Sale

To promote the book, I became a regular at accessibility tech conferences, bought party sponsorships and tables on trade show floors (these were less costly than they might sound). I even had gimmicks, like business cards with QR code stickers on the back that were easy to scan, even if you couldn’t see them.

I used Twitter extensively, and embarked on a podcast book tour – going on any show that would have me to hawk the book. And you know what? People did have me on their shows – there seemed to be real enthusiasm for what I had made.

Acquired Wisdom

Every couple of years I re-examine my publishing process. I’ve made refinements, like adding shortcuts that help me frame and size screenshots. And because there’s demand for a PDF version, I now build an accessible one based on a Microsoft Word template.

At this point, iOS and the book itself are mature propositions. Like the rest of the software, the accessibility tools in iOS have gained incremental updates each year, but the fundamentals remain, meaning not everyone needs either a new phone or a new book each year.

But people do keep buying it and telling me how valuable it has been to them. So chances are pretty good there will be a tenth edition sometime late this fall.

[Shelly Brisbin is a radio producer, host of the Parallel podcast, and author of the book iOS Access for All. She's the host of Lions, Towers & Shields, a podcast about classic movies, on The Incomparable network.]


By Shelly Brisbin

TV for Picky Eaters

TCM on Apple TV.

I’ve noticed that folks who write about cord-cutting tend to be maximalists: “How can I get the most of all the things available?” Fair enough. A lot of people like lots of TV.

But I am not them. I am but a humble fan of classic movies with a less-than-12-hours-per-week TV habit. I’m also budget-minded, which some people might call “cheap.” Since we cut the cord a few years ago, I’ve simply done without access to TV, outside of streaming services. But recently, I’ve been on a journey to figure out how I can get the TV morsels I want, at a reasonable cost.

What I care about is the back catalog—classic movies, arthouse fare, restorations, and the odd vintage TV show. And live news. In that last interest, I’m not alone. Live news and sports are big reasons people stick to cable, or add an over-the-top service to their lives. The leading provider of classic movies is Turner Classic Movies (TCM). It’s not simply that TCM plays films from the 30s forward, but that what you find there is often not available on streaming services, or even on physical media. I dig both the volume and the rarity of what TCM does.

What I Really Really Want

In December, I decided to pull the trigger on an over-the-top service. I had two primary TV goals: bring TCM back into my house, and keep monthly subscription costs below the approximately $90 a cable subscription would have set me back.

And so, I limited my search (Suppose is Jason’s go-to tool for this) to services that offered TCM. You can get the channel as part of YouTube TV, Hulu Live, DIRECTV Stream, and Sling TV. The first three will sell you a big collection of channels, including TCM, for between $65 and $70 per month. So far, their advantage over a cable subscription – for my purposes – is the ability to watch on lots of platforms. They do each offer local channels, which is a nice bonus.

Sling TV is another kettle of fish, and the one I ended up choosing. Like the others, Sling gives you a fixed basket of channels plus add-on channels, plus DVR. My package, including TCM, costs $41 per month. Here’s how I did it.

To get Sling, you must either buy Sling Orange, promising 30+ channels, or Sling Blue with 40+. There is some overlap, but Orange has more sports and Blue, more news. Each package is $35, or you can buy the lot for $50. I picked Sling Blue, and along with news, I got lifestyle and some movie-focused channels. But not TCM. It’s not part of Sling Orange or Blue—but you can add it with the $6 per-month Hollywood Extra package. I also got FXM, CinéMoi, the Sundance Channel and six more movie-focused offerings.

But here’s the thing: I got way more than that, both in the base package and the Hollywood Extra. My channel guide lists 130+ channels, including dreadful single-franchise wastes of space, but also delightful surprises like the Film Detective, Heroes & Icons, Shout Factory and lots more.

All this for $41 per month, leaving plenty of headroom if I decide I’d like the 200-hour cloud DVR instead of the free 50-hour one, or if I want to add other special-interest channel packs. A podcast I’ve been doing lately has gotten me interested in Hallmark romance movies, for instance.

One downside of Sling is that access to local channels is limited. I’ve got exactly one.

Eye of the Beholder

Sling on iPad.

I’ve loaded Sling onto my Roku box, Apple TV and iOS devices, and added the Tizen-native app on my Samsung TV. With the Blue package, I can watch live TV on up to three simultaneous screens—which is plenty in our two-person household.

I don’t have any option to create profiles for family members. If I had kids, or a spouse who liked to watch motocross and cooking shows, I might be pretty unhappy about that. You can set parental controls, though: you create a PIN, which is then required to access specific channels or programs with content ratings you want to lock kids out of.

I find Sling’s interface cluttered and busy. You can’t choose which rows of content to show or hide. The Spotlight, Recommended for You, and Trending Live rows are all mixed up with my favorites and DVR content. I’m not a fan of algorithmic TV recommendations. Just give me the channels I’ve favorited and the shows I’ve recorded! Or give me a choice to hide recommendations.

That interface is rendered slightly differently on various operating systems. For example, on my old Roku system, I can filter channels by category, filter by favorites, list them alphabetically, or switch from a grid to a row of thumbnails. Those options don’t appear on the Apple TV version.

Not So Accessible

The Sling apps for Apple TV, Samsung Tizen, Roku and iOS work with each operating system’s screen reader features, speaking the interface and content labels aloud. That’s not a given with TV apps, many of which ignore OS accessibility altogether.

Visually, there’s not much you can do to customize the Sling apps. The gray background provides good contrast with the images and white screen text, but there’s no option to change those colors or increase text size, even on the Apple TV, which provides native support for larger text. Sling not only supports closed captioning, as you’d expect, but has a slew of options for customizing the color and style of text.

Can I Keep It?

I have really, really missed TCM. Getting my movie comfort food via Sling is affordable and flexible. Still, the small amount of time I spend watching each week might tempt me to save the $40 monthly.

It’s the quirky surprises, like the Film Detective channel, and serendipitous encounters with more modern movies that tip the scales in favor of staying subscribed. For now.



By Shelly Brisbin

All along the WashTower

Up-angle view of lighted WashTower washer window

Around my house, 2021 has been a year of durable goods upheaval. The 10-year-old refrigerator gave us a scare, the 14-year-old Prius put us on notice that its battery was soon to need replacing, and the 20-year-old washer/dryer just plain gave out.

So, as supply chain bottlenecks continue to make this a terrible time to be buying a car or major appliance, my spouse and I have been in repair and product research mode for months. The upside is that any replacement machine we buy is bound to bring better technology to our lives.

We’d just like to space it all out a bit, please.

It helps that I’m married to a person capable of hacking or fixing most machines—the fridge and the car are copacetic right now. But the old washer/dryer’s ills proved too much, even for him. Last month, we took delivery of a Wi-Fi-enabled, app-controllable LG WashTower—a washer/dryer unit that fits snugly into a narrow, low space in our utility room, and promises a “smart” laundry experience, from recommending settings based on the kind of clothes you put into it, to sending you a notification when your sheets are done. I’ll confess right now that the two nerds in this house have had a lot of fun doing laundry these past few weeks.

The hardware

The WashTower is a one-piece front-loading unit that entered the market to some hype earlier this year, and it was among the few options that would fit in our utility room. Despite that pesky old supply chain, we got ours delivered within a week of spotting it at a local Home Depot.

The all-LCD control panel is in the middle, between the washer and dryer tubs. The display is extremely easy to read, and aside from some icons whose meanings are a little obscure at first, setting up and running the machine is straightforward, with lots of presets for common laundry loads, and the promise that the machine can use the weight, and even the texture of fabrics, to recommend the best settings.

closeup of WashTower LCD

The machine feels solid, and the doors close with a satisfying thunk. There’s a soap and fabric softener dispenser drawer above the washer unit and the tubs light up when it’s time to take out your clothes. The tinkly alert sounds it plays remind me more of my rice cooker than the harsh buzz of my mom’s old top-loader. The whole effect is sturdy, modern and sleek.

The LG ThinQ app

LG sells lots of smart devices, from dishwashers and thermostats to TVs and microwaves – all of them use the same app, called LG ThinQ. Though the app can control a range of gadgets, the screenshots I’ve seen of the TV interface, for example, give the impression that LG is smart about building interfaces that match what you’re doing, rather than trying to force all devices to awkwardly share a few similar screens. Though simple, ThinQ’s WashTower interface feels as though it’s designed just for managing laundry machines.

Like a lot of network-enabled devices, you connect your phone to the WashTower’s own Wi-Fi network for initial setup, then move the machine to your home network, after which you can connect to it from any device running the app. My first stab at setup was a little rough – the WashTower kept disconnecting from our network when we turned it off. So I rebooted the Wi-Fi router, turned on the WashTower again and reinstalled and reconnected the app. We haven’t had any connection issues in several weeks.1 I can even turn each unit on remotely.2

Left: Normal cycle settings, center: Wash status, right: Standard and cloud presets
Left: Normal cycle settings. Center: Wash status. Right: Standard and cloud presets.

From the app, I can also choose wash and dry cycles, view their status and ETA, and get a notification3 – there’s only one – when a load is done. Unfortunately, that notification doesn’t specify which function is finished.4 When a dryer load is done, I want to remove and fold the clothes quickly to avoid wrinkling. When the washer finishes first, there’s less reason to scurry to the other end of the house, especially if the dryer’s still running. The single notification means I’ve got to open the app to see what’s going on.

Of course, you don’t need to wait for a notification to check on your laundry. You can use the app to find out what the unit is doing at any point, and how long the current cycle has to run. You can pause or stop the machine if you like, too.

When starting a load of laundry, you can use the app to do just about anything you can do on the unit’s LCD display: choose the type of load, desired temperature, rate of spin and rinse time, or just use one of the many presets. There’s also a “Download” preset that you can set to one of a large number of additional presets listed in the app. (You can tag favorite cycle types, including both built-in and downloadable ones—but confusingly, you can only download one at a time.)

Outsmarting the appliance

One drawback to a machine that thinks it’s smart enough to guess what settings your laundry needs is that it’s not always easy to understand what the WashTower is doing or why. When we first got the unit, everything we washed using presets was done in hot water, even in the rinse stage. That’s not what we wanted.

We sabotaged another WashTower feature, at least a little bit. When you wash a load of laundry, the dryer picks up the washer settings you used and offers them up when you press the dryer’s power button. That’s great, except that we often double up, drying the two most recent wash loads together.

Instead of randomly pairing blue things and brown things because the small loads are about the same size, I’m now starting to think more about which pairs of loads will dry best together on the same WashTower settings. That probably would have been a good idea with the old washer, but the variety of options and the sheer “smartness” of the WashTower have made me a more mindful clothes wrangler.

A few words about accessibility

I write a lot about making technology more accessible to people with disabilities. How well appliances do that is a frequent concern for potential buyers. Like a lot of smart gadgets, the availability of an app is what makes the WashTower a reasonable choice for someone who can’t see or press buttons on an LCD display. The LG ThinQ app is accessible to VoiceOver on iOS. The display is bright, with large text and extremely good contrast, for those with low vision. And the location of the display puts it in easy reach for wheelchair users.

But how well does it wash and dry?

Smart features aside, we’ve been happy with the WashTower’s results. If you’re looking for a proper performance review, a couple of outlets have put it through its paces.


  1. Some users have reported LG washers failing to talk to eero Wi-Fi access points. Your editor has an LG washer and experienced this issue. He ended up turning on his fiber router’s built-in Wi-Fi and connecting the washer to that. —J.S. 
  2. There’s a HomeBridge plugin that will put your devices on your HomeKit network, but its utility is limited. —J.S. 
  3. There’s nothing like getting a vibration on your Apple Watch telling you that it’s time to move the laundry. -J.S. 
  4. This is only an issue on combination units. -J.S. 



