Six Colors

by Jason Snell & Dan Moren



By Shelly Brisbin

SearchLink removes the drudgery from making web links

My classic movie podcast, Lions, Towers & Shields over at The Incomparable, uses a standard show-notes format. On each episode, we talk about one old movie, and I always link to places where you can stream it, buy physical media, and learn more about the film we’re watching. Even after I came up with a standard Markdown template I liked, I found myself doing a whole lot of tinkering, mostly Web searches, for each episode’s notes. So I decided to automate the process. I now have a macOS workflow that saves me lots of time.

My TextExpander snippet contains Markdown for my show notes, including SearchLink codes and fill-in fields for the movies.

I initially dropped my Markdown template for the show notes into a TextExpander snippet with a fill-in field for the movie title. I put in placeholders for all my links and their labels. I’d expand the snippet into a Drafts note, launch a browser, and open a folder full of the bookmarks I needed to perform my searches – IMDb, YouTube, Amazon, the past LTS catalog.…
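To make that concrete, here’s a rough sketch of the kind of template this produces. The field name and the specific SearchLink codes below are illustrative, not the actual snippet: a TextExpander fill-in field stands in for the movie title, and SearchLink codes like !g (Google), !a (Amazon), and !wiki (Wikipedia) stand in for the URLs.

```markdown
This week's movie: %filltext:name=Movie%

- [Stream or buy](!a %filltext:name=Movie% movie)
- [About the film](!wiki %filltext:name=Movie% film)
- [Trailer](!g %filltext:name=Movie% trailer)
```

Expand the snippet, fill in the title once, then run SearchLink (it installs as a macOS Service) over the text; each (!…) code is swapped for a real link, and the browser-and-bookmark-folder step mostly goes away.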

This is a post limited to Six Colors members.


By Shelly Brisbin

Apple Maps’ missing transit link

Here’s the same location, shown in Explore, Driving, and Transit modes in Maps. Each screen is a bit zoomed in, and they’re all shown in Dark Mode. Note the low-contrast gray-on-gray of the Transit view.

I am a frequent transit user — or I was before the pandemic. That distinction is important because, on a pair of recent trips, I came back into contact with transit and Apple Maps in a way that left me scratching my head. Was it like this before?

Here’s what you see when you plot a transit trip and get off the bus a few blocks from your destination. Not terrible, but a turn-by-turn option with higher contrast text would be better.

When I’m in Austin, Texas, where I live, I use the excellent Transit app to find bus and train connections. Mostly, I’m looking for departure times and connections. I don’t really need turn-by-turn directions, because I know where the Republic Square station is, thank you very much.…

This is a post limited to Six Colors members.


By Shelly Brisbin

OmniFocus gains cross-platform voice automation

The latest update to the Omni Group’s OmniFocus task manager has a mild-mannered version number – 3.13 – but it includes one pretty big new feature. You can now control the app (on iPhone, iPad, or Mac) using your voice, with an assist from JavaScript-based Omni Automation features and the Voice Control accessibility feature that’s built into macOS and iOS.

Voice Control is an accessibility feature designed for those with physical disabilities, like motor delays that make it difficult to use gestures or a keyboard. Once Voice Control is enabled, you can control the Mac or iOS interface and dictate and correct text. It goes well beyond what you can do with Siri. Voice Control works inside apps, too, but OmniFocus has beefed up that support with hooks to its own automation scheme.

With Voice Control enabled, and a set of scripts provided by Omni, you can speak commands to create OmniFocus tasks, update or defer them, or hear which tasks are due, among many other options.

You can also create custom commands, or use Voice Control commands to trigger shortcuts.
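For a sense of the machinery behind those scripts, here’s a minimal Omni Automation sketch in JavaScript (an illustrative plug-in skeleton, not one of Omni’s actual Voice Control scripts) that creates an OmniFocus task when invoked:

```javascript
/*{
    "type": "action",
    "targets": ["omnifocus"],
    "identifier": "com.example.add-sample-task",
    "version": "1.0",
    "description": "Illustrative sketch: adds a task to OmniFocus.",
    "label": "Add Sample Task",
    "shortLabel": "Add Sample Task"
}*/
(() => {
    // Runs when the plug-in is invoked (from the Automation menu, by URL,
    // or by a voice command layered on top of it).
    const action = new PlugIn.Action(function (selection, sender) {
        // new Task(name) creates a task; with no position argument it
        // lands in the Inbox.
        const task = new Task("Sample task created by script");
        task.note = "Created by an Omni Automation plug-in sketch.";
    });

    // No preconditions for this simple example.
    action.validate = function (selection, sender) {
        return true;
    };

    return action;
})();
```

Plug-ins like this show up in OmniFocus’s Automation menu and can also be run by URL; Omni’s Voice Control support layers spoken phrases on top of the same scripting machinery.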

Omni says Voice Control support is coming to its other apps, beginning later this year. Besides OmniFocus, the company’s productivity tools include OmniOutliner, OmniPlan and OmniGraffle.

The OmniFocus 3.13 update is free to subscribers. Voice Control requires macOS Monterey or later, or iOS 15 or later. Omni has posted a set of Voice Control automation demo videos.

[Shelly Brisbin is a radio producer, host of the Parallel podcast, and author of the book iOS Access for All. She's the host of Lions, Towers & Shields, a podcast about classic movies, on The Incomparable network.]


By Shelly Brisbin

Apple’s Accessibility feature preview gets #GAAD going

For the second year running, Apple has offered a preview of updated accessibility features coming to its platforms later this year. The announcements come just ahead of Thursday’s Global Accessibility Awareness Day, which goes by #GAAD online.

The preview is notable for spotlighting features that most people won’t use, but that matter a lot to those with disabilities. It’s also notable because it’s a departure from the company’s typical close-to-the-vest approach to what’s coming on the roadmap.

Here’s a look at what Apple announced, and why it matters:

Door Detection. LIDAR-equipped iPhones and iPads have had a feature called People Detection since iOS 14.2. Using the Magnifier app and the VoiceOver screen reader, a blind user can learn whether the device camera sees a person, and where that person is in relation to the device. That was handy for social distancing. Door Detection will use the same mechanism to alert you when the device identifies the presence of a door. That’s a more practical use of LIDAR for many blind and low-vision users than even People Detection, both indoors and out. Door Detection can tell you about the door’s attributes, including whether it’s open or closed and if there’s a doorknob. Apple says you’ll also be able to read signage, like a room number or a notice on the door. I presume that’s just an application of Live Text, but it’s a great companion for Door Detection.

The use of LIDAR in accessibility has always felt like a bit of a preview of what we might see in future Apple glasses or headset, and it’s encouraging for users of accessibility features that the company is potentially taking their needs into account as it develops future wearables. My hope is that LIDAR sensors, only available in the highest-end phones and iPads, will come to more of the iOS product line. For a blind user who doesn’t buy a phone based on high-end camera features, doing so just to get access to LIDAR-based accessibility features is a tough sell.

Live Captions. Apple joins Google, Microsoft and Zoom, among others, in offering live captions, but they’re global on iOS and macOS, so you can use them in any app with audio output. That’s the superpower here. Just pick up your device and enable captions, whatever you’re doing. Deaf and hard-of-hearing people often bemoan the state of auto-generated captions, so some testing will be warranted.

Watch accessibility improvements. Last year’s May accessibility preview, which I covered on my podcast, Parallel, brought AssistiveTouch to the Apple Watch. It’s a longstanding iOS feature that provides a simplified way to perform gestures for those with motor disabilities. This year, there are more watch gestures, called Quick Actions, and a new feature called Apple Watch Mirroring.

If you have a motor disability, Quick Actions gives you the choice to make hand gestures instead of manipulating the watch screen. An all-purpose “double-pinch” gesture will answer a call, control media playback, or take a photo. Mirroring is like AirPlay for the Apple Watch, sending the watch screen to your phone. That’s also useful for people with motor disabilities who can more easily use the phone screen than the smaller, less conveniently located watch face.

I’m intrigued by the possibilities for low-vision users, too, because the phone screen is sometimes far easier to use at close range and in zoomed mode than the watch. And you can use AssistiveTouch or Switch Control, if that’s how you interact with your phone.

Buddy Controller. Turns two game controllers into one, so two people can play a game together and one player can assist the other, who may have a physical disability that makes some or all of the controller’s features difficult to use.

Siri Pause Time adjustment. If your speech is slow or impaired, having a little extra time to complete your thought before Siri tries to act on it could make it a more useful tool.

Customization for Sound Recognition. Introduced in iOS 14, Sound Recognition allows your device to listen for sounds in your environment, like water running, a baby crying or a siren, and then notify you with a visual alert. It’s a useful tool for getting the attention of someone who’s deaf or hard of hearing. But you’re currently limited to a fixed list of 15 sounds. It’s a good list, but what if a sound you need to know about isn’t on it? Apple says that later this year, you’ll be able to record and save sounds you’d like to use with Sound Recognition. (Perhaps you have a unique doorbell or an appliance with a special trill?) Customization probably should have been part of Sound Recognition to begin with, but it’s common for Apple to roll out a totally new accessibility feature, then build its capabilities over time.

Apple detailed a few other new features on Tuesday, including 20 more languages for the VoiceOver screen reader and Voice Control Spelling mode, which you can use to dictate custom word spellings.

Big Deal or No Big Deal?

This is a nice grab bag of features, with Door Detection and the Apple Watch updates offering the most intriguing possibilities. It’s also possible there are more where these came from, as occasionally happens when the late betas start to become available.

[Shelly Brisbin is a radio producer, host of the Parallel podcast, and author of the book iOS Access for All. She's the host of Lions, Towers & Shields, a podcast about classic movies, on The Incomparable network.]


