Six Colors

by Jason Snell & Dan Moren


By Dan Moren

Unexpected benefits (and shortcomings) of Apple’s new Live Text feature

There are plenty of whiz-bang features in Apple’s upcoming OS updates, but to my mind, Live Text is the one poised to fundamentally change our interactions with technology. Once upon a time pictures were pictures and text was text, but now that boundary has been blown away; it’s time to rethink a lot of our assumptions.

Jason’s already documented how “useful” Live Text can be when interpreting handwritten recipes, but just in the handful of weeks that I’ve been using beta software on my iPhone, iPad, and M1 MacBook Air, I’ve already found a few unexpected applications of the technology (and at least one missed opportunity).

Low-tech definition

Reading ebooks has spoiled me. No, not because of the ability to cram a 1000-page epic tome onto a device the size of a pamphlet. Not even because of the ability to download books without leaving the comfort of my couch.

It’s the definitions. It may surprise, shock, and otherwise stagger you that I’m a bit of a language nerd, but there’s no better feature for me than being able to tap and hold on an unfamiliar word and have a definition presented.1

Recently, however, I requested a book from the library that was only available in hard copy. The horror! While I do enjoy reading paper books, this particular title happened to be rife with unusual words that I’d either never encountered or couldn’t remember. But no tapping for definitions for me! Sure, I suppose I could have simply typed the words into my phone to look them up, but it also occurred to me that this was the perfect place to use Live Text.

Live Text for paper books

So, instead, I pointed my iPhone’s camera at the page and tapped the Live Text button. Without even having to take a photo, I was able to highlight the word in question and tap iOS’s Look Up button to get the definition. No wading through Google searches or scrolling through Spotlight to find the Dictionary section. It may not be that much faster, but it has definitely decreased my cognitive load, and I found myself using this approach several times throughout the book.
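For the curious, this isn't magic confined to the Camera app: Apple has offered the same kind of on-device text recognition to developers through the Vision framework for a few years now. Here's a minimal Swift sketch (not Live Text itself, just the related VNRecognizeTextRequest API) that pulls recognized lines of text out of an image:

```swift
import Vision
import CoreGraphics

// A minimal sketch, assuming you already have a CGImage (say, a photo of a
// book page). Vision's text recognition runs entirely on-device.
func recognizeText(in image: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        for observation in observations {
            // Each observation is roughly a line of text; take the top candidate.
            if let best = observation.topCandidates(1).first {
                print(best.string)
            }
        }
    }
    request.recognitionLevel = .accurate      // favor accuracy over speed
    request.usesLanguageCorrection = true     // helps with dictionary words

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

From there, handing a selected word to the system dictionary is the easy part; the hard part, as Live Text demonstrates, is making the whole thing feel like selecting text anywhere else.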

I’ll add that this also works a treat on menus. We’ve all spotted a food we’ve never heard of and wondered “Do I want to eat that?” Well, pull out your phone, point it at the menu and tap on the word to look it up and discover that it’s delicious and, yes, you do want to eat it.

Photographic memory

A few years ago, Apple’s Photos app added the ability to search for specific items—dogs, for example, or cars. Now, with the advent of Live Text, you can search for specific text inside a photo. This is great if, for example, you can’t remember when you took that picture of that new restaurant.

Recently I was on vacation and had to call a store about an order I’d placed. This being the Dark Ages, the order had been placed in person and I had only a paper receipt. So before I went on vacation, I snapped a picture of that receipt. Only problem was it was mixed in with a bunch of screenshots I’d been taking for a freelance piece, so it didn’t exactly pop out when I scrolled back through my Photo Library.

Live Text in Photos
Puzzlingly, searching for text in your photos works in Spotlight, but not in the Photos app.

Live Text to the rescue, I figured: I could search for the name of the store on the invoice and it ought to show up. But when I tried it, no dice: Photos told me there were no images that matched my search.

So I complained about this in our very own Six Colors Slack, and eagle-eyed reader Mihir pointed out that searching for text in Spotlight does surface photos that contain those words.

Perhaps this is just an oversight, a bug, or something that Apple hasn’t implemented yet, but it seems puzzling. Gratified as I am that this functionality exists, it would never occur to me to search for a photo in Spotlight. It’s a tremendously useful feature, and here’s hoping the continuing beta process puts it where it belongs: in the Photos app.

Comincaptcha

So, yes, Live Text has its handy uses, but it also has some pretty big potential implications. Take CAPTCHAs, for example. We’re all familiar with these insidious tests to prove our humanity when logging on to websites.

Live Text and CAPTCHAs

While a lot of places online have switched to Google’s reCAPTCHA system—which is really designed to train the company’s autonomous driving system2—there remain some sites that rely on other methods, including strangely formatted text that’s hard for a person to read, but supposedly impossible for a computer.

Or, at least, it was. While testing out another feature of iOS 15, I discovered that Live Text can sometimes now understand said strangely formatted text. Tapping on the CAPTCHA let me select the text, so it clearly recognized it as letters, and in some cases, it correctly parsed the text as well.

Which, okay, great for those of us who have trouble sussing out what those weird squiggles are supposed to be, but also not particularly great because now computers are helping us prove our humanity, which seems to kind of obviate the point of these tests in the first place. (Granted, as Live Text didn’t nail the CAPTCHA 100 percent of the time, there’s still hope for us humans—at least for now.)

Given that machine learning models have also gotten better at identifying the items in images—in large part because we have trained them to recognize those items—it seems as though the effectiveness of CAPTCHAs is on the verge of diminishing. So what next: do we have to push these tests on to something else in an ever-escalating arms race? Or perhaps every website will start requiring tests that ask us why we haven’t helped a tortoise lying on its back.


  1. Much as I miss that truly enormous Webster’s dictionary we had in my house growing up. I can still remember the smell of it. 
  2. Come on: crosswalks? Bicycles? Stop lights? Fire hydrants? All things you don’t want your autonomous cars driving into or through. 

[Dan Moren is the East Coast Bureau Chief of Six Colors. You can find him on Mastodon at @dmoren@zeppelin.flights or reach him by email at dan@sixcolors.com. His latest novel, the supernatural detective story All Souls Lost, is now available for pre-order.]
