Though I’ve just cut the cord and dumped traditional cable TV, the truth is, I’m cheating. I’ve subscribed to what’s called an “over the top” TV service, which provides a bundle of live TV channels—essentially, it’s a cable TV plan that’s done entirely via streaming. This has required me to replace my venerable TiVo with an app that runs on my Apple TV.
The service is Fubo TV, for which I’m paying $60 a month. I can’t give up live TV, mostly because of sports, so it’s fitting that I ended up with Fubo TV—it originated as a sports streaming service, though it now also carries local channels and cable entertainment channels.
I chose Fubo TV because of its channel lineup, which included all the sports channels I required. (YouTube TV came close, and I intend to test-drive it in the future and write about it here when I do.) But channels are only part of the equation when it comes to TV service: There’s the TV interface itself. How can an app measure up to an old-fashioned TiVo box attached to a cable plan?
The short answer is, it’s a step back in terms of functionality, with a lot of room for improvement. It’s usable, but can be frustrating. Let’s get into it.
The tech we can’t seem to let go of, what annoyances we’d fix with $50, our feelings on smart TVs, and how automated photo-surfacing apps work (or don’t) for us.
There are plenty of whiz-bang features in Apple’s upcoming OS updates, but to my mind, Live Text is the one poised to fundamentally change our interactions with technology. Once upon a time pictures were pictures and text was text, but now that boundary has been blown away; it’s time to rethink a lot of our assumptions.
Jason’s already documented how “useful” Live Text can be when interpreting handwritten recipes, but just in the handful of weeks that I’ve been using beta software on my iPhone, iPad, and M1 MacBook Air, I’ve already found a few unexpected applications of the technology (and at least one missed opportunity).
Low-tech definition
Reading ebooks has spoiled me. No, not because of the ability to cram a 1000-page epic tome onto a device the size of a pamphlet. Not even because of the ability to download books without leaving the comfort of my couch.
It’s the definitions. It may surprise, shock, and otherwise stagger you that I’m a bit of a language nerd, but there’s no better feature for me than being able to tap and hold on an unfamiliar word and have a definition presented.1
Recently, however, I requested a book from the library that was only available in hard copy. The horror! While I do enjoy reading paper books, this particular title happened to be rife with unusual words that I’d either never encountered or couldn’t remember. But no tapping for definitions for me! Sure, I suppose I could have simply typed the words into my phone to look them up, but it also occurred to me that this was the perfect place to use Live Text.
So, instead, I pointed my iPhone’s camera at the page and tapped the Live Text button. Without even having to take a photo, I was able to highlight the word in question and tap iOS’s Look Up button to get the definition. No wading through Google searches or scrolling through Spotlight to find the Dictionary section. It may not be that much faster, but it has definitely decreased my cognitive load, and I found myself using this approach several times throughout the book.
I’ll add that this also works a treat on menus. We’ve all spotted a food we’ve never heard of and wondered “Do I want to eat that?” Well, pull out your phone, point it at the menu and tap on the word to look it up and discover that it’s delicious and, yes, you do want to eat it.
Photographic memory
A few years ago, Apple’s Photos app added the ability to search for specific items—dogs, for example, or cars. Now, with the advent of Live Text, you can search for specific text inside a photo. This is great if, for example, you can’t remember when you took that picture of that new restaurant.
Recently I was on vacation and had to call a store about an order I’d placed. This being the Dark Ages, the order had been placed in person and I had only a paper receipt. So before I went on vacation, I snapped a picture of that receipt. Only problem was it was mixed in with a bunch of screenshots I’d been taking for a freelance piece, so it didn’t exactly pop out when I scrolled back through my Photo Library.
Puzzlingly, searching for text in your photos works in Spotlight, but not in the Photos app.
Live Text to the rescue, I figured: I could search for the name of the store on the invoice and it ought to show up. But when I tried it, no dice: Photos told me there were no images that matched my search.
So I complained about this in our very own Six Colors slack, and eagle-eyed reader Mihir pointed out that searching for text in Spotlight does surface photos that contain those words.
Perhaps this is just an oversight, a bug, or something that Apple hasn’t implemented yet, but it seems puzzling. Gratified as I am that this functionality exists, it would never occur to me to search for a photo in Spotlight. It’s a tremendously useful feature, and here’s hoping the continuing beta process puts it where it belongs: in the Photos app.
Comincaptcha
So, yes, Live Text has its handy uses, but it also has some pretty big potential implications. Take CAPTCHAs, for example. We’re all familiar with these insidious tests to prove our humanity when logging on to websites.
While a lot of places online have switched to Google’s reCAPTCHA system—which is really designed to train the company’s autonomous driving system2—there remain some sites that rely on other methods, including strangely formatted text that’s hard for a person to read, but supposedly impossible for a computer.
Or, at least, it was. While testing out another feature of iOS 15, I discovered that Live Text can sometimes now understand said strangely formatted text. Tapping on the CAPTCHA let me select the text, so it clearly recognized it as letters, and in some cases, it correctly parsed the text as well.
Which, okay, great for those of us who have trouble sussing out what those weird squiggles are supposed to be, but also not particularly great because now computers are helping us prove our humanity, which seems to kind of obviate the point of these tests in the first place. (Granted, as Live Text didn’t nail the CAPTCHA 100 percent of the time, there’s still hope for us humans—at least for now.)
Given that machine learning models have also gotten better at identifying the items in images—in large part because we have trained them to recognize those items—it seems as though the effectiveness of CAPTCHAs is on the verge of diminishing. So what next: do we have to push these tests on to something else in an ever-escalating arms race? Or perhaps every website will start requiring tests that ask us why we haven’t helped a tortoise lying on its back.
Much as I miss that truly enormous Webster’s dictionary we had in my house growing up. I can still remember the smell of it. ↩
Come on: crosswalks? Bicycles? Stop lights? Fire hydrants? All things you don’t want your autonomous cars driving into or through. ↩
[Dan Moren is the East Coast Bureau Chief of Six Colors, as well as an author, podcaster, and two-time Jeopardy! champion. You can find him on Mastodon at @dmoren@zeppelin.flights or reach him by email at dan@sixcolors.com. His next novel, the sci-fi adventure Eternity's Tomb, will be released in November 2026.]
Apple’s out-of-the-blue announcement last week that it was adding a bunch of features to iOS involving child sexual abuse materials (CSAM) generated an entirely predictable reaction. Or, more accurately, reactions. Those on the law-enforcement side of the spectrum praised Apple for its work, and those on the civil-liberties side accused Apple of turning iPhones into surveillance devices.
It’s not surprising at all that Apple’s announcement would be met with scrutiny. If anything is surprising about this whole story, it’s that Apple doesn’t seem to have anticipated all the pushback its announcement received. The company had to post a Frequently-Asked Questions file in response. If Q’s are being FA’d in the wake of your announcement, you probably botched your announcement.
Such an announcement deserves scrutiny. The problem for those seeking to drop their hot takes about this issue is that it’s extremely complicated and there are no easy answers. That doesn’t mean that Apple’s approach is fundamentally right or wrong, but it does mean that Apple has made some choices that are worth exploring and debating.
This is a good interview between Matthew Panzarino of TechCrunch and Erik Neuenschwander, head of Privacy at Apple, about Apple’s announced child-abuse-related features. A highlight:
We have two co-equal goals here. One is to improve child safety on the platform and the second is to preserve user privacy. And what we’ve been able to do across all three of the features is bring together technologies that let us deliver on both of those goals.
You can see this in how the features were designed—Apple is trying to balance these two goals. (How well it did remains the subject of discussion.) Not all of Neuenschwander’s answers are entirely satisfying, but they’re still informative about how Apple is approaching this issue.
But let’s hold our focus on the most important point of inquiry: What’s the deal with the cards? When Bogost looked into them in May, a historian at the CDC guessed that their design was likely inherited, but no one seemed to know from what. “Like so much of our vaccine rollout, I’m guessing someone had to produce this in, like, eight hours,” Buttenheim said. “There was not time to workshop it and focus-group it and pressure-test it and rapid-cycle prototype it.”
Mull also points to issues trying to make the existing vaccine cards work with electronic systems:
New York City’s smartphone verification app—not to be confused with New York State’s Excelsior Pass, or its new Excelsior Plus Pass—appears to accept photos of restaurant menus as proof of vaccination. A spokesperson for Mayor Bill de Blasio has said that’s because the city’s app doesn’t verify anything; it simply gives users a place to store a photo of their vaccine card.
We discuss Apple’s multiple announcements related to child safety, including what prompted Apple’s actions, the different ways any technological tool can be used, where Apple has chosen to intervene, and the dangers of sliding down a slippery slope. In lighter news, we also talk about Apple’s rediscovery of its online store and various attempts for streaming services to build new franchises. Also, alert Broadway and the West End: we may have invented a new segment.
The Cyber@BGU team—consisting of Ben Nassi, Yaron Pirutin, Tomer Gator, Boris Zadov, and Professor Yuval Elovici—analyzed a broad array of widely used consumer devices including smart speakers, simple PC speakers, and USB hubs. The team found that the devices’ power indicator LEDs were generally influenced perceptibly by audio signals fed through the attached speakers.
Although the fluctuations in LED signal strength generally aren’t perceptible to the naked eye, they’re strong enough to be read with a photodiode coupled to a simple optical telescope. The slight flickering of the power LED, caused by voltage changes as the speakers draw current, is converted into an electrical signal by the photodiode; that signal can then be run through a simple analog-to-digital converter (ADC) and played back directly.
Wild.
This isn’t something that most people need to worry about, thanks to several reasons outlined in the piece (the need for line of sight, the inability to capture any local audio, and so on), but it’s still kind of amazing to think about.
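For the curious, the recovery step the researchers describe—strip out the LED’s steady brightness, keep the tiny audio-driven ripple, and play it back—can be sketched in a few lines. This is a minimal illustration of the idea only, not the team’s actual pipeline; the function name and inputs (a list of photodiode readings and a sample rate) are hypothetical stand-ins for the real telescope-and-ADC capture rig.

```python
import struct
import wave


def led_samples_to_wav(samples, sample_rate, path):
    """Turn raw photodiode readings (arbitrary units) into a playable
    16-bit mono WAV file. The small AC ripple riding on the LED's DC
    brightness is what carries the audio."""
    # Remove the DC offset (the LED's steady-state brightness).
    mean = sum(samples) / len(samples)
    ac = [s - mean for s in samples]

    # Rescale the remaining ripple to the full 16-bit signed range.
    peak = max(abs(v) for v in ac) or 1.0
    pcm = [int(32767 * v / peak) for v in ac]

    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)   # mono
        wav.setsampwidth(2)   # 16-bit samples
        wav.setframerate(sample_rate)
        wav.writeframes(struct.pack("<%dh" % len(pcm), *pcm))
```

In practice the hard parts are everything this sketch skips: optics, line of sight, and an ADC fast and clean enough to catch a ripple that small.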
Last week, Google took the wraps off its upcoming Pixel 6 line of smartphones—though, in very Google fashion, it didn’t spill all the details about the devices, just dropped enough to tease consumers, with a promise that more would be shared this fall.
But even that brief look clues us in a little bit about the state of smartphones in 2021. Google may not, thus far, be a maker of top-tier flagship smartphones (a market it’s largely ceded to device makers like Samsung), but as the steward of the Android platform, it obviously has a vested interest in its success, so it’s no surprise that Google is taking notes from Apple’s playbook—though it’s certainly not a one-way street.
Google is probably hoping to steal some thunder from Apple with its Pixel announcement, given that the annual announcement of the latest iPhone is likely due next month. While that strategy may or may not pay off, it does mean that we can probably start to pinpoint where these two major players have decided to put their energy and resources, which in turn gives us a peek at where the future of the smartphone market is headed.
Rich Mogull and Glenn Fleishman have a detailed look at Apple’s announcement of multiple software changes involving photos and illegal material:
Apple’s announcement headlined these changes as “Expanded Protections for Children.” That may be true, but it could easily be argued that Apple’s move jeopardizes its overall privacy position, despite the company’s past efforts to build in safeguards, provide age-appropriate insight for parents about younger children, and rebuff governments that have wanted Apple to break its end-to-end encryption and make iCloud less private to track down criminals (see “FBI Cracks Pensacola Shooter’s iPhone, Still Mad at Apple,” 19 May 2020).
You may have a lot of questions. We know we did. Based on our experience and the information Apple has made public, here are answers to some of what we think will be the most common ones.
Very much worth a read. This is a complicated and troubling topic.
The return of our pandemic tech setups, our thoughts on Citizen’s premium Protect offering, whether we’ve replaced complicated tech with something less complicated, and how we do — or don’t — stream audio throughout our homes.
Earlier this week, Apple announced a range of new GPU options for the Mac Pro, adding support for AMD’s RDNA2 architecture via its own MPX module format. All three of the new options are overkill for my uses, so I’ll be sticking with my Radeon Pro W5700X, which was the first additional GPU offered by Apple beyond the options that originally shipped with the machine.
Since the Mac Pro’s late 2019 launch, Apple has also added options for 8 TB of storage, not to mention the parts that let a user switch from feet to wheels and back again.
All in all, there are almost two dozen components on Apple’s online store that can be installed inside the Mac Pro, including GPUs, SSD modules, cables, drive enclosures, and RAM kits.
The ability to upgrade a machine over time is exactly why some users are drawn to the Mac Pro—and one reason the 2013 model was such a dud. No, the Mac Pro isn’t as open as the old-school Power Mac G3 and G4 towers, but even this level of upgradability isn’t present anywhere else on the Mac.
This hardware is not cheap—not even close—but the flexibility is there if you’re willing to pay for it.
I’m encouraged to see Apple still putting out new parts for this Mac, and not only because one is silently doing its thing under my desk. When Apple announced the Mac Pro, it was making a promise to high-end users that the company wasn’t going to ghost them again as it had with the previous model. Apple made that commitment while knowing that Macs running Apple silicon were just around the corner.
I hope this newfound willingness to support users with high-end and esoteric needs continues into the Apple silicon era. The time of Intel Macs and the 2019 Mac Pro is inevitably drawing to a close, but that doesn’t mean that the period of Apple offering high-end Mac users a computer with plenty of upgrade options has to end.
[Stephen Hackett is the author of 512 Pixels and co-founder of Relay FM.]
The iPad Pro has frequently been an incubator for technology that Apple ultimately plans on rolling out to the rest of its product line. Last year, the iPad Pro got a LiDAR scanner months before it appeared in the iPhone 12. This year’s 12.9-inch model introduced the mini-LED screen technology that will probably be showing up very soon in a new line of MacBook Pro laptops. And this fall’s iPhone Pro models are rumored to come with high-refresh-rate displays, pioneered years ago on the iPad as ProMotion.
But there’s another core Apple technology of the future that’s currently available only on the 2021 iPad Pro. And I’m confident that, in the next couple of years, you’ll see it spread across most (but not all) of Apple’s products: Center Stage.
Center Stage uses machine-learning technology to pan and zoom in a camera’s field of view to get the perfect shot during a FaceTime call or other videoconference. It will zoom in on a single subject, or zoom out to find every person in the frame. If you haven’t tried Center Stage, you’ll need to trust me: It’s great. And having experienced it for months on my iPad Pro, I now want it everywhere. It’s too good a feature not to be, and as soon as possible.
It’s not dead yet! On Tuesday, Apple rolled out three new graphics card modules for the Intel-based Mac Pro, all based on AMD’s Radeon Pro W6000 series GPU. (Apple posted a Mac Pro performance white paper to celebrate.) The new modules (in Apple’s MPX format) come in three variants, with a Radeon Pro W6800X, two W6800X GPUs, and the W6900X. Each module also adds four Thunderbolt 3 ports and an HDMI 2 port to the Mac Pro.
The Mac Pro supports two MPX modules, so you could pop in two of the dual-GPU modules to max out performance. They can connect using AMD’s Infinity Fabric Link, which can connect up to four GPUs to communicate with one another via a super-fast connection with much more bandwidth than is available via the PCIe bus.
The new modules are replacing previous modules featuring AMD Vega II GPUs, and will be offered as configure-to-order options for new Mac Pros and as standalone kits. Get ready now: it’s $2800 for the Radeon Pro W6800X, $5000 for the W6800X duo, and $6000 for the W6900X. Apple’s also removing the Radeon Pro Vega II and Pro Vega II Duo MPX modules as configurable options with new systems, but will still sell them as standalone kits for $2200 and $4400, respectively.
For those who want the benefits of biometric authentication for their existing Mac, good news: the Magic Keyboard with Touch ID, recently introduced alongside the M1 iMac, is now available for individual purchase. The standard model will run $149; the version with a numeric keypad (and, more importantly, an inverted-T arrow key configuration), $179. Both include a Lightning-to-USB-C cable for pairing, charging, and wired usage.
One reminder: These keyboards are only compatible with Macs powered by Apple silicon running macOS 11.4 or later. (It’s possible that the keyboard functionality will still work with older Macs, but the Touch ID features will certainly not.)
Unfortunately, though the keyboards are available color-matched when sold with iMacs, the individual versions are in the classic silver/white scheme only. So if you want to go the two-tone route, you’ll need to find an iMac owner willing to part with theirs after the fact.
Starting Aug. 10, we will no longer support crossword play on Across Lite. This means we won’t provide downloadable .puz files for use on that platform. You can play the NYT crossword on our Crossword App and on desktop and mobile web.
This decision will limit crossword players to the Times’s own app and website. Previously, you could play it in various third-party apps, including Red Sweater’s Black Ink on Mac and iOS. This decision reduces the choices of paying subscribers and has some accessibility implications, too.