Six Colors

Apple, technology, and other stuff


By Joe Rosensteel

Picking up the missing pieces of Apple’s Creator Studio

In October, I wrote a little piece about my concern over the lack of a clear strategy for Apple’s creativity apps—in particular, the recently acquired Pixelmator and Photomator, as well as the inconsistent development effort behind Apple’s video apps. As speculated and rumored, progress was being held back for a pro apps bundle, Creator Studio, but only sorta-kinda.

It’s not the first time Apple has made a bundle for its pro creativity software. People may remember Final Cut Studio, which included Final Cut Pro, Motion, and Compressor (as well as other software). So Apple glued Logic Pro to that. Then they glued Pixelmator Pro to that. Then they said, “Do you know what’s just like these? (Pregnant pause.) Our productivity office applications.” and bundled those in as well, but only freemium beta features. When I think of Final Cut Pro, I certainly think of “extra things I can do in Freeform.”

While it’s positive that people who have paid for apps, or just like iWork, get to continue to use them, the way the features are partitioned between all of these versions makes very little sense, and will probably make even less sense over time. Take Pixelmator, which is supposed to be Apple’s answer to Photoshop. It will still be available to buy in the Mac App Store, but won’t have the new warp tool.

A screenshot of the Apple product page for Pixelmator Pro explaining the function of the warp tool. A curved outline is on a mug.
The only subscribers-only feature.

Sure, that’s the one thing in your Photoshop competitor that requires a subscription, the improved warp tool.

Just one more thing…

What’s the deal with Photomator?

Pixelmator Pro received an update and an iPadOS app, but there are no updates for Photomator, a Lightroom analog, and it is not a part of the Creator Studio bundle. It continues to be a product that you can pay a separate subscription for on a monthly ($8), or yearly (there are three $30 a year plans, and one $40 a year plan, with no clear differences) basis that offers nothing beyond bug fixes.

It seems unlikely Apple is going to kill Photomator, because when Creator Studio was announced, Apple didn’t say anything about it—while it announced that Pixelmator Classic for iPhone wasn’t going to receive any updates.

We’ll get back to Photomator. But first, let me explain how maddening that announcement about Pixelmator Classic was.

Pixelmator Classic was the only app on the iPhone that was truly Photoshop-like. Adobe’s own iPhone Photoshop apps (there have been many) have all been attempts at reimagining Photoshop for the iPhone, seemingly aimed at customers who find Photoshop too intimidating. Unfortunately, Adobe’s solutions have all been less elegant than Pixelmator Classic’s.

I pay Adobe for Creative Cloud and don’t use their Photoshop iPhone app. Instead, I use Pixelmator Classic, which is bizarrely being put out to pasture with no imminent or announced replacement. Maybe there will eventually be a Pixelmator for iPhone, and maybe that will eventually be in this Creator Studio bundle. There will simply be no way to know until, and unless, it happens. Apple loves its little surprises!

Why not forecast that possibility by telling us what will happen with the multi-platform app Photomator? It’s the direct analog to Lightroom, making it the most obvious missing piece in Apple’s bundle. If it’s because there are no updates to announce for Photomator after over a year, then I would ask, “Why is Apple charging $30 a year for the existing version of Photomator?”

If it’s because Photomator will instead be a $30 a year freemium unlock for the Photos app, then I would ask, “What’s the Creator Studio bundle for if it doesn’t include photography? And why is Apple still charging $30 a year?”

Let’s say Apple is going to make it a separate up-sell for Photos. Then we’ll probably find out in June, but it won’t ship until the fall. Conveniently, that gives me just enough time to start another yearly billing cycle for Photomator, so I will have paid $60 since Apple acquired Photomator and did nothing with it.

I am not arguing that Photomator should be free. Free is unhealthy, because then there’s no motivation to improve the software. I’m arguing that if there’s a subscription fee I’m paying annually, then there needs to be at least annual development of the software. I don’t need massive updates, but I need some sign that there is, and will continue to be, value in paying an annual subscription.

This isn’t software from a small, independent company any longer. It’s now software from Apple, which embarrassingly struggles to release its yearly OS updates for its platforms, and still can’t match its multi-platform apps feature for feature.

There’s something very strange about Apple branding this as a Creator Studio, seemingly targeting independent “content” creators, while not offering a single iPhone app specializing in video, photo, illustration, image editing, or music—when the iPhone is the platform those creators care about most as the final destination for their work.

That just leads me back to the same conclusion that I drew in October: What is the promise Apple is making by asking for these annual payments? The mismatched nature of the bundle, and Apple’s spotty updates before this, make me question whether it thinks that just continuing things as they were, but with a subscription fee, is good enough.

Needless to say, I’m skeptical of the Creator Studio being a Creative Cloud replacement when Apple can’t even say what its answer to Lightroom is, despite having bought its own answer to Lightroom over a year ago.

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


By Joe Rosensteel

When you give tech gifts, also give the gift of installation

A smartwatch with a black band hangs from a Christmas tree adorned with lights, beads, and ornaments.

‘Tis the season to be harried. There’s a work thing, a friend thing, some poor sucker has a birthday in December, shopping for food, shopping for presents, donations, shopping for food again because you forgot something… The list of things to do is endless. So when you do get a gift lined up for a loved one, and it’s something from a premium electronics brand like Apple, you might feel like you’ve done your job. The recipient will be so excited to open that new Apple Watch for the holidays, but remember that they may not really be prepared for a multi-part setup process.

I’ll share a slightly cautionary tale of giving myself an Apple Watch Series 11 as a combination Christmas and “It’s too bad my birthday is close to Christmas” gift.

The battery on my Apple Watch Series 7 was no longer lasting the day, and that is the primary reason to upgrade an Apple Watch these days. Whoever you’re gifting an Apple device to is probably in a similar situation, where it’s mostly the battery or physical damage, so it seems like a straightforward gift.

First of all, the new Watch paired with my iPhone just fine, but it needed to download and install watchOS 26.1. That took forever, and it lacked an accurate estimate of when user intervention would be needed again.

Sure, there are several spans of time that are mentioned, but it might as well be a random number generator. I just kept checking my iPhone over and over by unlocking it, and waiting for the interface to refresh to tell me what cryptic step it was on.

That’s a really crummy experience, since the iPhone and the Watch need to be near each other, and the Watch needs to be on a charger. Keep in mind that a charging cable is included with the Watch, but there’s no power adapter. You might want to have a charging block on hand, and potentially the means to keep their iPhone charged, too, if you don’t want to have to keep leaving your holiday celebrations to check on the installation, pairing, and restore process.1

Don’t merely hand a boxed Apple Watch to your loved one before you walk out the door, or they hop on a plane. Part of your gift is this annoying setup.

Second of all, after watchOS 26.1 was installed, the pairing process froze. I needed to back out of it on my iPhone, complete with a dire warning that my Watch would be reset to factory settings. There was little choice, so I resigned myself to it. This got the iPhone in a state where the Watch app said it was unpairing with my Watch for about 10 minutes. Once that was finished, the iPhone and Watch were able to start the pairing process again, but did not have to redownload and install the latest watchOS. In total, this was an hour and a half of my time.

Third, even though it’s supposed to migrate your data and settings from your old Watch, it doesn’t do that in its entirety.

Reauthorizing credit cards for Apple Pay means taking out each credit card and entering the security code information. Also, as it turns out, you need to make sure the default credit card for Apple Pay doesn’t get changed, and confirm that express transit is set to the correct card. Neither was correct for me when I upgraded from Series 7 to Series 11. I spent a couple of days using another card that was the same color as my default card before I realized it was wrong.

There’s no way to skip some of the helpful onboarding dialogs, even if the person is migrating from a recent Apple Watch. My old watch was running watchOS 26.1, but I still got the whole walkthrough about how the Digital Crown works, and the Workouts app still wanted to explain the “new” Workouts app I had already been using. These are minor annoyances that require no guidance from you, but rest assured that Apple just doesn’t care if someone has already gone through these steps.

I have the Tips app set to never, ever, ever give me tips about anything, and yet that was reverted to helpful pings about how Apple Watches work. If the person you’re gifting an Apple Watch to finds these useful, then that’s fine, but if they don’t, they will really appreciate it if you dig into the Watch app’s notifications tab for them.

After the watch was allegedly ready to go, my Modular watch face was missing all of my complications, and there didn’t seem to be any way to force it to reload them. They did appear when I checked again an hour later, but nothing is reassuring about it. If you notice something is missing, then preemptively tell the person that you’ll leave it on the charger for a little bit and wait for it to finish doing some background tasks. Again, adding to that hour and a half to two hours you might have already spent.

The final thing that will spring up on their new Apple Watch are permissions authorizations. Those are not restored from the old watch, and they don’t happen during the setup process. They reveal themselves only if something is invoked that requires those permissions.

For example, when I got into the car and used Siri to pull up directions in CarPlay, my wrist buzzed that Maps wanted access to my location information. It was not safe for me to fiddle with my wrist watch while I was driving on the freeway, so I didn’t get any of the helpful little wrist buzzes for turns. It’s not a huge deal, but maybe just pop Maps open for them before they go out into the world.

Remember that as a technology enthusiast, your gift giving is not the money you spend on the gift, or physically wrapping and handing them a box, but in supporting them to actually enjoy their present instead of being frustrated by some of the technical hiccups. If you’re not ready to go through with helping them set up Apple products, maybe get them some pears from Harry & David instead?


  1. Or maybe you do want an excuse to leave your holiday celebrations. Your secret’s safe with me. 

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


By Joe Rosensteel

Pick a car, any car: Adventures with CarPlay

A photo of the center screen and air conditioning stack in a Chevy Trax. The screen displays the CarPlay maps interface. The other side of the windshield is the busy rental return center and awnings.

A lot of car stuff happened this month, as we needed to go from a one-car household to a two-car one. We bought a new car, but it wasn’t going to be delivered in time, so I’d need to rent a couple of cars in the meantime.

The thing that made the rentals tolerable was knowing I could rely on Apple CarPlay.

CarPlay is one of the best pieces of software Apple has ever made. It’s a little magic trick where a car’s infotainment system gets a projection of a virtual display generated by your iPhone with all of your audio and navigation apps filled with your data. You don’t download music files to the car or sync location data; it’s just instantly available to you.

It also lives separately from the software your vehicle needs to function safely on the road. The partition of what’s the automaker’s responsibility and what’s Apple’s responsibility is crystal clear because the software from the automaker looks and behaves differently. That’s a feature, not a bug.

When I picked up the 2025 Nissan Altima that smelled like hamburger grease, I was relieved that I didn’t need to use any of its much-older-than-2025 software stack for navigation or media. If anything, that 2018-era display became a window to the best of present-day technology.

Chevy Trax CarPlay screen is wide
The wide CarPlay screen in the Chevy Trax was fun.

The 2026 Chevrolet Trax hasn’t fallen victim to GM CEO Mary Barra’s long-term, anti-CarPlay plans. I couldn’t get wireless Apple CarPlay to work (Mary, is that you?), but the USB cable did just fine, and the screen was more than decent. The way CarPlay reflows and expands to fill a larger screen has greatly improved over the years. iOS 26 has a few issues with button edges getting trimmed by their container, but it generally makes good use of the space.

At no point did I need to create an account with each automaker for each rental car, or log in with credentials for other services. I didn’t need to use Bluetooth for rudimentary media playback for unsupported apps. I didn’t need to read addresses off my iPhone and manually type them into the car’s navigation system.

I did really need those services, too, as I was commuting to an office three days a week and had no idea what traffic patterns would coagulate in the roadways of the Los Angeles metropolitan area. I needed routing, and importantly, a routing system that I knew the ins and outs of. I wasn’t going to learn the quirks and features of software that was only temporarily in my possession.

I also needed a voice assistant—one that is absolutely terrible, but absolutely terrible in a predictable way for the limited kinds of requests I make while driving, so I wasn’t relearning which commands an unfamiliar assistant needed. Sharing my ETA to time dinner, or figuring out if I needed to stop on the way home. Not fiddling with some other voice-to-text system that needs to sync my contacts.

Why can’t we all get along?

Some automakers want to reset the relationship they have with customers for services. They will never be able to match CarPlay for personal choice, or data portability—and especially not for context, like what directions you were just looking at on your iPhone before you got in your car.

On a recent episode of The Verge’s Decoder podcast, Nilay Patel talked to Mary Barra about CarPlay, and she said CarPlay was confusing to customers. Then her Chief Product Officer, Sterling Anderson, cited Steve Jobs as the reason for their move away from CarPlay, and talked about the possibility of federated IDs for logging into your car.

These are not people interested in replacing CarPlay with a better solution for motorists, just a better solution for GM.

Oddly enough, there’s a rumor that Tesla might add CarPlay support. Men would rather add CarPlay support than go to therapy.

I don’t begrudge Tesla adding CarPlay support merely because I don’t like the company, and especially its CEO. The whole point is that every automaker should have it, so the power and personalization are in the consumer’s hands.1

Ultra, shmultra

That same empowerment of consumers doesn’t extend to CarPlay Ultra. With CarPlay Ultra, Apple misunderstands the balance of the three-way relationship between automakers, consumers, and itself.

People might have forgotten it, but there were a few years where Apple marketed this as “next generation CarPlay” and stalled out development on regular CarPlay. It was the Apple /// successor to the Apple ][’s CarPlay.2

CarPlay Ultra is a priority for some Apple executives who want their car’s interface to look a certain way. It doesn’t extend to real control of the vehicle. No portability of car settings in multi-driver households (that’s for the automaker’s profile selection screen), or integration with assistive driving tech, etc. Ultra is about making the climate and volume sliders look like Control Center sliders.

Apple’s continued efforts at improving CarPlay in iOS 26 instead of letting it go stale in some effort to push Ultra adoption are a huge relief.

A screenshot of the CarPlay widget screen in iOS 26 showing three widgets in a row. From left to right: Overcast, Home with smart switches, and a photo of a grumpy sea turtle.
Other than the Overcast widget, I haven’t really found much utility with widgets yet.

We’ve got widgets now! I haven’t really found any personal utility in them so far, except the Overcast widget that lets me more easily resume playback of my most recently listened to episode. Still, it’s great that it’s there, and with 26.2 you’ll apparently be able to squeeze in another column of them on certain screens.

It’s been a long road

CarPlay is such a boon that we take for granted. Any attempts to veer further into Apple’s control, or swerve back to automakers, ruin that. Staying between the lines is pretty key to CarPlay’s success.

The ability to literally get up and go with any car can’t be overstated. Whether that’s my month of rental cars or the reliable, everyday vehicle that someone’s been using for years, it’s worth reflecting on how CarPlay has helped reduce friction in our lives.


  1. If the people with “I bought this before Elon was crazy” bumper stickers also get to use CarPlay as an update, and it makes it easier for them to transition to another CarPlay vehicle the next time they buy a car, then that’s an unintended side effect I’m totally fine with. 
  2. If you’re old enough to get this reference, make sure your eye prescription is up to date before renewing your driver’s license. 

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


By Joe Rosensteel

Creative neglect: What about the apps in Apple?

A screenshot of the Photomator for Mac splash screen communicating the acquisition to users.

One of the things that I think about from time to time is Apple’s collection of apps. Some are the crown jewels, like Apple’s pro apps, and others help an everyday consumer to tackle their iLife. All are pretty starved for attention and resources, outside of infrequent updates aligned with showing off the native power of Apple Silicon, Apple Intelligence, or demos of platform integration that never quite get all the way there.

Three things really brought this up to the surface for me recently: The neglect of Clips and iMovie, the radio silence regarding Pixelmator/Photomator, and Final Cut Pro being trotted out for demos but not shipping appropriate updates.

Continue reading “Creative neglect: What about the apps in Apple?”…


By Joe Rosensteel

Apple TV+ gets a new, familiar name

Black Apple TV box with a multicolored Apple logo and 'tv' text.
A vibrant new identity.

As a little addendum to the press release announcing when “F1: The Movie” would be available to stream, Apple changed the name of its video streaming service from Apple TV+ to… Apple TV:

Apple TV+ is now simply Apple TV, with a vibrant new identity.

Naturally, the obvious joke that occurred to everyone reading this news was “Apple TV in the Apple TV app on the Apple TV”—or some nearly identical variation on that. How confusing it is that they are all named the same thing!

But they aren’t actually named the same thing. That little black box Apple sells has been named Apple TV 4K for the last eight years. The TV app has a big Apple logo in its icon, but it’s still the TV app, or Apple’s TV app, not “the Apple TV app.”

Of course I wish Apple had never named its service Apple TV+, or any derivation of an existing product, but we’re way past that now. Unless you have a time machine to go back to 20191, there’s no point inventing a completely new name. It’s too late to call it Apple Rainbow or Apple Stream. Everyone already calls it Apple TV. This announcement is just catching up with reality.

This is the company that sells you a MacBook Pro M4 Pro and a MacBook Pro M4 Max. Naming isn’t its strong suit, and this is hardly the most confusing thing it has ever done.

Yes, this is all quite grating for nerds—what a polluted namespace!—but most people are much less focused on these details, and it all blurs together into Apple TV-ness. Sure, you’ll still need to ask clarifying questions about what someone is referring to when it isn’t obvious from context, but it mostly is obvious:

“F1: The Movie” is coming to Apple TV December 12th.

Is that confusing in any way?

There’s certainly room for improvement on coordinating the announcement of this rebrand so that it coincides with the name appearing on Apple platforms. The developer beta that was just released this morning uses the new rainbow gradient Apple TV logo (“a vibrant new identity,” I guess), but most of the interface elements are still labeled “Apple TV+.” Maybe next beta?

At least now, when we’re talking to normal people and they say things like, “I’m watching The Studio on Apple TV,” we won’t have some little gnawing urge to say, “You mean Apple TV+?”

Think of how much more likable we’ll all be after that! What a real plus!


  1. If you do have a time machine please don’t waste it on Apple product names. We need real help. 

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


By Joe Rosensteel

Apple is finally a carmaker

A red toy car balances a large red apple tied with twine on its roof against a white background.
(leonov.o / Shutterstock.com)

I had a thought while I was watching the latest iPhone launch event—other than, “This is what I’m doing with my free time?”—and it’s that iPhones are basically cars. Apple finally did it. The real Project Titan was the iPhones Apple made along the way.

Ben Thompson wrote a piece on Stratechery where he was inspired by the “sugar water” Steve Jobs story, but I feel like we’re in beep-beep, vroom-vroom town for sure.

Every fall, we get a new model, and there’s always a debate about whether it’s a significant update or a new body style. Even though the iPhone 17 Pro is a totally new design, it maintains an overall appearance that’s evolutionary. There are also certain invisible updates in materials and processes that engineers are proud of but that aren’t always apparent to buyers—the ceramic coating is bonded on at the factory, and so on.

They’re still big changes, and Apple’s marketing team uses a lot of words for them, but it’s like expecting the general public to watch unveilings to get info on breakthroughs in adaptive dampers.

On a less savory note, Apple is also like an automaker in that we have big debates about how and where its products are manufactured—where the jobs are, who can take credit for them, and what the economic investment is. These conversations have recently involved people in very high places, with Tim Cook talking about all the good Apple does for America. It brings to mind the famous misquote of Charles Wilson:

Senator Hendrickson: Well now, I am interested to know whether if a situation did arise where you had to make a decision which was extremely adverse to the interests of your stock and General Motors Corp. or any of these other companies, or extremely adverse to the company, in the interests of the United States Government, could you make that decision?

Mr. Wilson: Yes, sir; I could. I cannot conceive of one because, for years, I thought what was good for our country was good for General Motors, and vice versa. The difference did not exist. Our company is too big. It goes with the welfare of the country. Our contribution to the Nation is quite considerable.

Apple even gets into anti-union shenanigans and regulatory capture, just like a real car company!

It even has a whole suite of financial services to help lower the barrier to iPhone sales, like the financial incentives offered for cars. With the iPhone Upgrade Program, you’re basically leasing.

Let’s look at what’s on the showroom floor:

  • iPhone 16e is a decontented older mid-size SUV. Its backup camera is a lot worse. It offers a lower starting price and can be used to talk people into considering spending a little more for the iPhone 17.

  • iPhone 17 is a mid-size SUV. You can haul your stuff and use it for work or personal needs. No one’s going to think negatively about you owning one.

  • iPhone Air is a luxury compact electric car. It doesn’t have great range, and isn’t as practical, but it’s smart-looking and well-appointed. These buyers want to stand out a little.

  • iPhone 17 Pro is the luxury full-size SUV. It can do anything, but it’s expensive and huge. Almost no one will use all the features it offers, but it’s comforting to know they’re there in case they suddenly need them.

Naturally, this means the general public treats smartphone buying like they treat car buying. Some people want the newest, fastest ones. Some people think the old, smaller ones were better. People will buy specific models just because they come in a certain color. Others don’t care about any of the details as long as it has cargo space and isn’t “too expensive.” Maybe the kid will get the old one because they’re going off to college.

I’m not saying that makes the iPhone bad, or that a lack of excitement is a sign of failure. It’s simply a mature commodity. Just as with cars, there will always be people who want the latest and the greatest. Either because they want the best, or they want the status.

Back in 2007–2011, the iPhone was breaking new ground in capability—in what was functionally possible with software that worked on a global, mobile internet. Today, the smartphone has been integrated into all of our lives as a commonplace thing. We live in a market saturated with older devices that are nearly as capable as the new ones, even if they are objectively worse.

Apple might engineer the best, most efficient, most refined updates to the iPhone hardware, but there’s no functional difference that explains to buyers how Apple is revolutionizing what they’re capable of doing.

Both Ben Thompson and Jason Snell highlight Apple’s inability to produce compelling AI software that can enhance what customers can do with new iPhones. I largely agree—it doesn’t have to be LLM chatbots, but Apple hasn’t offered any alternative ideas for improving everyday quality of life.

Apple could have put 512 neural cores in the A19 chip, and it wouldn’t really matter, because Apple can’t tell us how buying a phone with that chip would improve our lives other than generic benchmarking. They have the fastest lap time around the Nürburgring, but no one has a Nürburgring at home.

I hope people enjoy their current cars/iPhones if they choose to buy or lease one. And if you don’t choose to get one, I wouldn’t feel too bad about not being overwhelmed with excitement. The next time you need one, you’ll get one, and it’ll be the culmination of all the best tech you missed. Think of all the tech that’s trickled down in both cars and phones over the years. Just like that.

If you’re pumped, and you have your preorder up, and your Apple stickers on your rear windshield, then good for you. You’ll get the best car Apple has made yet.

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


By Joe Rosensteel

A better camera app? Reflections on Adobe’s Project Indigo

Smartphone screen displaying a photo of ducks on a grassy island in water. Zoom options (0.5x to 10x) and settings (Photo, Night, RAW + JPEG) visible on the sides. A histogram and exposure settings are also shown.

I appreciate what Adobe is doing with Project Indigo. It’s a free iOS camera app, but it’s heavily disclaimed as experimental, with unique features you can’t find in other apps. Adobe also says it’s targeting “casual” photographers, which seems misguided.

A few people I know have even been evangelizing Project Indigo because they love it so much, especially when they compare it to photos from Apple’s Camera app. My enthusiasm for this product doesn’t match their own. It’s neat but it’s not great.

It isn’t all-purpose (it can only take still photos), and it can’t do panoramas or portrait mode. It doesn’t have the compressed storage of the editable HEIC files Apple introduced with the iPhone 16 Pro, or the new photographic styles pipeline that lets a user control tone mapping and certain processing, both before the photo is taken and after the fact.

There are still a few noteworthy tricks it pulls off that are worth a look.

Continue reading “A better camera app? Reflections on Adobe’s Project Indigo”…


By Joe Rosensteel

Michelin mess: How Apple Maps fumbles location details

Screenshot of three restaurant listings: Osteria Mozza, Chi Spacca, and Pizzeria Mozza. Each shows a map, contact info, images, and descriptions. Highlights include 'Food & Drink' and 'Atmosphere' sections, with ratings of 4.8.

Apple Maps navigation might be on par with Google’s these days, but Apple’s location data is not. Google offers broad coverage for many points of interest, while Apple’s data has mostly relied on knitting together bits from competing business partners. This attempts to mimic Google’s comprehensive coverage without Apple having to do the foundational work itself.

Apple recently announced it would integrate data from Michelin Guide (prestigious/exacting), The Infatuation (trendy/young), and Golf Digest (retirees/executives/awful world leaders). While the initial partnerships seemed shrewd for bootstrapping Maps data, Apple now appears content to make the entire platform out of bootstraps.

This approach layers on top of existing partners like Yelp, OpenTable, TripAdvisor, and Foursquare, not to mention numerous international partners. Let’s focus on restaurants, the heart of the Michelin Guide.

Guides by Michelin Guide, not Michelin guides

Michelin integrations remain limited to the U.S., three months post-announcement. (Sorry, France!) However, I don’t think anyone is truly missing out, as the Michelin integration offers very limited value.

Curiously, there’s no “Michelin Guide” within Apple Maps’s Guides feature. Instead, some cities feature Apple Maps Guides created by Michelin Guide to highlight specific restaurants. For instance, “Best Korean BBQ in Los Angeles” spotlights Korean BBQ restaurants, but only one has a Michelin rating, and its specific rating isn’t indicated within the Apple Maps Guide interface. You must visit each linked location to find out. Why are the rest unrated? It’s a mystery.

To find all Michelin-rated restaurants in Los Angeles, users must search for “restaurants in Los Angeles” and manually toggle filters for every type of Michelin Guide rating in the Maps view.

Everyone gets a rating system

What do these ratings tell us? Consider three Nancy Silverton restaurants operating from the same building:

  • Osteria Mozza — Michelin-rated with one star and a green star. It has a 4.8 on OpenTable.
  • Pizzeria Mozza — This casual pizzeria has a Bib Gourmand. Its Michelin “about” section focuses more on the proprietors than noteworthy dishes, and it’s rated 4.7 by OpenTable.
  • Chi Spacca — It has a Green Star1 from Michelin and a 4.8 on OpenTable.

It’s striking that Apple has captured not a single Apple Maps rating for these three notable restaurants, nor for the vast majority of Michelin-rated restaurants in LA.

Apple Maps never presents all available ratings and reviews for a location. Yelp and TripAdvisor have lower ratings for these restaurants, but because these three use OpenTable for reservations, only OpenTable ratings and reviews are shown, which consistently trend higher.2

Consider two other nearby restaurants with differing review systems in Apple Maps:

  • Jon & Vinny’s — The original Fairfax pizza and pasta location somehow has a Bib Gourmand and a 3.5 on Yelp. How does one compare this to Pizzeria Mozza (Bib Gourmand, 4.7 on OpenTable)?
  • Gucci Osteria da Massimo Bottura — This Beverly Hills restaurant has a Michelin star, a gushing Michelin “about” section, and a lower-than-expected 4.0 on Yelp. How do you choose between this and Osteria Mozza if you’re picking a Michelin-starred Osteria in LA?
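One reason those numbers can’t be lined up side by side is that each service’s scale has a different effective range: OpenTable scores cluster near the top (see footnote 2), while Yelp uses much more of its five stars. As a purely hypothetical illustration — not anything Apple or these services actually do — a min–max rescale against the range a service really hands out shows how differently the same-looking numbers behave:

```python
def rescale(rating, observed_min, observed_max):
    """Map a rating onto 0-1 using the range a service actually hands out."""
    return (rating - observed_min) / (observed_max - observed_min)

# Assumed observed ranges: OpenTable rarely dips below 4.0 in LA (footnote 2),
# while Yelp ratings span most of its 1-5 scale.
opentable = rescale(4.7, observed_min=4.0, observed_max=5.0)  # Pizzeria Mozza, ~0.70
yelp = rescale(3.5, observed_min=1.0, observed_max=5.0)       # Jon & Vinny's, 0.625
```

On the raw numbers, 4.7 looks far above 3.5; adjusted for each service’s real range, the gap narrows considerably. The observed ranges here are assumptions for the sake of the sketch, which is exactly the problem: Apple presents the raw figures as if no such adjustment were needed.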

Searching Maps for “restaurants” and selecting “Top Rated” reveals no clear pattern for what “Top Rated” signifies. It pulls from a patchwork of rating systems and random Apple Maps Guides. Some listings only have Apple Maps ratings and no other reviews, even when Yelp or TripAdvisor data exists. An “Overall” score of 84%3, for example, is supposedly enough to deem a place “Top Rated.” Not in any school I ever attended, but sure.

How will this hodgepodge improve when The Infatuation’s 0-10 ratings data is added? What new assortment of metrics will define “Top Rated” then?
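The Planta numbers in the footnotes make the “Overall” oddity concrete. A quick sanity check — just arithmetic on the figures Apple displays — shows that “Overall” isn’t any kind of average of the category scores:

```python
# Planta's Apple Maps scores (per footnote 3): (percent, number of ratings)
subscores = {
    "Food & Drink":     (95, 22),
    "Customer Service": (90, 21),
    "Atmosphere":       (100, 22),
}

# A naive rating-weighted average of the three category scores...
total_ratings = sum(n for _, n in subscores.values())
weighted = sum(pct * n for pct, n in subscores.values()) / total_ratings
print(round(weighted, 1))  # prints 95.1

# ...lands nowhere near the independent "Overall" score Apple shows.
overall = 84
```

Whatever “Overall” measures, it sits more than ten points below any blend of the categories it appears alongside, and Apple offers no explanation of the relationship.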

The menu problem

Deciding on a restaurant often hinges on its menu, yet this isn’t a primary consideration in Maps. It’s almost always hidden behind a “More…” button. Apple frequently relies on third parties for menus, often displaying “Menu” with a Yelp icon, rather than linking directly to the restaurant’s website menu.

Apple is indexing the whole internet, but can’t index restaurant menus? Many restaurants fail to keep their online menus updated, but that’s no excuse not to make the effort. This failure to index menu information also explains why searching for specific dishes or cuisines in Apple Maps is ineffective. A search for “khao soi” in LA yields only “Khao Soi Thai,” not the many restaurants offering the dish. “Khao soi noodle” incorrectly suggests places like Lan Noodle, which doesn’t serve it.
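What the khao soi example calls for is nothing exotic — a basic inverted index from dish names to the restaurants whose menus mention them. A minimal sketch, with entirely made-up menu data (only “Khao Soi Thai” and “Lan Noodle” come from the text above; “Northern Thai Kitchen” is hypothetical):

```python
from collections import defaultdict

def build_dish_index(menus):
    """Map each dish name to the set of restaurants whose menus mention it."""
    index = defaultdict(set)
    for restaurant, dishes in menus.items():
        for dish in dishes:
            index[dish.lower()].add(restaurant)
    return index

# Hypothetical menu data -- the point is that indexed menus can answer dish queries.
menus = {
    "Khao Soi Thai": ["Khao Soi", "Pad Thai"],
    "Northern Thai Kitchen": ["Khao Soi", "Sai Ua"],
    "Lan Noodle": ["Beef Noodle Soup"],
}

index = build_dish_index(menus)
index["khao soi"]  # both restaurants that serve it; Lan Noodle correctly excluded
</antml_ignore>```

Matching a name in the business listing is all Apple does today; matching against menu text is what makes the query actually work.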

Yelp, an Apple partner, handles this search better within its own app, as does Google Maps, both displaying the expected Thai restaurants.

Perhaps Google excels because it indexes these menus. A Google Maps restaurant listing prominently features “Menu” after “Overview,” linking to the restaurant website and displaying user-submitted photos of physical menus with dates. It’s imperfect, but it’s an active attempt to solve the problem, not bury or outsource it.

The camera eats first

Photos convey much about a restaurant and its dishes. Apple Maps offers photos, but often without dates or captions, sourced from various third parties, including Michelin Guide, and potentially dating back to Foursquare’s early days. Yelp sometimes requires a deep link to its app to view photos.

There’s no way to filter photos by source service (to exclude those requiring an account) or to show only Apple Maps user submissions. While Apple Maps allows user photo uploads, users can’t caption or categorize images, and there’s no alt text for accessibility. While privacy-preserving in allowing anonymous uploads, the utility is questionable, especially the setting allowing Apple to share uploaded photos with partners.4

Google Maps, however, provides photo captions and can even surface photos of frequently mentioned or photographed menu items.

Reviews you can use

Apple Maps’s scoring system, which presents results as percentages, offers little actionable information. What does 87% for Food & Drink mean? We need words to make sense of “thumbs up” or “thumbs down” feedback.

While user reviews can invite attention-seeking behavior (thanks for reading my column), they contain valuable data best shared in writing, not as a binary metric. Aggregated reviews are particularly useful, and AI summaries, despite their flaws, can highlight the frequency of compliments or complaints.

Ironically, Apple uses AI summaries for App Store reviews (where grievances are common) but not for restaurants, where people are often motivated to rave about a great meal.

I have reservations

Apple’s reliance on OpenTable for reservations overlooks other services. Sometimes Yelp is offered, or Yelp’s wait lists appear under “More….”

The Michelin integration now allows reservations via Michelin, but this isn’t native; it routes you to Michelin’s site, listing services like Resy—which Apple doesn’t partner with, despite my having its app.

Google Maps’s “Reserve” button either offers an app/service picker or a seamless in-Maps reservation process regardless of service. Apple should emulate this, avoiding routing users through Michelin Guide for a service-agnostic reservation list, especially when Michelin Guide covers only a fraction of restaurants.

Thumbs down

I could continue, and certainly for other location types beyond restaurants, but I believe I’ve made my point: every new Apple Maps partner merely adds another incomplete layer of data. The underlying problems persist because Apple relies on these external sources rather than genuinely investing in its own internal ratings, reviews, or photo capabilities. As such, the data remains largely unhelpful.

The Michelin Guide, The Infatuation, or any “expert” source will never cover every restaurant in every city. As for user reviews, OpenTable users are limited by its business model, and both Yelp and TripAdvisor offer more features and consistency in their own apps than Apple provides in Maps.

Google Maps, however, offers a one-stop shop for ratings, reviews, menus, reservations, and even real-time busyness data, all directly comparable to surrounding places and usable worldwide.

It would be beneficial for Apple to expand its first-party data and incentivize users and business owners to contribute fresh, relevant information to points of interest with the same volume and frequency as other platforms. There are so many iPhone users writing reviews, and submitting photos, but they’re not doing it in Apple Maps. Why not incentivize those users to post that data directly at this point, instead of piecing it together from business arrangements made with different providers that do collect the data?


  1. Apple Maps doesn’t provide any information on deciphering the Michelin rating system for the uninitiated. People know stars are good, and more stars is more good, but they absolutely don’t know why the tire guy is licking his lips, or why there’s a four leaf clover. You can’t tap on them for an explanation. For your own edification, the Green Star is awarded to restaurants that are “role models” for sustainable practices and can be awarded to any Michelin-rated establishment, but shouldn’t be read as an additional star. None of that data maps to any other non-rated restaurant you will look at in Maps. 
  2. I have only been able to find a single restaurant in all of Los Angeles with an OpenTable rating under 4.0, and it’s a single location of Red Lobster, which has a 3.7. Partially this is the fault of rating systems treating 5/5 as “expectations were met,” but OpenTable is also biased towards the high end more than Yelp or TripAdvisor, which also use 5-star scales. None of these 5-star systems can be directly compared, even though Apple places them in the interface as if they were interchangeable peers. 
  3. Overall doesn’t mean that it’s an average of the other three scores. Overall is an independent evaluation, so Planta (a vegan restaurant in LA) has an overall score of 84% from 37 ratings, 95% for Food & Drink from 22 ratings, 90% for Customer Service from 21 ratings, and 100% for Atmosphere from 22 ratings. Mathematically, this is a perfect system for “Top Rated”. 
  4. There is a toggle in Maps settings for: “Allow companies that provide photos to Maps to use the photos that you add to Maps in their own products and services. Photos include their locations but not your identity. If you turn this off, photo providers may no longer use your photos, but this may take a few days to apply.” Seems cool and fun. 

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


By Joe Rosensteel

tvOS 26 brings minor additions and weird priorities

A screenshot of the TV app in tvOS 26 beta 1 showing a splash screen image for WWDC 2025. The text overlay on the image refers to it as a 'Movie - Special Interest'
They should have given it a theatrical run.

Apple has largely tied major revisions of tvOS to the launch of new Apple TV hardware over the years. Since the introduction of Apple TV+, WWDC’s tvOS “features” have largely focused on showcasing sizzle reels of Apple TV+ shows, and very little about tvOS itself. This WWDC gave us a trickle of announcements that don’t seem to align with what I would consider to be the rough spots in the tvOS user experience.

It is possible that Apple is holding back meaningful revisions until they launch an updated Apple TV box this fall. Maybe they’ll even mention the 10th anniversary of tvOS itself, which was unveiled in September of 2015 at the iPhone 6S launch event. Until then, I guess we should reflect on what’s announced, instead of wish lists of what could be.

Through a glass, darkly

I’m not going to rip into the design in beta 1. It’s mostly a conservative evolution of what came before, but with highlights on edges. However, Apple has really underscored a very specific part of the interface as working as intended, and I will push back on that.

Apple has two kinds of Liquid Glass (Regular and Clear) and Clear is supposed to be used over rich media, like video. The only things that define the existence of the controls are the highlights and brighter/blurry refractions visible through the clear elements.

Well, gee whiz, aren’t clear glass playback controls going to be difficult to see over video, especially when it’s playing through the controls?

To make the controls easier to discern, Apple applies a dimming layer on everything around the controls, but not on the video visible through the controls. It’s like someone stenciled out aftermarket window tinting.

Apple says this is on purpose in its Meet Liquid Glass WWDC video, when demonstrating playback controls on iOS. In its Newsroom post for tvOS, it says: “tvOS 26 is designed to keep the focus on what’s playing so users never miss a moment.”

This is bananas. How is this getting out of the way of the content? You can barely discern the playback timeline and playhead while motion is occurring through the element, which causes it to pulse in a thin strip. What is being achieved here? The playback controls and timeline should be flat. No one is going to feel sad that there’s no glass effect in this one spot, where it serves no practical or artistic purpose other than being a wicked smart shader demo.

Poster through it

Another notable change in the interface is the pivot from horizontal thumbnails to portrait-orientation posters. Apple says that this means more tiles can fit on the screen, but that’s only more tiles visible in one row, and it’s only one additional tile over the smallest scale thumbnails (6 posters instead of 5 thumbnails). The older design had thumbnails that matched the aspect ratio of the TV in various sizes, so you’d get more rows, with fewer titles visible in each row.

To compensate for this difference in aspect ratio, the text that was below or next to the thumbnails is now on top of them. I’ll let readers debate which is more legible, and whether or not the text is always helpful.

tvOS 18.5 (left) versus tvOS 26 beta (right).

This decision pushes content downward. If you want to see what kind of category you’re in the mood for, you will do more scrolling down, which means it will take you longer to count the number of times the TV app recommends you watch “Stick.” Unless you really want to flip through one particular row of the interface one title faster, it’s not really an improvement.

Used any good profiles lately?

I’m unclear about the continued push by Apple to get developers to adopt Apple’s user profile system. It really doesn’t provide any benefit to the developers of these large streaming services that need to have their own multi-platform profile systems with personalized content recommendations, and it doesn’t provide substantial benefit to households with shared viewing.

A screenshot of the tvOS profile selection screen. It shows a user profile for Joe and a user profile for Jason with a semi-transparent '+' over the corner of Jason, and another '+' next to that. At the bottom of the screen the 'Don't show this screen again' button is highlighted.
Someone had the forethought to include this button in beta 1.

I have no animosity towards user profile improvements whatsoever, and I do appreciate that on your first boot of tvOS 26 you can say you never want to see the profile switcher. However, system-level user profiles just don’t feel like the area of the TV viewing experience that needs this much attention when compared to other aspects.

If I were being generous, I could hypothesize that this emphasis on user profiles is because there will be some genuine effort put into personalizing the TV app based on the active user profile.

Unfortunately, you still can’t express any kind of preference in “personalized” areas of the interface to mark a recommended show as watched (without first adding each title to your Watch List and then marking it there) nor can you express that you have no interest in a title.

Even if increased personalization is on the horizon, there’s no reason to expect that to work as well as the personalization offered in each streaming app’s own recommendation systems. Such a thing requires developer participation and cooperation with Apple.

Speaking of developer participation…

Just keep adding single sign-ons until one of them works

The 10th anniversary of Single Sign-On is next year, so we’ll be celebrating this latest attempt a little early. That first attempt used a convoluted system that recognized your cable provider and authenticated the individual apps that worked with your existing subscriptions, so you wouldn’t have to sign in to each one. Just 18 months later Apple announced zero sign-on, where if you were on a qualifying provider’s internet network, the apps would authenticate on their own.

It’s safe to say that these systems almost immediately became obsolete because they were centered on a business relationship between customers and service providers that was in quick decline. Apple’s blind spot here was believing that anything not subscribed to via a cable provider would be subscribed to via Apple. Due to Apple’s App Store policies on subscriptions, many streamers have left the App Store behind. That means people have to do little sign-on dances that make using Apple products as frustrating as cheap streaming hardware.

Instead of repairing its relationships with streamers, it’s providing this very latest sign-on feature, which links accounts via your Apple Account email address… but requires streamers to want to implement it. I hope they do, and I hope it works to make everyone happier.

Sing out loud, sing out long

I find myself scratching my head at the announcement regarding using iPhones as microphones to do Apple TV-mediated karaoke.

Look, this feature won’t hurt me, or cause harm to the world—with the possible exception of those within earshot—but it’s such a niche thing to do. I have to imagine that someone took a look at the collection of technologies that Apple had built and realized they could put them together, you know, for fun!

I hope people who use this feature do have fun. But it’s a strangely specific thing to use as a selling point, when there are other use cases for the Apple TV, such as watching television, that might be better places to focus.

Give me more

I want tvOS to improve, and am frustrated when another WWDC comes along and the changes are as minor as they were this year. I hold out some hope that there’s more to announce, and it’s being held back for a new Apple TV hardware announcement. But for now, we’ve got tvOS 26… and it cuts down on information density and makes timelines see-through.

tvOS needs to sort out the dichotomy between the home screen and the TV app. The current TV app is a mess and needs to be upgraded to support features that Apple has never taken a single pass at, like a universal live guide. I don’t expect them to be perfect, but it would be nice if we could see that Apple is making an effort. Change is long overdue for a platform that many take for granted. Apple needs to try harder at the TV part of tvOS.

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


By Joe Rosensteel

The new Spotlight for macOS 26 shows a path forward

On Thursday there was a Six Colors Zoom call for Backstage-level members and contributors alike. Glenn Fleishman asked Jason Snell and Dan Moren about Spotlight. He wondered about the discoverability and the intuitiveness of some of these features. Jason mentioned that Apple views the features as power user features that don’t get in the way if you don’t know what they are. Dan said it would still be nice to have documentation of what all the features were, because it was difficult to know exactly what all the command functions are otherwise.

I piped in with my view that the real missing piece is natural language processing so people aren’t trying to discover commands or read documentation. We still need those other things, but to make this truly accessible we can’t expect everyone to memorize all the Quick Keys.

In March I wrote an opinion piece for Six Colors lamenting how text-to-Siri pales in comparison to typing a web search into your browser. I also compared text-to-Siri to Spotlight which handles searching better, but can’t process natural language requests. What I wrote in March is much broader in scope and encompasses requests like product knowledge.

Apple still isn’t doing any of that right now, but with App Intents and Quick Keys in Spotlight it’s creating the explicit command syntax that could be fed by something interpreting a natural language request.

Think of it like this: this year they’re writing grep, sed, pine, ffmpeg, etc. for Spotlight. A common issue for people is not knowing how to structure commands and turning to the web, and LLMs, to copy and paste arguments and flags for those powerful tools. They’re more accessible when people don’t have to figure out the flags and arguments themselves, but the explicit commands you pass them are still the foundation for what’s doing the actual file operations.

Jason said on the call that he thinks that this missing puzzle piece might be as early as next year, and that it seems like the next logical step. It certainly seems more achievable with a foundation like this laid.

Hopefully bozos like me aren’t writing blog posts in two years asking where it is while we ask LLMs to compose our Spotlight queries for us. I’m thinking positive thoughts, though.

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


By Joe Rosensteel

To improve CarPlay Ultra, Apple needs to fix CarPlay

Here’s the problem with CarPlay Ultra: It’s still CarPlay.

Based on what we’ve seen of CarPlay Ultra, Apple believes that if it controls the appearance of the displays in cars, then using the car will be a good experience. I’m not sure that’s an assumption I’d make, especially when styling isn’t directly connected to function—as is the case with most of what distinguishes CarPlay Ultra from CarPlay.

A screengrab from the Top Gear video on Aston Martin's implementation of CarPlay. The camera shows us the center console displaying vehicle settings on the short, but wide screen. Labels for the settings are aligned to the left side of the screen, and the current setting is aligned to the right side of the screen with a large gap between them. Also, everything is gray on gray with narrow text.
The real hallmark of Apple is a bad settings screen. (Image: Top Gear.)

There’s so much more Apple needs to do with CarPlay, fixes that would also benefit CarPlay Ultra. I use CarPlay all the time, and there are plenty of issues that don’t seem to be on Apple’s roadmap. If Apple improves CarPlay, it also improves CarPlay Ultra. That being said, here are some of my biggest outstanding issues with CarPlay today.

At the center of things

Whether you’re driving a fancy car with CarPlay Ultra or you’ve just got basic CarPlay, the interface on your vehicle’s central touchscreen is the main stage. In early CarPlay Ultra demos, that very familiar CarPlay interface is still front and center.

The entire approach to notifications needs to be rethought. When a new notification appears, it displays for a second and then fades away. If you’re busy driving the car and, you know, paying attention to the road, you won’t know that you have missed the text message that your friend is running late or has canceled. If Messages is not in the dock, there is no visible badge, and it’s not added to the dock based on incoming notifications, but rather on when you last used it.

A glanceable, non-distracting indicator that there are active notifications that need attention would be nice. Perhaps Apple could even use some of that vaunted Apple Intelligence to detect what sorts of messages were a priority in the context of driving a car.

When notifications appear, they also float above existing tap targets in the interface. If I am parked and trying to select my dentist’s office in Apple Maps, a calendar alert reminding me to go to the dentist will appear and block me from completing my task. CarPlay Ultra adds even more new overlays, like vehicle warnings and climate controls. I don’t know what the answer is—push down the screen? have a dedicated area of the screen for warnings?—but it’s a problem in need of a solution.

Organizing the apps displayed on CarPlay could also be improved. Right now, this is accomplished by using the Settings app to reorder the list of apps on a per-vehicle basis, but the vertical list offered in Settings doesn’t match how those items are displayed in their icon grid in the car! Since the settings are per-vehicle, Apple knows the exact dimensions of the screen, so it knows how many rows and columns there are, and where the page breaks will be. It should also be easier to sync these layouts across devices. I’m not a current Apple Music subscriber, but it’s the second app in any default CarPlay homescreen, and there’s nothing I can do to prevent that from appearing in every rental car I connect to my phone.
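The list-versus-grid mismatch is a solvable presentation problem: since Apple knows the screen’s column count per vehicle, the Settings list could be previewed as the grid the car will actually show. A minimal sketch of that mapping (app names and the four-column width are illustrative assumptions):

```python
def grid_preview(apps, columns):
    """Chunk the flat Settings list into the rows the car's screen will show."""
    return [apps[i:i + columns] for i in range(0, len(apps), columns)]

# Hypothetical per-vehicle layout: 7 apps on a 4-column screen.
apps = ["Maps", "Music", "Messages", "Podcasts", "Audiobooks", "Phone", "Settings"]
grid_preview(apps, columns=4)
# [['Maps', 'Music', 'Messages', 'Podcasts'], ['Audiobooks', 'Phone', 'Settings']]
```

With the column count known per vehicle, Settings could render exactly this grid — rows, page breaks and all — instead of a vertical list that bears no resemblance to what the driver sees.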

Connectivity quirks

I had a Honda with CarPlay, and my boyfriend and I currently share an Audi with CarPlay. Even though both use wired connections, both periodically flake out. We’ve rented numerous cars with both wired and wireless CarPlay when traveling, and there has been no consistency in connectivity in any of these vehicles. The wireless version in one Chevy car had unacceptable lag that made the screen unusable, requiring a wired connection. In a recent Toyota rental, the wired connection didn’t work, but the wireless connection was rock solid.

There’s no quality guarantee from Apple or automakers about how well CarPlay will work with any given car, but I’ve built a mental list of which cars seem to work better than others through trial and error. That list informs my animus toward certain makes and models that can persist even if the CarPlay experience has improved, because there’s no rating system or seal of approval. I’m not sure what Apple can do here, but some sort of CarPlay certification process might allow Apple to inform automakers about choices that lead to unreliable connectivity and unhappy customers.

Apple also needs to improve its attention to detail when it comes to CarPlay: it recently broke CarPlay connectivity for some people with the release of iOS 18.4, and it took two weeks for Apple to ship a fix in 18.4.1.

CarPlay Ultra disconnects won’t affect the instruments and essential functions of the car because they’re rendered locally by the vehicle. I have no safety concerns about dropped connections. However, we haven’t seen how gracefully the phone-generated part of the non-essential interface degrades when there are connection issues. I don’t believe Apple wants to be the one to show people anything less than ideal function, even if we all know that’s not realistic.

Regardless of what the connection failure states are: If Apple pushes out a buggy iOS release again, will people drive their CarPlay Ultra cars around with only essential, locally-rendered instruments for two weeks, or revert to their car’s interface and be hesitant to go back?

Talking to Siri

Ideally, when you’re driving, you’re not fiddling with touchscreens, but talking to Siri and keeping most of your attention on the road. I believe it’s one of the reasons Apple marketing VP Bob Borchers said, “This next generation of CarPlay gives drivers a smarter, safer way to use their iPhone in the car.” (Emphasis added.)

CarPlay Ultra isn’t adding or augmenting lane guidance, crash avoidance, or self-driving features, but in theory, it’s safer because you can now tell Siri to turn on the seat warmer.

But we’ve all used Siri. It doesn’t just fail, but can also execute the wrong command with utmost confidence, causing a distraction! With CarPlay Ultra, Siri can now cause a distraction over car functions, not just by playing the wrong music.

There’s also another issue at the crossroads of Siri and connectivity, and that’s what happens when Siri can’t connect to the Internet. I’m sure you’ve all had the pleasure of getting in the car, pulling out of the driveway, and saying, “Hey Siri, give me directions to a place,” only to have it spin or glow and give up. Not only can it not get the directions, but it also eats the command, and you have to say the whole thing over again.

This needs to be smarter. The iPhone should recognize that since it’s just connected to a car, its nearby Wi-Fi connection is likely to disappear, so prioritizing the cellular network might be a smart move. And if there is a temporary connectivity failure, perhaps Siri should hang on to that command and send it again when the connection resumes, or offer to resubmit the request instead of requiring me to do it personally.

(Remind me: I’m a person and my iPhone is a computer. Which one of us should be doing the repetitive tasks, again?)
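The hang-on-and-resubmit behavior suggested above is a textbook retry queue. A sketch of the idea — emphatically not how Siri actually works, just the pattern the complaint implies:

```python
class CommandQueue:
    """Hold onto failed voice commands and replay them when connectivity returns.

    A hypothetical sketch of the suggested behavior, not Siri's implementation.
    """

    def __init__(self, send):
        self.send = send      # callable that returns True on successful delivery
        self.pending = []     # commands that failed and await resubmission

    def submit(self, command):
        if not self.send(command):
            self.pending.append(command)  # keep the command instead of eating it

    def on_connectivity_restored(self):
        retry, self.pending = self.pending, []
        for command in retry:
            self.submit(command)  # resubmit without making the user repeat it
```

The user says “directions to a place” once; if the network is down, the command waits in `pending` and goes out automatically when the connection comes back, rather than vanishing into a spinning glow.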

In the event of a failure, I also never see Siri attempt to use the iPhone’s on-device dictation model to decode my instructions and pass them on to Apple Maps, which has been helpfully preloaded with offline maps.1 Remember to be online when you want to use your offline maps.

When sharing isn’t caring

The car has a volume setting for audio playback and a separate one for navigation audio, but they aren’t per-device, so the different audio settings on my iPhone and my boyfriend’s iPhone result in one of us getting into a very loud or very quiet car, or the navigation audio being too loud for him in Google Maps and too quiet for me in Apple Maps.

This is the lowest level annoyance of all the annoyances, but it’s worth mentioning in light of how it might apply to CarPlay Ultra. To what degree are my settings carried over to my iPhone, including climate, radio, and instrument cluster layout? To what extent does my iPhone simply set those things in the car at the time of my request, and then pick up whatever state the settings are in when my iPhone reconnects later?

If it works the way audio settings do right now, where the settings are just whatever they were when the last person drove the car, then what are we even doing with our smartphones connected to these cars instead of relying on Android Automotive profiles?

It’s even more complicated when both of us are in the car with our individual devices. With wired CarPlay, the phone plugged in is the CarPlay phone. But with wireless CarPlay and multiple phones, it’s a crapshoot—it’s which phone gets in range first, or maybe which one was connected most recently. CarPlay doesn’t offer a switcher if it connects to the wrong phone, or if you just want to switch from one phone to another.

When the locally rendered instrument cluster in CarPlay Ultra boots up before it connects to my iPhone, is it what my boyfriend had the instrument cluster set to? Does it change to mine while I’m using the car, and back to his, or will we be overriding each other each time we connect to the car, as we are currently with volume settings? Are we overriding each other’s climate settings?

I would love to know if CarPlay Ultra offers a more seamless user switching experience, but I’m unsure if it has occurred to Apple that we’re not a two-Aston-Martin household.

Put it in the parking lot

Apple improving CarPlay would help everyone. It would be a better sales pitch for CarPlay Ultra, because “All the same annoyances as before, but across your whole dashboard!” is not a great slogan.

I would never buy another car without CarPlay, because even when it’s flakey, or Siri bumbles something, it’s handling my media and my personalized navigation better than any car can. I can’t say the same thing about CarPlay Ultra, which feels more like applying an iOS-styled WinAmp skin to the speedometer. For CarPlay Ultra to succeed, Apple needs to do more than woo reluctant automakers. It needs the discipline to address the long list of existing CarPlay annoyances. A rising tide lifts all boats. Er, cars. You get what I’m saying.


  1. If you put your iPhone into Airplane Mode and disconnect from Wi-Fi, you can ask it for directions to points of interest stored in your offline maps, and Siri can’t do it. You can open the Maps app and use speech-to-text dictation in the search field to get directions. Shocking, I know. 

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


By Joe Rosensteel

Roku’s winning strategy is ads. What’s Apple’s?

A photo of an Apple TV box with a Roku voice remote on top of it showing the Apple TV+ promotional button.

Last week, Roku held a press event in New York where they unveiled their latest streaming devices, wireless cameras, and minor adjustments to their existing, content-driven interface. If you were hoping for a dramatic update to Roku OS, Lucas Manfredi has the disappointing details over at The Wrap:

The platform introduced a “Coming Soon to Theaters” row and personalized sports highlights. It also launched short-form content rows in the All Things Food and All Things Home destinations for users to easily find smaller curated clips, from recipe tutorials to home organization hacks. It also unveiled badges to help users differentiate between free, paid, new and award-winning content.

If you have used Roku devices or TVs recently, these announcements seem disproportionate to the scale of the event, where Masaharu Morimoto served sushi and puppies were available for adoption.

The hardware devices themselves don’t do anything novel over existing devices to justify this fanfare. That’s not that surprising when you consider that Roku loses money on its hardware. Ars Technica’s Scharon Harding summarizes it well:

For a clearer picture of how critical ads are to Roku’s business, in its fiscal Q4 2024 earnings report shared on February 15, Roku revealed that its devices division lost $80.4 million during the fiscal year. Meanwhile, its platform business, which includes Roku OS and its advertising arm, reported about $1.89 billion in gross profit.

Roku is the number one streaming platform in the U.S. It has been able to place promotions and ads in such a way that they drive consumers to shows and material, and has done so in a way that has mostly only grown its user base and the value of its promotional real estate.

Roku recently tested the limits of its customers by displaying an ad before the customer gets to the Roku interface. Chris Welch from The Verge asked Roku’s ad marketing lead, Jordan Rost, about that mess:

Rost didn’t say as much directly, but it’s apparent that Roku was keenly aware of the bubbling up of complaints. “Advertisers want to be part of a good experience. They don’t want to be interruptive,” he told me.

“We’re always testing. We listen to consumer feedback, we do all of our own A/B testing on the platform. We’re constantly tweaking and trying to figure out what’s going to be helpful for the user experience.”

I never expected the ad guy to say, “Ads suck!”, so this is completely in line with my expectations. Welch also asked him about that notorious patent to inject ads into the streams of non-Roku content. Again, he said nothing shocking:

He said Roku’s own platform is the “primary” focus of its ads strategy. But last month’s misstep isn’t going to stop the bigger plan to keep pushing to make ads more shoppable, interactive, relevant, and “delightful.”

I support Chris’s use of quotation marks around “delightful,” even if he was directly quoting Rost.

Recently, I had occasion to use a Roku 4K+ for a few weeks as my primary TV streamer, and it’s not all terrible ads top to bottom. As much as I might complain about ads, I see why most people don’t. Not because the drooling masses don’t know any better, but because everyone has different thresholds for advertisements and promotions.

The famous Roku City screensaver is actually a good metaphor for this. There’s a car driving through a city where there are various illustrated storefronts and billboards. The ads populate the places a person would see these things in real life—for example, I saw an ad for The Home Depot on the side of a building. It has absolutely nothing to do with entertainment whatsoever, but it’s more subtle than an autoplaying video before you get to the home screen. (People even have some strange affection for this screensaver, even though it’s an ad vehicle.)

Roku’s ads are mostly banner images that remind me of the old days of the web. They don’t even take up as much screen real estate as Amazon’s Fire TV interface bludgeoning poor Jason with mattresses. Roku’s content-driven interface obviously has value—otherwise advertisers and studios wouldn’t pay for placement there. The same goes for Amazon.

Every person will have their own tolerance for advertising tested by the array of devices and services they can use to watch TV, what the ads are for, how they’re delivered, and how much they paid for the streaming device showing them.

Suppose you’re weighing the difference between a cheap streamer box and paying maybe $100 more for a premium Apple model. In that case, Apple might be able to make the case that—despite its overbearing promotion of Apple TV+ subscriptions throughout the TV app—it provides an experience that’s a cut above the competition in terms of not pushing ads at you from every corner of the screen and using your viewing data to profile you.

But it doesn’t do that. Since 2015, it’s been all about how powerful the Apple TV is.

It can play games (third-party controller not included)! It can be a smart home hub (entry-level model no longer includes a Thread radio)! You can connect HomePods in stereo pairing modes (please buy two, very old, very slow-to-respond smart speakers that will grab requests they can’t act on)! It has Siri (it won’t get Apple Intelligence Siri or access your semantic index)! There are user profiles built into the OS (that don’t do anything)! You can watch Apple TV+ on it (or literally anything else)!

Competing devices tend to be pretty pokey (either because of underpowered hardware, or poorly optimized code), but they succeed because they’re cheap, and they’re just for TV. Apple recycles iPhone chips into Apple TVs, and anyone can tell you they’re overkill for simply streaming video to a TV.

There are rumors that there’s a new Apple TV coming later this year. It’ll likely have a chip that’s closer to the current generation of phone chips, which doesn’t suggest the boxes are getting any cheaper.

Roku and Amazon treat their hardware as a loss leader to get people into a platform where they can be monetized. Apple doesn’t need to do that, but there’s still plenty of room here to make an Apple TV box that just does TV. (Apple used to sell the 3rd generation Apple TV for $70. There’s certainly room under $130 for a stripped-down device.)

Cynically, a person could say that Apple needs to copy what Roku is doing and integrate ads into the interface—just as it’s so deftly integrated ads into the App Store—and subsidize the hardware. I don’t have any interest in seeing it do that.

I’d rather see Apple try to compete with these low-end devices by offering something priced a bit lower, with a revamped content-driven interface. Apple will never be able to match Roku or Amazon on price, but if it could offer a sub-$100 box with good content recommendations, combined with a story about limiting ads and ensuring privacy, it could make a more persuasive case.

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


By Joe Rosensteel

Apple really needs that Services revenue now

A screenshot of the iOS Stocks app showing the stock ticker at the top, and the Dow Jones section of the Stocks news, with ads. The ad is for 'Camper Vans | TriviaLibrary. This Sleek Small Camper Cost Lesss Than Many Americans Expect (Take a Peek Inside)
(Slaps roof of camper van) This bad boy can fit so much Apple revenue in it.

The incredibly unpredictable nature of the United States’s trade policy is going to have a profound effect on Apple hardware, not just in the U.S., but globally. Whether Apple absorbs the costs from its margins, hikes prices, or starts to move physical production and supply chains around the globe is all up in the air when the trade policy of the United States can vacillate so wildly in the span of days. I’ve withdrawn and resubmitted this column with each of these wild swings. If a new deal is on the table by the time you’re reading this, then just wait a little bit and I’m sure things will fall apart.

As Dan Moren pointed out, software isn’t subject to these tariffs, and could present a path forward. That’s the power of positive thinking. You know what’s easier than trying harder though? Squeezing your customers.

Services, including advertising, are not subject to these tariffs, and they already provide Apple with the growth on Jason Snell’s charts that so enamors investors. That could well be the stabilizing force to offset whatever the hell is going on with trade: dependable, digitally-delivered dollars. At least while the dollar is worth anything.

To that end: We should mentally prepare ourselves for ways that Apple might squeeze more out of its customers in each of its services while everything else is on fire.

Continue reading “Apple really needs that Services revenue now”…


By Joe Rosensteel

Wish List: Siri, Spotlight, and a unified search experience

A screenshot showing the Google search results for scanning and sending a document with Mail. There's a box with the AI summary from Gemini, and then the relevant Apple Support document right underneath.
Maybe this is why Apple executives want Gemini so badly?

There’s a lot of talk recently about Siri being behind the competition. Siri often can’t find what you’re looking for, or what you want to know, and there’s no telling when it might be able to. Many of the requests we make to Siri are basically searches, and when we are unhappy with Siri we turn to search on the web for answers, or in the case of local files or music, we just manually dig it up ourselves.

So here’s a thought for those who might suddenly find themselves in charge of Siri: Search is a foundational element of smart assistants, and the current state of Apple’s search technologies leaves much to be desired.

While all of today’s web search engines are placing sparkly and unreliable AI-synthesized answers above everything else, they still generally deliver solid search results underneath. Refining Siri without bolstering the foundation is a recipe for disaster.

Using Siri for search

Apple’s recent announcement that it’s delaying several AI features began with a self-serving sentence about how much people love Siri. You guys know Siri. Among its touted new, revolutionary features was “type to Siri,” a feature that’s not really new (you’ve been able to do that via an accessibility setting for quite a while), but is not a bad idea at all. The problem is that I find myself typing to Siri like I would enter text in a search box. Word choice has a huge impact.

This is inferior to just opening a web browser and typing into a good ol’ fashioned search box. First of all, if you want to ask Siri how to do something, you have to prepend “how” to the request or it might treat your request as something to act on. I also don’t have to worry about a web search engine picking up on a keyword like “email” and trying to compose an email while it ignores the rest of my question just because I didn’t prepend “how.”

Even when Siri parses your words correctly, it’s really that focus on attempting to provide a single result, or perform a single action, that makes it less useful. Like I said at the top, a major factor in the usefulness of any search engine is that you have multiple possible matches for what you entered into that search engine. It’s a powerful tool because you may not have used words that exactly match the title of an Apple Support page, but are close enough that you should consider them.

What if you don’t happen to know the names of all the features for a task you want to do? Let’s say you need to update or change your credit card info. If I ask, “How do I change my credit card info?” (see the left of the three iPhone images, below), it’ll tell me I can do that in the Contacts app (center, below).

Please don’t store your credit card info in the Contacts app. If I ask, “How do I change my payment information?” it’ll tell me to remove a HomePod (that I don’t even own) from the Home app (right, below).

three iPhone screens with confusing Siri output

I have to know the exact words for the three places in Settings where credit card information is stored in order to form a question precise enough that Siri’s product knowledge will reveal the results for each individual feature I ask for, one at a time. If I knew enough to be that specific, then I wouldn’t need to ask.

Searching the web for the same generalized questions works like a charm, but I do have to provide the specific context of the platform I am inquiring about. That’s a key advantage of Siri—it knows the platform I’m on already. When I ask Siri on my iPhone, “How do I scan a document?” Siri is going to return a result relevant for iOS. Unfortunately, it’ll only be the instructions for “Scan Document” in the Notes app instead of all the places in iOS where you can invoke “Scan Document.”

That expectation of context can work against Siri when it doesn’t apply it correctly. A humorous example: If you’re on an Apple TV, and say, “How To Train Your Dragon” into your Siri remote, it will not show you the info for the movie like it would for many other titles, but it will give you some training advice for your dragon. This is the same result you get on all Apple platforms because no context is being used in this instance. Saying “Show me How to Train Your Dragon” (if you’re typing it, you need to title-case it or Siri will still give you dragon training advice) will display a list of the movies with that name.

A web search engine doesn’t have this issue, even though it doesn’t have context. It can interpret movie titles before trying to be literal with all the words in a request.

What about Spotlight?

Apple has another brand, Spotlight, that it uses as an umbrella for its various search technologies, but it’s mostly about finding stuff on your device.

It can’t do natural-language search, though—only Siri gets to do that. If you type a natural language request into Spotlight, it’ll likely put a link to do a web search for your request at the top of the list of search results. It’s not going to parse it into movie, TV show, or song titles unless you happen to have those as files.

That’s a real shame, because it would fit right in with our expectations of searching on the web if we could do that kind of search in Spotlight. Sure, it can still bail to the web, or Siri, if you ask, “Who won the Super Bowl?” but not everything people want to request concerns general knowledge.

Spotlight does a lot of things better than Siri. It displays a ranked list of search results. It live-updates the search results as you continue to type and refine the thing you’re looking for. “Type to Siri” has to digest a complete request, process it, and perform an action or display a blurb.

These two technologies need to work together. Spotlight needs to be able to handle more natural-language requests. Siri needs to be able to display those results when there are multiple possible, relevant results for a request. We shouldn’t expect Siri, as the magic-sparkle box, to correctly interpret all meaning with no further action required. (Google buries the single-response option under its “I’m Feeling Lucky” label. Siri assumes we’re all feeling lucky, all the time.)
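To make the contrast concrete, here’s a toy sketch in plain Python (not any Apple API; the titles and function names are made up for illustration): a ranked keyword search returns every plausible candidate, while a Siri-style assistant commits to only the top hit.

```python
def search(query, titles):
    """Score each title by how many words it shares with the query;
    return all matches, best first. Crude, but it surfaces near-misses
    the way a web search box does."""
    q = set(query.lower().split())
    scored = [(len(q & set(t.lower().split())), t) for t in titles]
    return [t for score, t in sorted(scored, reverse=True) if score > 0]

def feeling_lucky(query, titles):
    """Siri-style: commit to the single top result, or give up entirely."""
    results = search(query, titles)
    return results[0] if results else None

# Hypothetical support-article titles:
titles = [
    "Change your Apple Account payment method",
    "Add a card to Apple Wallet",
    "Scan a document in Notes",
]
```

A query like “change credit card payment” doesn’t exactly match any title, but the ranked version still surfaces both payment-related articles for the user to pick from; the single-answer version silently discards the runner-up.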

Improvements I’d like to find

Providing natural-language search can be done in parallel with improving Siri; it doesn’t stymie or dismiss that team’s work, and it provides both a pressure-release valve and support for whatever Siri is doing.

As a user, I’d like to be able to use natural-language search anywhere there’s a generic search box on an Apple platform, and have the results be predictable. The more context and scope the device also can infer, the better. And offering options to perform different kinds of searches—of the Web, of the Spotlight index, you name it—wouldn’t hurt. That’s the flexibility of providing users with multiple, navigable results instead of a single magic outcome.

I shouldn’t have to turn to a third party like Google to ask about Apple’s platform, especially when Apple just shipped Siri’s product knowledge feature. Apple needs to improve Spotlight, integrate it better with Siri, and provide a more consistent search experience—with options!—across all its devices.

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


By Joe Rosensteel

Searching for settings in all the wrong places

A screenshot of Settings search showing no results for 'credit card'
Search and ye shall not find

To say that it has been controversial every time Apple rejiggers Settings on any of their platforms is putting it mildly. It is evident that whatever iOS does with Settings is the gold standard for all the company’s other platforms to follow. In theory, this makes the interface consistent for people jumping back and forth between Apple devices. The reality is that the iOS Settings are quite bad, nothing is consistent, and the attempts to duplicate iOS can only ever produce a perfect copy of a bad system under the most ideal circumstances. Relying on search as a crutch only works if search is capable of providing that support.

Top level

Settings on iOS may not be organized in a way that is important to you, or me, but it’s grouped sometimes by type and sometimes by Apple’s org chart.

The most important part of the interface is the Search box. It’s right there at the top. It’s not collapsed, or hidden. They want you to use it to find the settings you need.

Next are promotions and updates from Apple, if they exist. You might have a free trial of Apple TV+ from the purchase of new Apple hardware, or something else. Apple has decided that this is the second most important function of visiting Settings.

Continue reading “Searching for settings in all the wrong places”…


By Joe Rosensteel

Apple should embrace the live TV grid, FAST

a programming guide showing the web interface for Amazon's FAST offering.

Unless you’re looking at tiles in the Sports tab, Apple’s TV apps would prefer to pretend that live TV is not a thing that exists.

This is bad. FAST, or free ad-supported streaming television, is one of the fastest-growing segments of streaming. Tubi just broadcast the Super Bowl for the first time for free, and many networks offer free news streaming channels.

Cable TV is back, sort of

Leaving aside live sports and news, numerous streaming services now offer “live channels,” which are basically playlists that allow users to tune into a television channel and let the programming wash over them.

You like playlists, don’t you? Sure you do. So the people at the streaming service build a playlist in all sorts of categories—for example, Peacock offers hot and cold running Law & Order, Dick Wolf Chicago shows, Murder She Wrote, SNL, romantic movies, sitcoms, and many more. When you show up, you join in where it’s playing. Some channel providers offer you the ability to pause and resume those streams or start over a show you just joined from the beginning.

It’s basically old-fashioned linear programming with an imaginary DVR attached. The appeal is in the programming itself, whether it’s on a theme or it’s a mega marathon of a single show. It can increase the discovery of new shows, but perhaps more importantly, it can reduce decision paralysis where you just don’t know where to even start. Turn on the sitcom channel and start folding that laundry!
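Under the hood, a channel like this can be nothing more than a playlist and a clock. Here’s a loose sketch in Python of that idea (the function name and data shapes are my own invention, not any streamer’s actual implementation): given when the loop started and the playlist’s durations, you can compute what every viewer should be seeing right now.

```python
from datetime import datetime, timezone

def whats_playing(channel_start, playlist, now=None):
    """Return (title, seconds_into_item) for a looping linear channel.

    channel_start: timezone-aware datetime when the loop began
    playlist: list of (title, duration_seconds), played in order, forever
    """
    now = now or datetime.now(timezone.utc)
    total = sum(duration for _, duration in playlist)
    # Where are we within the current pass through the loop?
    elapsed = int((now - channel_start).total_seconds()) % total
    for title, duration in playlist:
        if elapsed < duration:
            return title, elapsed
        elapsed -= duration
```

Everyone who “tunes in” gets the same answer for the same moment, which is what makes it feel like a broadcast channel; the pause/resume and start-over features some providers offer are just the imaginary DVR layered on top of this arithmetic.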

Pretty often, my boyfriend “just wants to watch TV,” so he’ll turn on the America’s Test Kitchen FAST channel. I’m pretty sure everything on it is available on YouTube, and I pay for the premium tier because I hate YouTube’s ads so very much. But he’d rather turn on the ATK FAST channel. He doesn’t have to choose what recipe video to watch, and he doesn’t have to keep picking new ones when the last one ends. The FAST channel just plays an assortment of videos interspersed with the same pharmaceutical ads that are like nails on a chalkboard for me. People are different. Who knew?

TV manufacturers often include FAST channels on their devices, but not in any kind of system-wide interface. Samsung has its uniquely named Samsung TV Plus FAST service for Samsung devices. Roku is the backbone of many low-end TVs, and it offers the Roku Channel (which is not a channel) that mixes video on demand with live TV channels.

The most ambitious, in my opinion, is Amazon. It’s pretty safe to say that I have a love-hate relationship with Amazon’s various TV offerings. They’ve got great stuff, but what a painful mess it is to try to get to that great stuff. A good thing they did was add their unified live TV guide in 2022, where the Fire TV provides a programming guide for all the services and apps you use. All of them—not just Amazon’s FAST channels.

The included FAST channels on Prime Video were recently strengthened with the addition of PBS, which has no ads and is free for Prime Video subscribers. It can streamline the experience of watching PBS for households that have cut the cord and don’t pay for PBS Passport, which requires a minimum $5 monthly contribution.

Amazon will even aggregate some live TV from partners in the Prime Video app on any platform without requiring a Fire TV. For example, if you subscribe to Paramount+ through Prime Video, you get access to the Star Trek channel.

Apple has elected not to participate in any of this. No guide that aggregates what’s available to watch on your services. Apple provides no channels of live programming. Just those sports tiles. Yay, sports.

The power of channels

There are plenty of ways that Apple could benefit from leaning into the idea of live channels. The company recently did a promotion during the first weekend of 2025 where Apple TV+ was available for free to give people a taste of Apple’s shows and movies in the hopes people would sign up. (People of a certain age will remember that cable used to have “free preview weekends” for HBO.)

Great idea, but it would be even more productive if there were channels in a guide with selected and themed Apple TV+ programming to watch. I know that most Apple TV+ shows are heavily serialized in nature, so coming in on episode four of Severance season two wouldn’t be ideal, but that’s what life was like before video on demand. If you catch something that seems interesting, you could subscribe to Apple TV+ and watch the whole thing! (And yes, if Apple wants to experiment with advertising in Apple TV+, one way to do that would be to offer FAST channels with ads.)

The other thing about TV channels is that you can offer seasonal ones. Apple makes the old Peanuts holiday specials available for free, so just put them in a channel on a loop from October to January. It’s not a year-round channel that needs to exist. Other FAST apps do things like have a fireplace channel. It doesn’t even need to be a TV show!

Apple could even have an Apple Music channel that streams these things called music videos. Apple also has promotional videos and video podcasts they’ve made for Music that get buried and neglected in the Music app (because that’s an awful place to put them) but could find a home in a programming block. There’s potential synergy with Apple Music’s live radio channels, too.

Putting the ‘broad’ in broadcast

I’m really struck by Apple’s decision not to do anything in this area at all—not only for their stuff, but for all the other apps on their platform. There’s a real opportunity here for Apple to give insight into what’s available to watch on the various channels that are available through your apps on your iPhone, iPad, or Apple TV.

The truth is that while some people don’t see the need for linear programming, because they know what they want to watch and prefer an exclusively on-demand experience, that’s only one way of watching TV. Some viewers just want to turn on a channel and let it wash over them. That was something traditional TV provided that on-demand streaming doesn’t, and it’s why FAST has become so popular lately. It fills that need—and when people have TV viewing habits, they tend to stick to them out of comfort.

Apple should support the concept of channels, both for its own content and to unlock content across its platform. And it doesn’t even need to stay locked to its platform: Apple can also partner with other apps, services, and networks to carry its channels on their platforms. Apple’s live programming could appear right in the Fire TV live guide alongside everything else.

In other words, Apple is losing two ways by failing to embrace live channels. It’s missing an opportunity to promote its content, and it’s making tvOS and the TV apps poorer by hiding the rich streams of content available elsewhere. FAST isn’t for everyone, but it’s for a growing number of people. Apple’s abandonment of this category needs to end.

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


By Joe Rosensteel

Apple wrote checks Camera Control can’t cash

Visual Intelligence disclaimer.

The Camera Control button on the iPhone 16 family seemed like a good idea, but the devil’s always in the details, isn’t it? Apple made too many promises, all of them in conflict with one another because they all rely on the same tiny hardware feature to function. And as it ships more features, things aren’t getting better.

A half-baked half press

It was definitely confusing that Camera Control was introduced as a shutter button that could also be half-pressed—but the half-press gesture didn’t do the thing it does on every normal camera, which is to lock focus and/or exposure. Apple shipped Camera Control with a complex swipe-and-press interface to move among different functions but said that the most basic exposure/focus function would be coming later.

The new half-press feature is in direct conflict with the original overloaded half-press feature. To enable it, you need to go to Settings and then to Camera -> Camera Control where there’s a toggle for AE/AF Lock.

In hindsight, it’s absolutely the right move to have this feature disabled by default. Not only because most ordinary people wouldn’t want to use it, not just because it is in such deep conflict with the tiny half-press menu overlay for the slider functions, but because it is terribly executed.

First of all, you don’t always want to do both AE and AF lock. Sometimes you do, but not always. We’ll set that aside for now. The way that the iPhone handled AE/AF before is that you could tap on something, and it would set focus and exposure for that region you tapped. If that subject, or your camera, moved, then the temporary lock would go away. If you tapped and held, then you’d get an actual AE/AF lock, in which the subject or the camera could move, and the AE/AF would stay in place.

A way to get around the lack of independent exposure controls in the Camera app is to tap the sun icon overlay next to your single-tapped region and drag the exposure up or down to perform exposure compensation relative to the exposure setting the Camera app picked for you. This comes in handy when I take photos of neon signs. You can also get exposure compensation in one of the overlay submenus revealed by tapping the top arrow to expose the bottom row, as you do, with the plus and minus in a circle. Not a sun. (It’s as perfectly logical and consistent as the rest of the interface.)

The problem with the AE/AF lock feature triggered by Camera Control is that it activates a large region of the center of the screen. With a camera, you can set this to be a significantly smaller center area. Basically a cross-hair or just a single phase detection point in the center. Even if you tap and hold on the screen for AE/AF Lock, the region of the screen is much smaller.

If a “subject” is in frame, like a person’s face, the Camera app draws a box specifically around the bounds of their face instead of the larger region box it draws for a landscape or other wide shot. It’s still not a tiny box you’re sticking to a person’s eye, but it does not cast the wide net that the oversized region box does.

The reason the region size matters is that if your subject is layered in depth—let’s say a foreground, middle ground, and background—then you’ll capture some of another layer in what you’re trying to lock instead of just the center-most point. It’s a lack of precision. That’s for both metering for exposure and focus together. Again, for some reason, you can tap, or tap and hold, to get a finer level of control than you can with the thing that has “control” in the name.

A side by side series of two screenshots. They both show the same scene of a living room with a Christmas tree in the background, and MacBook Pro screen in the foreground, with a couch in between. The Christmas lights are warm, and the display is cool blue. The first image shows the AE/AF lock region from the Camera Control. The second image shows the AE/AF lock region, which is 2.5 times smaller.
In these two screenshots, you can see that the Camera Control is going to grab and lock on to a region that is 2.5 times larger than the region you get from tapping. The overlays are highlighted in red to read them more easily against the warm environment.

You can still get to the layered object you want by moving around more broadly until only that subject fills the large center region, but that’s more effort than tapping, and more movement than you’d expend using a real camera with a smaller center region.

There are no deep menus to go into to refine the region size or lock only exposure or focus. This is the entirety of the feature enabled by the buried toggle. On or off. Press the button gently, but not too gently. Also move a lot, maybe.

Otherwise, you can simply give up and tap the screen, which anyone with any model of iPhone can do. What a selling point for Camera Control!

This is absolutely where third-party camera apps can fill a void, but then what was the point of doing all this not-so-useful work for the official top-dog Camera app?

Lacking in Visual Intelligence

Apple also included Visual Intelligence in iOS 18.2, and it’s a huge disappointment. The two on-screen buttons always divert you to two different third-party services. If you select Ask, the image will be sent to ChatGPT. If you select Search, it will be sent to Google. There are appropriate warnings for both services, but again, Apple’s vaunted new feature is primarily a quick image upload to a partner.

Two screenshots side by side of the prompts for 'Ask' and 'Search'. The two buttons for each are 'Continue' and 'Not Now'.
We’ve got both kinds. Country and Western.

Other options can be triggered if it detects certain criteria, but it’s pretty picky about it and doesn’t tip you off that it can do more until you press the shutter, unlike “Ask” and “Search,” which are always shown.

In one case, I held my phone up to a yellow warning sign in Spanish, and it offered up a Translate button, but only after I hit the “shutter” button, which doesn’t save the photo but pauses the input so the software can examine it more thoroughly. Google’s apps and Apple’s own Translate app offer live translations without needing to hit the shutter to pause, but Visual Intelligence doesn’t have that option.

A yellow, diamond-shaped warning sign with the image of a hand being punctured. There is text below that has been translated as 'Danger'.

There is also the option to summarize the text you took a photo of with the shutter button. It’s probably the least likely thing I would want to do, but hey, it’s something the software can do, so why not?

Apple has many other machine learning models for all kinds of image recognition, but only the ones that use optical character recognition are present. I can’t use this to identify a plant, for example. I have to take a photo of a plant, go to the photo in my Camera Roll, expand the photo, thumb the whole thing upwards to reveal the info panel, and then tap on the plant identification option there. The same goes for animal and landmark identification.

Conversely, you can’t use the Visual Intelligence “Ask” and “Search” features on a photo you’ve already taken from inside the Photos app, the way you can use those other identification features. You can certainly send those images off to ChatGPT or drop them in the Google app. What gives? Why not put the “Ask” and “Search” buttons under every photo? Why not put them in context menus?

Maybe, someday, all of those things will be true, and Visual Intelligence will act as an umbrella for all the image-based models Apple has. Why make a promise about shipping this right now when it’s really not terribly beneficial to anyone—including Apple?

If the goal was to appear like they weren’t behind (Google Lens shipped one million years ago), then unfortunately the shipping product reveals that they’re further behind than they would have seemed if this were still something promised from a lab. There is a danger that this trains customers to decide Visual Intelligence is not worth using, especially since it’s so hard to get to.

Dialing back the dial

Speaking of training customers, I’ve reached the point where Camera Control has trained me to turn off the features I keep accidentally triggering. Settings -> Camera -> Camera Control -> Accessibility, then toggle off both Light-Press and Swipe. I’m not interested in accidentally triggering them, and there’s no reward for trying to use them on purpose.

Apple has not addressed any critiques of Camera Control other than the “we are totally shipping a half-press focus lock” promise from the launch. Anecdotally, most people use it as a Camera app launcher or shutter button that’s easier to reach than the volume-up button. Yay?

I’ll leave AE/AF Lock on for the time being, but the truth is that with the way it’s enabled, it’ll likely return to the default too, and all of that will be for naught. I currently regret that so many of us asked them to give us this, because it was only ever going to be another thing on top of this complicated stack of decisions they already made about what Camera Control is. They can’t take these things away, but maybe they can make profiles, or group them into modes so the button does less under certain circumstances, rather than leaving people unwilling to mess with it at all. Perhaps it’s time to exercise some control.

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


By Joe Rosensteel

iOS 18.2 Mail is a misfire

Out of all the new features in iOS 18.2, I really didn’t expect that I’d be writing about Mail of all things. And yet, given how many times a day I use Mail on my iPhone, the changes in Mail in iOS 18.2 might be the worst thing about the release.

I’ve used Mail for iOS since I had a first-generation iPod Touch. The Mail team has maintained the app so that it has always functioned well enough, even though it lacked features. Unfortunately, they’ve now begun adding features, and it’s no longer functioning well enough.

Scattergories

iOS 18.2 introduces Categories in Mail, and I think the entire concept is completely incompatible with my brain and my experience with other email clients (including previous versions of Mail for iOS). I’m trying to be flexible and not immediately dismiss new stuff out of hand, but that positive attitude is at odds with my desire not to make something in my life more difficult just because someone else thinks they’ve got a hot, new take on things.

Allow me to explain my boring approach to email: I have several personal email accounts. (Any employer-assigned email accounts have been siloed off in other apps, like Outlook or Gmail.) I leave all my read emails in the Inbox and don’t move things out because the chronological “feed” of email helps me revisit old emails in a way I can’t if I moved them to folders.1

Obviously, this mentality is somewhat incompatible with Apple’s bold approach to Categories. The good news is that each category isn’t a folder that moves my message from my inbox. The bad news is that the interface behaves completely differently depending on how Mail categorizes your email.

The Primary category features messages that the Mail app deems important. If the app considers emails from other categories time-sensitive, they can also appear here. If an email from another category appears in Primary, it will have a little icon of the category it also belongs to.

Primary is the only category used with the Priority feature, which scans messages and marks those of special importance. This feature shipped with iOS 18.1. Under the menu in the upper right, Priority can be toggled off separately from everything else. Priority is not Primary, but it only lives in Primary. Got it?

Primary is also the only category that will show a badge on the Mail icon by default, but you’ll still receive notifications for all messages in all categories.

A screenshot of iOS Mail cropped to show the Primary category and the first-launch explainer stub 'Manage Badge Count. Only unread messages categorized as primary will appear on the Mail icon. Learn More.'

Confusingly, the categories that don’t count towards the badge notification or unread mail count get little dots next to their icons when they contain new mail. But that’s the only place you see those dots—they don’t display as a count anywhere, and new also doesn’t mean the same thing as unread (more on that later).

When you’re looking at the Mailboxes view of all your mail accounts and their inboxes/folders, items will only display an unread count if they contain messages that are marked as Primary. If a message is categorized as anything else, the list will appear as if you’ve read all your mail in all your inboxes.

It is my strong personal opinion that the unread count in the Mailboxes view should reflect the unread count of the emails you have in those mailboxes. Suppressing badges so you can focus is unrelated to whether or not a message is unread. I know why people want to hide badges and notifications because they’re drowning in mail, but oftentimes, I just want to see where an unread message lives.

To get a full unread count and return to normal badges, you need to go to Settings -> Notifications -> Mail and then scroll down to Customize Notifications. You can choose to list the unread messages in Primary or all unread messages. That’s it. You can’t exclude Promotions, or only include Primary and Updates. Primary or all are your choices.

The Transactions view supposedly puts all my transactional emails together. Fine. Except, for some reason, my Informed Delivery emails from the United States Postal Service appear under Transactions. I didn’t buy my mail!

I don’t know why deliveries appear here. Some deliveries are connected to transactions, but not all. Strangely, emails I receive about packages that are out for delivery today are somehow not considered time-sensitive enough to appear in Primary as well as Transactions.

Updates is apparently where some (but not all!) of my newsletters go, along with terms of service updates, flight emails, and explanations of benefits from my healthcare provider. It’s basically everything from a business or business-like entity that isn’t directly about money or a delivery.

That seems fine, I guess, except that’s a really broad category that has the full spectrum of things that I’d like to know about, ranging from sooner to later. I can move a sender from here to Primary if I want it to alert me sooner, but that’s a powerful all-or-nothing decision that will cover every email that the sender ever sends and has sent. It doesn’t train the system that this kind of email from this sender is important. Choose wisely if you think something is more important than “Updates.”

Promotions is where you’ll find Deals! Deals! Deals! Honestly, this category makes me think this whole thing about my email disappearing into three oubliettes might be worth it. Unfortunately, several of the newsletters I subscribe to are in this category, including the Six Colors newsletter. Perhaps it is overly aggressive in relegating senders to this category—almost like there’s a total lack of precision—which makes me not trust it at all.

For a day, I let messages accumulate in this category and only dealt with my Primary inbox, Updates, and Transactions. That’s what it’s for, right? To keep you from being distracted by these unimportant promotions?

Except when I did finally dive into Promotions, I had to go through and read the subject and AI summary for each one to figure out whether I trusted it enough to mark the email as read, or whether I really needed to read it anyway. This didn’t save me any time. It’s not like there were hundreds of these promotional emails. The day after that, I started worrying something valuable was in Promotions, so I ended up checking it anyway. Net result: It created more work rather than saving me time.

Group project

The quirky new Grouped Messages feature applies to all categories except Primary and All Mail (where messages cannot be grouped by sender, for some reason).

Tapping on a message, like the one I got from the USPS, leads to a view of all messages from that sender. However, the header for that view takes up a large portion of the screen. There’s a header image for the sender, which heavily pads the email address or name of the sender. (It also occasionally lops off descenders on letters, apparently. Guess they needed more room.)

A cropped screenshot of the iOS Mail app showing the grouped sender view for auto-reply@usps.com but all the descenders are cut off at the baseline. There is an enormous amount of padding, but the subject is crammed together and truncated.
It’s hard to believe this is the company that sparked the desktop publishing revolution. Can I please see the subject line? Pretty please?

The subject line of a message is truncated with an ellipsis, which is frustrating given how much space is being wasted by this view. (Why would I need to know what the email is about when I can see a huge circle with a graphic in it and an enormous version of the sender’s name or address?)

Sometimes, the entire message at the end of a group of messages is displayed, and sometimes, there’s a “See more” to expand the message. I got here by tapping on this message from the Category view, so why won’t it show me the whole message, like it would if I tapped the message from the Priority or All Mail view? Shouldn’t the last email in a bundle always be fully expanded, since it’s the thing I tapped on to open this bundle view in the first place? There’s infinite room below it! This isn’t saving any space!

That truncated view could theoretically be useful if you’re going back up in the bundle and expanding messages to find a previous email from the sender, as if it were a threaded chain of email replies instead of individual messages. Except… when I scroll back up, it fully expands each message. I can’t even expand one just to see the full subject line and close it again.

Speaking of closing it again, I can’t find any way to do that other than force-quitting Mail. Tapping the subject stub, which had expanded the item, doesn’t collapse it, but expands the subject stub so you can tap the email addresses. Long-pressing doesn’t do anything. There has to be a way to collapse messages in this view, but I honestly can’t figure it out.

Refusing categorization

If you’re in the truncated, padded bundle view, you can tap the ellipsis button in the upper right corner to change the sender’s category. However, this will change the categorization of every message the sender has sent or will send. Sometimes, a single sender sends me Transactions, Updates, and Promotions. Some companies split these across different email addresses, and others don’t, or don’t do so reliably.

And remember, if you move a sender from Transactions, Updates, or Promotions to Primary, you no longer get the bundle view of the sender. Why? I don’t know. I thought that view was supposed to be helpful?

That also means you don’t get the ellipsis button in the upper right to move the Sender. However, you can tap the Reply icon in the bottom bar of a message, which brings up that ridiculous menu, and scroll down to Categorize Sender to once again change all of the emails from that sender. That’s how email rules and filters should work, right?

People have asked for filters/rules for years for Mail on iOS, and Apple didn’t give them to us… until, all of a sudden, we’ve got a few hard-coded invisible rules that users can nudge a little. We can’t be trusted with Smart Mailboxes or labels, but we do have three immutable categories that all email is supposed to fit into.

Why can’t I make a category called Deliveries and elect for it to be worthy of the unread badge? Why can’t I make a category called Newsletters that’s silently delivered but not lumped in with all the non-business-related mail, and get around to reading them just after I die?
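For what it’s worth, the kind of user-defined rule I’m describing is conceptually tiny. Here’s a hypothetical sketch (this is not any Apple API; the domains, category names, and badge flags are all made up for illustration) of a sender-to-category mapping with a per-category badge preference:

```python
# Hypothetical user-defined mail categories, purely illustrative.
# Each rule maps a sender domain to (category, counts_toward_badge).
RULES = [
    ("usps.com", "Deliveries", True),        # badge-worthy
    ("sixcolors.com", "Newsletters", False), # silently delivered
]

def categorize(sender: str, rules=RULES) -> tuple[str, bool]:
    """Return (category, counts_toward_badge) for a sender address."""
    domain = sender.rsplit("@", 1)[-1].lower()
    for rule_domain, category, badge in rules:
        # Match the domain exactly, or any subdomain of it.
        if domain == rule_domain or domain.endswith("." + rule_domain):
            return (category, badge)
    return ("Primary", True)  # default: everything else is badge-worthy

print(categorize("auto-reply@usps.com"))       # ('Deliveries', True)
print(categorize("newsletter@sixcolors.com"))  # ('Newsletters', False)
```

That’s the whole idea: the user names the buckets and decides which ones interrupt them, instead of three immutable categories deciding for everyone.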

Or you could give up…

Now that we’ve got categories, there’s got to be a way to temporarily not view them. This is the All Mail category, which is hidden off to the right side of the other categories, so you don’t even know it’s there unless you swipe on those categories to reveal it. Surprise!

This view doesn’t expose any of the category features. It doesn’t put little icons next to mail items to let you know those messages are also in Transactions, Updates, and Promotions, like the Primary category does. Why not? It can’t group messages by sender. It seemingly doesn’t have Priority either, since there’s no option in the ellipsis button menu to turn it off.

It’s basically like temporarily embedding the old List View underneath all the other Category items. Since you can just tap the ellipsis (top right) and List View to leave Categories behind, I’m unclear why the All Mail category also exists. If it worked in a way that also integrated categories, it would make more sense. But it doesn’t.

(If you’ve got multiple email accounts and you leave the All Inboxes view, you’ll find that your individual Inboxes have Categories of their own too. That’s fine, but Mail will also remember the category you were last in in that particular inbox. If you jump from All Inboxes, All Mail to your Gmail Inbox, it might be set to Updates, and you might not see an expected message because you have to remember that the category isn’t a setting that persists as a view across all inboxes when you move around the interface. It can be quite disorienting.)

The escape hatch for all of this categorical goodness is simply to give up and return to the classic List View. That should really be the default instead of this first attempt at Categories.

Think back to major changes in email clients you’ve used over the years. Almost all of them have been controversial because people are used to the current way of doing things. Their life, or their job, isn’t about learning new approaches to this mundane necessity.

That is why email developers nag users of the old version to try the new version while reassuring them they can go back. Over time, this becomes a matter of sticking them in the new version and telling them they can go back to the old version. Then, the new version becomes the only version. Rinse and repeat.

Apple skipped the opt-in step, which I suspect will engender far more ire than had they gone the traditional route. There will always be people who are resistant to any change, but this release strategy isn’t helping. What’s worse, there’s no universal “go back” switch people can flip. Some of the feature toggles are in Settings, and some are in hidden menu buttons inside the Mail app.

I had to walk my boyfriend through the steps to get List View back, turn off Priority, and turn off all summarization because he hated it. I’m sticking with it for a little while longer, but I don’t know if I’ll make it through Christmas before bailing.

As iOS 18.2 rolls out more widely, people are going to find themselves challenged by having to change their years-old or decades-old habits for an email client that thinks your package delivery for today is a non-time-sensitive transaction, and your newsletters are promotions. I’m not convinced people will have a ton of patience to try to calibrate this system, and adjust themselves to it. (And let’s not forget, these features still don’t exist on your iPad or Mac, eliminating one of the advantages of using the same app on different platforms.)

Mail on iOS is pretty important to Apple’s plans for on-device intelligence. A lot of what an AI can know about you is gleaned from your text messages, calendar appointments, and most importantly, email. If people ditch Mail out of frustration, it undermines the value proposition of the iPhone.

I hope that Apple moves quickly in the new year to correct some of these issues. In the meantime, all of us expert users will get another Christmas in the trenches helping loved ones figure out how to get Mail to work the way they expect it to.


  1. You will not change my mind. I do not need tips. I don’t care about your system. Leave me alone. 

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


By Joe Rosensteel

How Does Clean Up Measure Up?

In my previous post for Six Colors, I wrote about why Apple’s Clean Up (and photo retouching tools in general) are fine tools for people to have in their photo editing toolkits.

Now that Clean Up has been released in iOS 18.1 and macOS 15.1, I’d like to go over some technical things I’ve noticed while using it, and the ways it is similar to and different from some other photo retouching tools—like the ones I used in the previous post.

Those apps include the iOS version of Photomator, which was recently acquired by Apple; TouchRetouch by ADVAsoft; and Adobe’s Lightroom mobile app.

Enjoy watching me scribble with my finger!

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


By Joe Rosensteel

You can use Clean Up with a clear conscience

Next week, the first round of Apple Intelligence will be loosed on the general public, including the Clean Up feature in Photos that lets you alter images to remove unwanted elements. This is not a new feature in photography—in fact, Photos is probably the last photo utility in the world to get a feature like this.

But that won’t stop some very loud, reactionary voices complaining about Clean Up as if it were the end of the world. And of course, as with any high-profile Apple announcement, there have been media reports that purposefully try to take features like Clean Up to extremes far beyond what anyone would reasonably do. It’s the approach that leads to headlines like “I only ate peanut butter for a week!”

Last year, people were starting to get very existential about image editing because of the first version of Google’s Magic Editor, and everyone suddenly became concerned that Apple’s image pipeline was getting too over-engineered. People really should not have gotten so hung up on what even is a photograph, maaaaaan.

I first wrote about this last October, but this time, I feel like I need to be less philosophical about it and a lot more direct.

If it pleases the court

The photographs you take are not courtroom evidence. They’re not historical documents. Well, they could be, but mostly they’re images to remember a moment or share that moment with other people. If someone rear-ended your car and you’re taking photos for the insurance company, then that is not the time to use Clean Up to get rid of people in the background, of course. Use common sense.

Clean Up is a fairly conservative photo editing tool in comparison to what other companies offer. Sometimes, people like to apply a uniform narrative that Silicon Valley companies are all destroying reality equally in the quest for AI dominance, but that just doesn’t suit this tool that lets you remove some distractions from your image.

Clearly, companies like Meta, which posted on Threads that people could use AI to fabricate their images of the northern lights so they wouldn’t feel left out, are up to entirely different shenanigans. Sure, that mushed-together image isn’t courtroom evidence either, but morally and artistically, what is even the point of a fake image of the northern lights posted to social media?

This is where everyone with a computer engineering degree starts saying, “But, but, but…” Because they are uncomfortable with any kind of ambiguity. How can removing a distraction from the background be ethical when hallucinating an image of the northern lights is not? Aren’t they all lies? Through the transitive property, doesn’t that make them both evil?

Yes and no. (Indistinct grumbling.) Ethically, what is the subject of your photo? Who is the audience for the photo? What do you want to communicate to the audience about the photo?

If the subject of the photo is my boyfriend, the audience is the people on Instagram who follow my boyfriend’s private Instagram account, and the thing that he wants to communicate is that he was in front of a famous bridge in Luzerne, then there is no moral or ethical issue with me removing the crossbody bag strap that he had on for some of the photos I shot.

I took the photo, composed with him in the center, as is the way he likes these things composed, and then he remembered he had the bag on and didn’t want the bright green strap. He did move and wanted different framing, though I didn’t feel it was as good as the first shot. I told him I thought the other one I took with him and the strap looked the best for the narrow 9:16 Instagram Story framing, and he agreed, but he wanted the strap removed.

Three side-by-side comparison images. All three images are of Joe's boyfriend, Jason, smiling in front of the wooden Chapel Bridge in Luzerne, Switzerland. The first image has wider framing and no bag strap, but the composition is weird with the deep blue sky over the clouds being distracting and the bridge appearing smaller. The second image has a better composition, but he has a green strap across his chest. The third image is the second with the strap removed.
See, that composition on the one without the strap just isn’t as good. However, he didn’t like the strap in the one with the strap. Problem solved with editing.

This was before the release of Clean Up, so I fired up Pixelmator on my iPhone, removed part of the bag with the retouching tool, and then copied and transformed the shoulder and part of the shirt collar from another image. Certainly not as easy as Clean Up, but things like his shoulder are genuine images from another slice in time instead of total reconstructions using only the image being edited as a source. (I feel like this is a shortcoming of Clean Up, and I’d like a 2.0 that can source from patterns in surrounding photos, but I digress.)

The point is that yes, the image is no longer courtroom evidence, but courtroom evidence of what? That he never wears bright green bag straps? Who would care about such a thing? Certainly not the audience of people who follow his private account on Instagram that just like to see a photo of him smiling in front of some bridge in Switzerland. That’s exactly what the photo was.

Morally, I’m totally fine with all that. He was at the bridge. He did, at one point, not have that strap on his shoulder. I wasn’t removing a tattoo. I didn’t fabricate a different background for the photo.

“But, but, but!” Yes, I know, it’s not 100% what happened all in that same sliver of time. “The bag strap is part of the moment!” Yeah, but there were all those photos where he’s holding it below the frame, off his shoulder. No one is going to argue that I should have framed the shot to include him holding the bag for truth. Why would they?

For some reason, even the most literal of literal people is fine with composing a shot to not include things. To even (gasp!) crop things out of photos. You can absolutely change meaning and context just as much through framing and cropping as you can with a tool like Clean Up. No one is suggesting that the crop tool be removed or that we should only be allowed to take the widest wide-angle photographs possible to include all context at all times, like security camera footage.

A side-by-side comparison of two photos. On the left is the unedited photo showing Joe's boyfriend, Jason, smiling at a table with a beer in hand. A copper still is behind him. There is a water bottle and a green bag strap by his screen right elbow. The second image is the edited and cropped version where the bag strap is cropped, and the water bottle has been removed.

Another example from that day in Luzerne was when we got lunch in a neat brewery by the river. He had a big copper still behind him, but he also had that dreaded green bag and my reflection in that still. I just cropped it. It was the simplest solution. However, he did have a water bottle that I removed with a retouching tool. Is that different from cropping out the bag? Again, is there some court case about water bottles or bag straps? No. No one would care. This is for the people who follow his Instagram Stories. Crop it, and use Clean Up; it’s ethically equivalent.

Artistic considerations

I will provide two counterpoints for when not to use Clean Up that have nothing to do with morality, just to show that there are other artistic considerations. If you have a photo that has a crowd of people in the distance at a landmark, then leave them alone. Those indistinct clumps of people provide scale for the landmark and a sense that you’re not traveling in some world devoid of humanity.

Not every person in the background of a photo is a candidate for removal. You don’t want a beach that looks haunted, or a waterfall that could be 2 feet or 200 feet tall. If one bozo has a highlighter-yellow fanny pack, then sure, remove it, or selectively desaturate it in Pixelmator or Lightroom. (Gasp! More lies!)

The other time to not use Clean Up is when you have some overlapping areas of high detail behind, or in front of, what you’re trying to remove. Tools like Clean Up, just like all other retouching tools, work best when the thing you’re removing is fairly isolated and distinct, with a very indistinct area of fill behind it. If you’re trying to remove a guy standing in front of a tapestry, then it’s probably not going to go very well. If the foreground subject matter you’re keeping has long hair blowing in the wind, then the bozos behind that hair are not going to be removed cleanly. Wait until they at least walk to the screen left or right of the hair.

People can understand these limitations and use them to make creative choices while they’re framing their shots. If there’s a bozo who’s standing in front of a wall, and they’re just not going to move any time soon, then get a shot where they’re near the edges of your foreground subject (it’s a digital camera, so take a bunch of shots) and then you can have an easier time removing them. Also, things like Portrait Mode (more lies!) can help, especially since Portrait Mode has substantially improved its image segmentation and edge detection. That blurry bozo is even easier to fill in with blurry background than detailed background.

Above all else, remember that if it’s just a bad photo, then it’s just a bad photo. You can keep it for yourself instead of sharing it or trash it if you prefer. Even with every photo-editing tool under the sun, they can’t all be winners.

Don’t get it twisted

Like I said earlier, this is about common sense, and if, upon some introspection, the thing you find alarming is that you don’t know how to ethically use this tool, then it’s totally fine if you don’t use it.

However, I don’t want to see silly, sweeping statements from people that foist their anxieties based on their ignorance onto other people. I don’t want to see all image editing tools lumped together with one another, or worse, with every other thing that has “AI” in the name. These tools are not all the same thing. These photos aren’t all the same. Use your brain and not some puritanical binary rule to lump all edited photos together. Let people have photos that they like!

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]



Search Six Colors