Six Colors

Apple, technology, and other stuff

By Jason Snell

In iOS 18, Photos brings Collections to the fore

Left to right: The new initial Photos view, which combines the Library (top) and Collections (bottom); the new Collections area provides ample opportunity for discovery; each Collection comes with a Movie and a curated selection of photos.

One of the biggest changes to Apple’s devices this fall is the release of a new version of Photos, one that contains some pretty major interface changes—especially on the iPhone and iPad. I’ve spent the summer working on a new edition of my book about Photos and so I’ve had a lot of time to think about what Apple’s trying to do here.

Put simply, Apple continues to grapple with the fact that the Photos app serves two very different purposes.

On the one hand, the app is the definitive media library for people using Apple’s ecosystem of devices, as represented by the Library view. This is the app you open to find an image you’ve captured, whether it was 10 seconds or 10 years ago. It’s a reference library and a utility, and if you need to grab a picture you just shot in order to send it somewhere, you need to be able to do that quickly and easily.

But on the other hand, for a decade Apple has invested a lot of effort into making Photos a vehicle for discovery of the amazing stuff that is contained in those same voluminous media libraries. Apple has been using machine-learning technology, suddenly the talk of the tech world, for a decade in Photos, identifying the people and objects in photos so that it can better understand what it’s got. And it has put even more work into building systems that can organize and recommend those photos and videos to you, the person who took them.

Back when I started taking lots of digital photos (in the early 2000s), there was no way to do this. You had to go through your photos and apply tags or keywords to each one manually. Organization was by time stamp alone, and geotagging was an unheard-of concept. Thanks to GPS stamps on smartphone photos, and the machine-learning stuff in Photos, it’s now child’s play to say “show me pictures of my kids in Hawaii” and get hundreds of images within seconds.

And yet I think Apple realized that since the Photos app always launched in Library view, a lot of people had no idea that behind the scenes, Apple had built an entire curation system designed to delight users with pictures and videos of loved ones from across decades of history. And, really, what’s the point in building a giant media library if you never revisit those photos and videos?

So the new version of Photos doesn’t launch to the Library view, with a bunch of tabs at the bottom that apparently few people clicked on. Instead, it launches to a new hybrid view (thankfully simplified and tweaked since the original iOS beta earlier this summer) that displays the familiar Library grid in the top two-thirds of the screen, with a series of Collections in the bottom third. When you scroll up, you’re in classic Library view. When you scroll down, you’re seeing the multitude of ways that Photos can automatically carve up and re-serve you the contents of your Library in ways that make sense and are pleasing.

I know that these changes made a lot of people cranky this summer, but I think the app ended up in a great place. Sure, if you are someone whose idea of using Photos is to launch it and only see the very latest items, I guess this update adds clutter. And Apple should probably let people say “I don’t want to launch in this view” and honor that request. But for the vast majority of iPhone users, Collections are a boon, a way into your library that offers major improvements over long scrolls through the Library.

The default top Collection, Recent Days, is remarkably utilitarian: it’s literally a way to quickly browse through recent days of images and videos, so you can jump back to that thing you shot three days ago. And then there are Collections featuring people and pets (and, for the first time, user-definable groupings of those people and pets), Photos-generated Memories and Featured Photos, collections of trips you’ve taken (assembled from dates and geotagging information in a smart way), and even suggestions for good images to use as wallpapers.

In past versions of Photos, Apple focused a lot of its energy on auto-generated collections called Memories, which grouped photos and videos on somewhat random subjects (“at the park!” “pet friends!”) on a somewhat random basis. They were pretty sophisticated collections, offering an automatically generated slideshow movie and a curated view of photos rather than every single item that matched the theme.

Memories still exist, but what Apple has done is take the entire Memories concept—a slideshow movie and a curated collection of items—and apply it to every single Collection in the Collections view. So whether you’re looking at a Recent Days collection of last Tuesday, a Trips view of your summer visit to the shore, a People & Pets group, or pretty much anything else, it ends up in a Collection view, complete with that slideshow movie. It’s a pretty rich collection of stuff, and just as important, it’s a consistent interface, which Photos lacked before.

Apple has also boosted Search in Photos again. You can still search for keywords and locations and dates, and it’ll work. But if you just want to natural-language search for “Julian at the beach in Oregon,” that’ll do the trick. It works really well.

Clean Up will be very useful, once it ships with Apple Intelligence.

As with everything else this fall, the other shoe to drop involves Apple Intelligence. Photos will gain a feature where you can just type a suggestion and Photos will build a Memory from it for you, on demand. But more importantly, Photos is finally getting proper object removal support across iPhone, iPad, and Mac with the introduction of Clean Up, another Apple Intelligence feature. In general, it works really well. I just wish Apple hadn’t waited until 2024 to launch this feature—and limited it to devices powerful enough to run Apple Intelligence—since you could use the Photomator app to do this on the iPad five years ago.

It’s a consequential year for Photos on iPhone and iPad. (The Mac is updated too, but there’s no real interface overhaul—you can get to Collections via the Sidebar, as always.) I think I understand Apple’s moves here and I support them. Yes, people who prefer Photos to be an utterly utilitarian look into their camera roll will probably feel that Collections is poking its nose in where it doesn’t belong. But the benefits of showing off the contents of the Library in more creative ways than a grid of your most recent entries are just too powerful to be ignored.

Apple’s design changes will mean that more people will discover great stuff in their Library. That’s the whole idea. I think it’s the right thing to do.


