Six Colors


By Jason Snell

WWDC 2023: More responsive iOS camera apps

An Apple employee doing a skateboard jump in the session video.

Usually WWDC week is for new OS features. But this year, WWDC was stuffed full of new Macs and an entirely new Apple platform, so I’m slowly digging out of that and starting to dig into the new OS features announced last week. In the “Create a More Responsive Camera Experience” video, there are some exciting new features that should improve the experience of taking iPhone photos.

A lot of the secret sauce Apple uses to generate iPhone photos involves taking multiple images and then fusing them together. The most high-profile of these features is Deep Fusion (often jokingly called “sweater mode,” a reference to Apple’s demo images showing the detailed weave of a sweater). Deep Fusion can generate great results, but it takes time to run. If you take a Deep Fusion photo, you may end up having to wait before taking your next shot—and you might miss something great in the meantime.
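(For the developer-minded: the knob for this speed-versus-quality tradeoff has been in AVFoundation for a while as the photo output’s quality prioritization. Here’s a rough sketch of how an app asks for the slow, maximum-quality processing; the function and its parameters are my own illustration, not code from the session.)

```swift
import AVFoundation

// Sketch: asking for the highest-quality (and slowest) processing tier,
// which is where features like Deep Fusion come into play. Only the
// AVCapturePhotoOutput and AVCapturePhotoSettings properties are real API;
// the surrounding function is illustrative.
func captureHighQualityPhoto(with photoOutput: AVCapturePhotoOutput,
                             delegate: any AVCapturePhotoCaptureDelegate) {
    // Allow this output to use its full-quality processing.
    // (Apple recommends configuring this during session setup.)
    photoOutput.maxPhotoQualityPrioritization = .quality

    let settings = AVCapturePhotoSettings()
    // Prefer quality over speed for this shot; this is the tradeoff that
    // can leave you waiting before the next capture.
    settings.photoQualityPrioritization = .quality

    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```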

In iOS 17, a new deferred photo processing feature allows camera apps to push off image processing until after your camera session is complete. The result is that the shutter button becomes active almost immediately, so you can take more pictures—which is a good thing. The system saves a temporary, unprocessed image to your photo library as a placeholder, and when your phone is no longer busily shooting photos, it will fuse the image captures in the background and then replace the proxy with a full-fledged Deep Fusion photo.
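From the session, apps opt into this per photo output, and the unprocessed proxy then arrives through a new delegate callback before being handed off to the Photos library. Here’s a rough sketch of that flow; I’m going from the new iOS 17 AVFoundation additions as I understand them, so treat the exact names and details as illustrative rather than gospel.

```swift
import AVFoundation
import Photos

final class PhotoSaver: NSObject, AVCapturePhotoCaptureDelegate {

    // During session setup: opt in to deferred processing where supported.
    static func configure(_ photoOutput: AVCapturePhotoOutput) {
        if photoOutput.isAutoDeferredPhotoDeliverySupported {
            photoOutput.isAutoDeferredPhotoDeliveryEnabled = true
        }
    }

    // New in iOS 17: instead of a finished photo, the output can hand back
    // a lightweight proxy. Saving the proxy to the photo library lets the
    // system finish the heavy fusion work in the background later.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishCapturingDeferredPhotoProxy deferredPhotoProxy: AVCaptureDeferredPhotoProxy?,
                     error: Error?) {
        guard error == nil,
              let proxyData = deferredPhotoProxy?.fileDataRepresentation() else { return }

        PHPhotoLibrary.shared().performChanges({
            let request = PHAssetCreationRequest.forAsset()
            // .photoProxy marks this as a placeholder that Photos will
            // replace with the fully processed image.
            request.addResource(with: .photoProxy, data: proxyData, options: nil)
        }, completionHandler: nil)
    }
}
```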

The other update to camera-capture features in iOS 17 is designed to reduce shutter lag, the unfortunate effect where your camera captures an image a fraction of a second after you press the shutter button. The iPhone camera captures images at 30 frames per second and can fuse multiple frames into something nice, but it can’t go back in time—or can it? In iOS 17, the camera can buffer frames continuously during a shooting session, so when you press the shutter, it captures the moment you intended—and it can use previous frames, not just subsequent ones, to help generate the final image. The result, according to Apple, is true “zero shutter lag.”
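As with deferred processing, these look like simple opt-in switches on the photo output. A minimal sketch, again assuming the new iOS 17 property names as I understand them from the session:

```swift
import AVFoundation

// Sketch: flipping on the iOS 17 responsiveness features described above.
// Each one is gated on a "supported" check, since availability varies by device.
func enableResponsiveCapture(on photoOutput: AVCapturePhotoOutput) {
    // Zero shutter lag: let the output reach back into frames it buffered
    // before the shutter press when building the final image.
    if photoOutput.isZeroShutterLagSupported {
        photoOutput.isZeroShutterLagEnabled = true
    }

    // Responsive capture: overlap capture and processing so the shutter
    // button becomes available again sooner.
    if photoOutput.isResponsiveCaptureSupported {
        photoOutput.isResponsiveCaptureEnabled = true

        // Fast capture prioritization: let quality drop a notch when shots
        // are coming in faster than processing can keep up.
        if photoOutput.isFastCapturePrioritizationSupported {
            photoOutput.isFastCapturePrioritizationEnabled = true
        }
    }
}
```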

The iOS 17 responsive capture and fast capture features will work on iPhones with an A12 Bionic chip or newer.
