Six Colors

Apple, technology, and other stuff


By Dan Moren

At WWDC 2025, Apple played to its strengths

Tim and Craig take the stage at WWDC 2025

Another Worldwide Developers Conference is in the books, and after a week of keynotes, briefings, and travel, I’ve finally had a chance to sit and zoom out to the 35,000-foot view of the company’s latest announcements.

The Apple of 2025 has definitely learned some lessons.

In hindsight, last year’s event seems even rockier, with the company hustling to unveil Apple Intelligence, including showing off features that still haven’t shipped. To its credit, Apple avoided doubling down on those mistakes with this year’s announcements without fully repudiating its previous steps. Instead, the company went back to focusing on the assets that make it the best at what it does. In other words, the ones that let Apple be Apple.

Continue reading “At WWDC 2025, Apple played to its strengths”…


By Joe Rosensteel

tvOS 26 brings minor additions and weird priorities


A screenshot of the TV app in tvOS 26 beta 1 showing a splash screen image for WWDC 2025. The text overlay on the image refers to it as a 'Movie - Special Interest'
They should have given it a theatrical run.

Apple has largely tied major revisions of tvOS to the launch of new Apple TV hardware over the years. Since the introduction of Apple TV+, WWDC’s tvOS “features” have largely focused on showcasing sizzle reels of Apple TV+ shows, and very little about tvOS itself. This WWDC gave us a trickle of announcements that don’t seem to align with what I would consider to be the rough spots in the tvOS user experience.

It is possible that Apple is holding back meaningful revisions until it launches an updated Apple TV box this fall. Maybe they’ll even mention the 10th anniversary of tvOS itself, which was unveiled in September 2015 at the iPhone 6S launch event. Until then, I guess we should reflect on what was announced, instead of on wish lists of what could be.

Through a glass, darkly

I’m not going to rip into the design in beta 1. It’s mostly a conservative evolution of what came before, but with highlights on edges. However, Apple has singled out one very specific part of the interface as working as intended, and I will push back on that.

Apple has two kinds of Liquid Glass (Regular and Clear) and Clear is supposed to be used over rich media, like video. The only things that define the existence of the controls are the highlights and brighter/blurry refractions visible through the clear elements.

Well, gee whiz, aren’t clear glass playback controls going to be difficult to see over video, especially when it’s playing through the controls?

To make the controls easier to discern, Apple applies a dimming layer on everything around the controls, but not on the video visible through the controls. It’s like someone stenciled out aftermarket window tinting.

Apple says this is on purpose in its Meet Liquid Glass WWDC video, when demonstrating playback controls on iOS. In its Newsroom post for tvOS, it says: “tvOS 26 is designed to keep the focus on what’s playing so users never miss a moment.”

This is bananas. How is this getting out of the way of the content? You can barely discern the playback timeline and playhead while motion is occurring through the element, which causes it to pulse in a thin strip. What is being achieved here? The playback controls and timeline should be flat. No one is going to feel sad that there’s no glass effect in this one spot, where it serves no practical or artistic purpose other than being a wicked smart shader demo.

Poster through it

Another notable change in the interface is the pivot from horizontal thumbnails to portrait-orientation posters. Apple says that this means more tiles can fit on the screen, but that’s only more tiles visible in one row, and it’s only one additional tile over the smallest scale thumbnails (6 posters instead of 5 thumbnails). The older design had thumbnails that matched the aspect ratio of the TV in various sizes so you’d get more rows with fewer titles visible on screen in each row.

To compensate for this difference in aspect ratio, the text that was below or next to the thumbnails is now on top of them. I’ll let readers debate which is more legible, and whether or not the text is always helpful.

tvOS 18.5 (left) versus tvOS 26 beta (right).

This decision pushes content downward. If you want to see what kind of category you’re in the mood for, you will do more scrolling down, which means it will take you longer to count the number of times the TV app recommends you watch “Stick.” Unless you really want to flip through one particular row of the interface one title faster, it’s not really an improvement.

Used any good profiles lately?

I’m unclear about the continued push by Apple to get developers to adopt Apple’s user profile system. It really doesn’t provide any benefit to the developers of these large streaming services that need to have their own multi-platform profile systems with personalized content recommendations, and it doesn’t provide substantial benefit to households with shared viewing.

A screenshot of the tvOS profile selection screen. It shows a user profile for Joe and a user profile for Jason with a semi-transparent '+' over the corner of Jason, and another '+' next to that. At the bottom of the screen the 'Don't show this screen again' button is highlighted.
Someone had the forethought to include this button in beta 1.

I have no animosity towards user profile improvements whatsoever, and I do appreciate that on your first boot of tvOS 26 you can say you never want to see the profile switcher. However, system-level user profiles just don’t feel like the area of the TV viewing experience that needs this much attention when compared to other aspects.

If I were being generous, I could hypothesize that this emphasis on user profiles is because there will be some genuine effort put into personalizing the TV app based on the active user profile.

Unfortunately, you still can’t express any kind of preference in “personalized” areas of the interface to mark a recommended show as watched (without first adding each title to your Watch List and then marking it there) nor can you express that you have no interest in a title.

Even if increased personalization is on the horizon, there’s no reason to expect that to work as well as the personalization offered in each streaming app’s own recommendation systems. Such a thing requires developer participation and cooperation with Apple.

Speaking of developer participation…

Just keep adding single sign-ons until one of them works

The 10th anniversary of Single Sign-On is next year, so we’ll be celebrating this latest attempt a little early. That first attempt used a convoluted system that recognized your cable provider and authenticated all the individual apps that worked with your existing subscriptions, so you wouldn’t have to sign in to each one. Just 18 months later, Apple announced zero sign-on, where if you were on a qualifying provider’s internet network, the apps would authenticate on their own.

It’s safe to say that these systems almost immediately became obsolete because they were centered on a business relationship between customers and service providers that was in rapid decline. Apple’s blind spot here was believing that anything not subscribed to via a cable provider would be subscribed to via Apple. Due to Apple’s App Store policies on subscriptions, many streamers have left the App Store behind. That means people have to do little sign-on dances that make using Apple products as frustrating as cheap streaming hardware.

Instead of repairing its relationships with streamers, Apple is providing this latest sign-on feature, which links accounts via your Apple Account email address… but requires streamers to want to implement it. I hope they do, and I hope it works to make everyone happier.

Sing out loud, sing out long

I find myself scratching my head at the announcement regarding using iPhones as microphones to do Apple TV-mediated karaoke.

Look, this feature won’t hurt me, or cause harm to the world—with the possible exception of those within earshot—but it’s such a niche thing to do. I have to imagine that someone took a look at the collection of technologies that Apple had built and realized they could put them together, you know, for fun!

I hope people who use this feature do have fun. But it’s a strangely specific thing to use as a selling point, when there are other use cases for the Apple TV, such as watching television, that might be better places to focus.

Give me more

I want tvOS to improve, and I’m frustrated when another WWDC comes along and the changes are as minor as they were this year. I hold out some hope that there’s more to announce, and that it’s being held back for a new Apple TV hardware announcement. But for now, we’ve got tvOS 26… and it cuts down on information density and makes timelines see-through.

tvOS needs to sort out the dichotomy between the home screen and the TV app. The current TV app is a mess and needs to be upgraded to support features that Apple has never taken a single pass at, like a universal live guide. I don’t expect them to be perfect, but it would be nice if we could see that Apple is making an effort. Change is long overdue for a platform that many take for granted. Apple needs to try harder at the TV part of tvOS.

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


Our interest in drones and how we’d use them, household standards for wearing Bluetooth headphones, whether we install OS betas and on which devices, and the rare app or service where we don’t mind seeing ads.


Federico, Federighi, and iPadOS 26

Federico Viticci of MacStories got to sit down with Apple’s Craig Federighi to discuss iPadOS last week, which is a tremendous thing to see, and you can read all about it:

After all the talk over the past 15 years about the “post-PC era”, why have we come full circle to reusing features and UI metaphors that the Mac got right decades ago? I ask Federighi about this. “When you’re designing in a new space with a new set of constraints with a different kind of user in mind, you do guard yourself against whether it would be too easy to just pull the old thing off the shelf and put it here because maybe that feels right, because we’ve lived with it since 1984”, he begins, acknowledging the Mac’s key role in democratizing graphical user interfaces and freeform windowing. “And you ask yourself”, Federighi continues “’Well, but what is the essence of iPad? And if that other world had never existed and one had designed from first principles for a touch-first device…what would a cursor be like? What would windowing be like?’”…

“At the same time, you have to not be allergic to learning from the past”, Federighi adds. “I think the balance we’ve landed on now is saying, ‘Listen, in the case that the right answer for iPad is a consistent one with another device, the Mac, then, of course, let’s use it. But let’s not reach for something on the Mac reflexively, just because it’s there’”.

The Federighi quotes are interesting and Viticci adds a lot of useful context, but don’t miss the fact that this article is also Viticci’s in-depth post-WWDC brain dump of his reaction to the massive changes in iPadOS 26. It’s basically two articles in one, either of which would be a must-read.


By Jason Snell

Apple keeps checking items off my Mac wishlist

Clipboard history in Spotlight on macOS Tahoe.

A couple of years ago, I recalled that in the early days of Mac OS X, I built up an entire array of utilities that allowed me to use my Mac just how I wanted it. I felt utterly naked on a Mac without LaunchBar, for example. But in the intervening two decades since OS X’s early days, Apple has just kept improving the base features of macOS to the point where most of my old “must-have Mac utilities” had become ones I kept around more out of habit than necessity. And in some cases, I’ve stopped using old favorites entirely because Apple’s built-in tools did the job. That’s good, because a new Mac user shouldn’t need to install a half-dozen utilities to get down to being productive.

WWDC 2025 has made me revisit this same subject, because it turns out that the two biggest limitations of default macOS productivity that I saw back then are both addressed in macOS 26:

Many apps can act as clipboard managers—I’ve been using the one in LaunchBar for years, and Pastebot is a popular favorite—and once you use a clipboard manager, it’s hard to go back to Apple’s concept, unchanged in nearly 40 years, that there’s a single clipboard and once you copy something new, the old clipboard is gone forever. I now reflexively copy multiple items in one app and then paste those items into a different app rather than doing the old back-and-forth. I rely on the clipboard history to dig out an item from half an hour ago without having to look it up again…

Another area of interest is file management and automation. I recently wrote about how Folder Actions is somehow still a thing in macOS. Think about offering users the ability to select a folder in Finder or Files and build actions that would occur when those folders changed. Folder Actions enabled some of that, and utilities like Hazel have taken it to the extreme. Sure, power users can run wild with features like this, but I think regular users might appreciate being able to say, “When a file in this folder is older than 60 days, file it away somewhere else,” or “Delete all the disk image files in my downloads folder older than 60 days.” There’s something there.

In macOS 26, there’s a built-in clipboard manager that can be accessed from the Spotlight interface, and a new set of Shortcuts triggers let you run automations when events occur on your Mac or at specific intervals.

I’m sure there are still corners of macOS that could benefit from new features from Apple—there are always new frontiers—but I’m struck by the fact that two of the most glaring areas for improvement have been directly addressed in macOS 26 Tahoe. I can’t wait to spend more time with it in beta this summer.


by Jason Snell

Apple’s built-in transcriber blows away Whisper

John Voorhees of MacStories took Apple’s new Speech framework, available to all developers, for a spin in the macOS 26 beta and got great results in making audio transcripts:

It’s still early days for these technologies, but I’m here to tell you that their speed alone is a game changer for anyone who uses voice transcription to create text from lectures, podcasts, YouTube videos, and more. That’s something I do multiple times every week for AppStories, NPC, and Unwind, generating transcripts that I upload to YouTube because the site’s built-in transcription isn’t very good.

I’ve been using OpenAI’s open-source Whisper system (mostly whisper.cpp) for a couple of years, and while it seems to be more accurate than Apple’s model, it’s also half the speed of the large-v3-turbo model I’ve defaulted to lately.

It’s great to see that Apple is in this game, and even better, that it’s handing the power of this model to app developers so they can build speech-to-text transcription features directly into their apps.


Zaz splits WBD, the neverending Paramount sale, listener letters, and TV picks! [Downstream+ subscribers also get: Apple TV stasis, Clooney on CNN, and the Great Netflix User Migration.]


The Summer of Fun begins with loads of WWDC follow-up, including some big-picture reflections on last week and a discussion of some of our favorite features in Apple’s beta OS releases.


By Glenn Fleishman

Reducing suboptimal iCloud Photos storage sizes

Glenn Fleishman, art by Shafer Brown

iCloud Photos requires mysterious files and processes and often consumes huge amounts of storage space. Six Colors subscriber John writes in with a question about one aspect of this:

I have photos set to optimize storage, and it’s currently (according to DaisyDisk) using 60GB of my 500GB MacBook Air.…But mediaanalysisd is also using 60GB – I understand that’s one of the processes that run when idle, but should it be using that much space, and can I restrict it?

The tl;dr answer is: make sure you’re running macOS Sequoia 15.3 or later. A bug in 15.2 apparently caused this problem for many people. If you’re already on 15.3 or later, instructions for deleting the cache (and the consequences of doing so) appear at the end of this article.

Let’s break down how iCloud Photos manages your storage, particularly when you don’t want to store full-resolution images and videos on your Mac.

All those moments will be lost in time, like tears in rain

Photos stores nearly all its data inside a macOS package named Photos Library by default, with a .photoslibrary extension. This package contains a number of folders that allow Photos to perform tasks like retaining your original image and recording modifications that can be reverted later. This structure isn’t designed for humans, but as an efficient way for the app to manage, display, search, and organize media.

If you enable iCloud Photos in Photos > Settings > iCloud, you have two choices presented:

  • Download Originals to this Mac
  • Optimize Mac storage
Screenshot of Photos iCloud preferences in macOS
You can choose whether to store full-resolution or optimized media on your Mac with Photos iCloud settings.

(Please ignore Apple’s capitalization. It drives me bonkers.)

I have advocated for a long time for people to devote enough storage on their Mac—or one of their Macs if they have multiple—to store the entire downloaded Photos library. Because iCloud storage of Photos is a black box, there’s no good way to interact with your files backed up there. Without owning a full-resolution local copy that you can archive and update via Time Machine, another local backup option, or cloud backup (and preferably a combination of those), you could find yourself reliant on the iCloud copy.

While Apple has been rock solid with iCloud storage for years, relying on it carries other risks: you could be locked out of your account for a reason Apple won’t disclose. Or you could suffer a catastrophic set of system failures or equipment loss—such as in a fire or natural disaster—or even a lapse in password and security code record keeping that leaves you unable to prove ownership. And if someone hijacked your account, they could also delete your media using iCloud.com or through a locally synced copy, although that’s less likely.

With my warnings noted, optimizing iCloud Photos storage can be quite effective when you have an enormous library relative to your local storage. I have a 2 TB SSD attached to a Mac Studio to keep my 800 GB Photos Library from filling the internal 1 TB drive. However, I have optimization enabled on my MacBook Pro, because that laptop’s 1 TB internal drive means I lack the storage to keep the whole thing. Question-asker John’s library occupies about 1 TB on iCloud.

Let’s dig into optimization.

Uncontrollable purging

Photos optimizes storage by retaining a thumbnail and other metadata about an image or video but dumping the full-resolution media file, which is retained in iCloud. If you double-click, edit, export, or otherwise preview an image or video, the file is downloaded, offering a circular progress completion graphic in the lower-right corner to let you know something is happening. On sufficiently fast networks, you rarely have to wait except for large videos.

Apple uses optimization in a few places, including iCloud Drive (System Settings > iCloud > iCloud Drive), and uses the same philosophy in each case. macOS has some background monitoring to avoid your drive filling up to 100% and rendering it unusable. (This doesn’t always work.)

When some unspecified threshold is reached, various background daemons that are set to optimize can kick in. With iCloud Drive, the least-used files are dumped from local storage first.

With iCloud Photos, however, Apple appears to be extremely aggressive. I can’t think of a single case with optimization enabled in which the optimized Photos Library took up more than 10% to 20% of the full library size. On the above-mentioned MacBook Pro, my Photos Library is under 40 GB of local storage.

Screenshot of Manage Storage from macOS System Settings showing a bar representing in green storage in use and individual apps listed below, including Photos.
You can see the storage that iCloud says your synced Photos Library takes up as part of your iCloud or iCloud+ storage.

(Photos and iCloud are often very erratic about reporting storage and quantities of media. For instance, while the Photos Library is about 800 GB with full-resolution downloads configured on my Mac Studio, iCloud reports it takes up 670 GB online. I’m unclear what extra material makes up a 130 GB difference.)

Having that data in mind, we can circle back to John’s primary question—I am a master of expositional sidetracking—about why the daemon mediaanalysisd and its associated storage are sucking up so much data.

The mediaanalysisd agent’s job is to process images in the background during low-activity periods to perform facial (person and pet) and object recognition. (Apple notes, “Face recognition and scene and object detection are done completely on your device rather than in the cloud.”)

You can see where this daemon stores its thinking by going to the Finder, choosing Go > Go to Folder, and entering:
~/Library/Containers/com.apple.mediaanalysisd/Data/Library/Caches/com.apple.mediaanalysisd
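If you prefer Terminal, here’s a quick sketch for checking the size of that folder. The path is the one given above; `cache_size` is just an illustrative helper name, not an Apple tool.

```shell
# Report how much space the mediaanalysisd cache is using.
# cache_size is an illustrative helper; pass a different path to
# check another directory.
cache_size() {
  local dir="${1:-$HOME/Library/Containers/com.apple.mediaanalysisd/Data/Library/Caches/com.apple.mediaanalysisd}"
  # du -s: one summary line for the whole folder; -h: human-readable sizes
  du -sh "$dir" 2>/dev/null || echo "no cache found at $dir"
}

cache_size
```

On a Mac where the cache doesn’t exist (or hasn’t been created yet), the helper prints a note instead of erroring out.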

On my full-resolution-storing Mac Studio, the folder there takes up 205 MB; on the MacBook Pro with optimization on, a bit more at 386 MB. As I mentioned at the outset, people have reported for a few releases that this folder can grow uncontrollably, but Sequoia 15.3 and later appear to have stanched that.

You cannot turn this agent off without going through some command-line hoops, many of which are reversed when you install the next macOS update. However, because this is a cache, you can opt to delete it if the storage is causing problems.

Causing amnesia through file deletion

If your com.apple.mediaanalysisd folder is bulging like a failed lithium-ion battery, here’s what you can do:

  1. Quit Photos.
  2. Drag com.apple.mediaanalysisd to the Trash.
  3. Empty the Trash.
  4. Restart your Mac.
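For Terminal users, steps 1 and 2 can be sketched as shell commands. This assumes the default container path; `trash_cache` is an illustrative helper name, not an Apple tool, and you’d still empty the Trash and restart from the Finder and Apple menu.

```shell
# Move the mediaanalysisd cache folder to the Trash, mirroring
# steps 1 and 2 above. trash_cache is an illustrative helper.
trash_cache() {
  local cache="${1:-$HOME/Library/Containers/com.apple.mediaanalysisd/Data/Library/Caches/com.apple.mediaanalysisd}"
  local trash="${2:-$HOME/.Trash}"
  if [ ! -d "$cache" ]; then
    echo "no cache at $cache"
    return 1
  fi
  # Quit Photos first so it isn't writing to the cache mid-move.
  osascript -e 'quit app "Photos"' 2>/dev/null || true
  mv "$cache" "$trash/"
}
```

Calling `trash_cache` with no arguments targets the default location; afterwards, empty the Trash and restart your Mac (steps 3 and 4).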

Deletion may cause Photos to start over with analyzing people, pets, and things. It might even result in the folder swelling back to its original size! But given the reports online from people going back to Ventura, and particularly from those who hit the bug before upgrading to macOS 15.3, this is the best path forward.

For further reading

Our very own chief, Jason Snell, has the definitive book on the topic, Take Control of Photos, where you can find oodles of information and step-by-step instructions on working with the enormous beast that is Photos.

[Got a question for the column? You can email glenn@sixcolors.com or use /glenn in our subscriber-only Discord community.]

[Glenn Fleishman is a printing and comics historian, Jeopardy champion, and serial Kickstarterer. His latest books are Six Centuries of Type & Printing (Aperiodical LLC) and How Comics Are Made (Andrews McMeel Publishing).]


By Dan Moren for Tom's Guide

Apple’s Shortcuts app is getting a huge upgrade in iOS 26 and macOS 26 — here’s how it will help you

You may not know it, but there’s an app built right into your iPhone, iPad, and Mac that can supercharge your experience using all those products — and it’s about to get even more powerful.

That app is Shortcuts, and it lets you automate tasks on your devices, even across apps. You can have it do everything from a simple job of creating a GIF of a Live Photo to a complex system that lets you annotate a podcast as you’re recording it, and way, way more.

You build these workflows in the Shortcuts app by selecting actions and stringing them together: taking an image from, say, the clipboard, having the system scan it for any text, and then overlaying that text on another image you specify in order to create a meme. With this year’s updates to Apple’s platforms previewed at WWDC 2025, Shortcuts is able to do even more, including leveraging Apple Intelligence features and automating actions on the Mac.

That will make this automation tool even more versatile and able to handle more complex tasks than ever before—even if it still has some flaws that could stand addressing.

Continue reading on Tom's Guide ↦


By John Moltz

This Week in Apple: Let’s talk about something else

John Moltz and his conspiracy board. Art by Shafer Brown.

Apple gives them something to talk about, enhanced Siri is definitely maybe coming, and congratulations sickos, you can now look through windows on the iPad.

Your ass is Liquid Glass

Congratulations! We have something new to argue about other than the intersection of politics and tech or AI or App Store rules or all of the other things we’re so very tired of arguing about! Yes, almost as if the company wanted to change the subject, Apple announced a new look and feel to all its operating systems.

Turns out, some people hate it! And the beta is so buggy!

Yeah, that’s kinda how early betas work.

There is undeniably a mess of messy UI problems right now with Liquid Glass, but as someone who has been to more than one rodeo (two, I’ll have you know), I can give you the 411 on what’s going to happen: many of them will get fixed before iOS 26 ships and some… will not.…

This is a post limited to Six Colors members.


iPad podcasting and glassy design

This is surely one of the first podcasts recorded entirely using the local recording feature in iPadOS 26. We discuss new OS stuff and design stuff. Big week!




Apple’s newest OSes will support passkey import/export standard

Ars Technica’s Dan Goodin:

The import/export feature, which Apple demonstrated at this week’s Worldwide Developers Conference, will be available in the next major releases of iOS, macOS, iPadOS, and visionOS. It aims to solve one of the biggest shortcomings of passkeys as they have existed to date.

Yep, I’m back on the passkey beat! This is Apple’s implementation of the standard developed by the FIDO Alliance, which handles the specification for passkeys. The goal is to create a system more secure than just outputting a plaintext file full of your sensitive cryptographic keys, and to allow easy migration between password managers.

Passkey Export in macOS Tahoe
Nowhere to go.

On the macOS Tahoe beta running on my MacBook Air, I can start the export process in Passwords, which requires first re-authenticating with Touch ID. You can choose to export either a single item or all your items; in the latter case, you can’t export accounts created with Sign in with Apple or those that were shared to a group by someone else, and exporting will not delete the items from Passwords itself.1

In order to complete the export, you need to select an app to send it to, but as most password managers have not yet implemented the standard, I don’t currently have any options available. 1Password said last fall that it intends to adopt the standard; it’ll likely be joined by other apps, and I wouldn’t be surprised if support rolls out more broadly right around the time macOS Tahoe is released this fall.


  1. Standard passwords can of course still be exported as a file, though the app warns you that they’ll be unencrypted. 

By Joe Rosensteel

The new Spotlight for macOS 26 shows a path forward


On Thursday there was a Six Colors Zoom call for Backstage-level members and contributors alike. Glenn Fleishman asked Jason Snell and Dan Moren about Spotlight. He wondered about the discoverability and the intuitiveness of some of these features. Jason mentioned that Apple views the features as power user features that don’t get in the way if you don’t know what they are. Dan said it would still be nice to have documentation of all the features, because otherwise it’s difficult to know exactly what all the commands are.

I piped in with my view that the real missing piece is natural language processing so people aren’t trying to discover commands or read documentation. We still need those other things, but to make this truly accessible we can’t expect everyone to memorize all the Quick Keys.

In March I wrote an opinion piece for Six Colors lamenting how text-to-Siri pales in comparison to typing a web search into your browser. I also compared text-to-Siri to Spotlight, which handles searching better but can’t process natural-language requests. What I wrote in March is much broader in scope and encompasses requests like product knowledge.

Apple still isn’t doing any of that right now, but with App Intents and Quick Keys in Spotlight it’s creating the explicit command syntax that could be fed by something interpreting a natural language request.

Think of it like this: this year they’re writing grep, sed, pine, ffmpeg, etc. for Spotlight. A common issue for people is not knowing how to structure commands and turning to the web, and LLMs, to copy and paste arguments and flags for those powerful tools. They’re more accessible when people don’t have to figure out the flags and arguments themselves, but the explicit commands you pass them are still the foundation for what’s doing the actual file operations.

Jason said on the call that he thinks this missing puzzle piece might arrive as early as next year, and that it seems like the next logical step. It certainly seems more achievable with a foundation like this laid.

Hopefully bozos like me aren’t writing blog posts in two years asking where it is while we ask LLMs to compose our Spotlight queries for us. I’m thinking positive thoughts, though.

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


By Jason Snell

Apple gets over its hang-ups, and the iPad enters a new era

Is that a Mac? Nope, it’s iPadOS 26 running on a Studio Display.

Some of us have spent an awful lot of time pondering the iPad’s use cases as a professional productivity device. As a heavy user of the iPad, I’ve frequently wanted to push it into areas where it wasn’t designed to go because if I could get it to do what I wanted, it would fit in my life better than going back to the Mac.

As I pushed, the iPad pushed back. In recent years, I’ve come to accept that most of the time I travel, I have to bring both my iPad and my MacBook. I’m not alone in having reached the stage of acceptance.

But a funny thing happened this week: Apple seems to have changed direction, again, when it comes to more advanced uses of the iPad. In the early days, the iPad was clearly being groomed as the future of computing. In the middle ages, after Apple seemed to accept that the Mac wasn’t going to be eclipsed by the iPad, there seemingly remained a fear of letting the iPad come too close to acting like a Mac.

We are in a new era now. Today’s Apple is not afraid to let the iPad run Mac-like windows, complete with stoplight buttons and Exposé. In Cupertino this week, I got the strong sense that whatever dogma there was about not letting the iPad feel Mac-like has dropped away, replaced with an acceptance that the Mac is pretty great at a lot of things—and if the iPad is also great when it does those things, it should just do those things. It’s like a weight has been lifted from the soul of the iPad.

This is good news for advanced users who want to push the iPad to its fullest, of course. But it’s also a move that benefits Apple directly, because—if you haven’t noticed—the company is continually shipping very expensive iPad Pros powered by some incredible hardware, only for the reviews to keep mentioning that the hardware is let down by the less accomplished iPad software. I predict a little less kvetching about iPadOS when the next pricey iPad Pro model rolls around.

Checking the boxes

I’ve been writing about the iPad Pro since it arrived in 2015. I was about a year into doing Six Colors and podcasts as a living, and I was really intrigued by the idea of changing my productivity and breaking out of the laptop box. I wrote stories on my iPad. I edited podcasts on my iPad. I traveled with only my iPad.

But along the way, I built up a huge list of complaints about all the things that the iPad just couldn’t do, things that got in the way of me using it the way I wanted to. The latest version of that list, going into this week, was this:

  • Can’t record local microphone audio while on a VoIP call
  • Awkward multitasking and windowing
  • Limited support for global keyboard shortcuts
  • Limited support for tasks running in the background
  • No clipboard manager
  • A Files interface that’s clunky for working with, well, files

I can’t say that Apple checked all the boxes, but after this week, I feel a lot more confident that the remaining ones may be checked in the near future.

I’m going to get the podcasting thing out of the way first. It’s such a niche need, but it’s a huge blocker from a workflow standpoint: you’re recording your podcast or video via a third-party app, but you also want to record your local audio and video at full quality, rather than relying on the compressed versions that get sent over the Internet. It’s easy to do on a Mac, but impossible on an iPad… until now. (For the record, this feature also works on iOS 26, which means that podcasters could actually get by with just an iPhone and a USB microphone!)

iPad users will be able to opt for complexity or simplicity.

Next, multitasking and windowing. In earlier eras of the iPad, Apple reluctantly accepted multitasking by introducing Split View and Slide Over, and later Stage Manager, which created a windowing system that was not Mac-like at all. Windows couldn’t be freely resized or placed, and couldn’t overlap in arbitrary ways. But at some point, Apple decided to throw out that entire system and build a new one that’s unabashedly inspired by the Mac. In iPadOS 26, you can resize windows arbitrarily, put them anywhere, and manage them using the familiar stoplight buttons in the top left corner. (It even supports keyboard shortcuts: Globe-F toggles full screen, and Globe-Shift-Left Arrow sends a window to the left half of the screen.)

Related: The iPad has a Menu Bar now! Apple has been creeping toward this for four years, since iPadOS 15, but it’s finally here. And you know what? Within an hour of using the iPadOS 26 developer beta, I found myself wondering how to perform an action in an app—and realized I could just look in the Menu Bar. The Menu Bar is one of the great innovations of the Mac, offering an ordered way to browse an app’s functionality and discover keyboard shortcuts, so why should the iPad be denied it just because it originated on the Mac? (And yes, Command-Shift-Question Mark will let you search the menus.)

It’s kind of hard to believe that it’s been two years since Final Cut Pro for iPad shipped, answering once and for all the question “why are Apple’s biggest pro media apps not on the iPad Pro?” Unfortunately, it also showed just how far behind the iPad was: Once you kicked off a video export, you had to sit there and watch the progress bar, because leaving the app would cause the export to fail. Again, iPadOS 26 to the rescue: There’s now a Live Activities-based interface for background tasks (available for all user-initiated tasks with clear end states, such as exports, renders, and file copies) that actually does the Mac one better by coalescing all ongoing activity in one place. I should be able to leave Final Cut or Logic or Ferrite and move on to something else while the export runs in the background, just like on my Mac.

There are also enormous improvements in the Files app, where the list view now features customizable columns and folders that expand via disclosure controls. You can also control which app opens a file and, yes, even assign a default opening app, something Mac users take for granted but that was never available in Files before.

There’s still no clipboard manager or support for global keyboard shortcuts, but even there, I’m optimistic. If macOS can gain a clipboard manager after 41 years via upgrades to Spotlight, it’s easy to imagine iPadOS getting similar functionality next year. That Spotlight upgrade in macOS also features a bunch of other power-user productivity boosts that would work well on the iPad, adding keyboard-based control that might make my desire for global keyboard shortcuts less pressing.

As a fan of the original iPad pointer, I’m sad to report that it’s been replaced by a new, Mac-inspired one. The reason the old one died is a pretty good one: it was meant to represent the touch target of iPad software designed for fingers, and Apple is now accepting that sometimes pro users want more precise pointer control than that. (Also, those new stoplight buttons are smaller than the old pointer circle!) I’ll miss the morphing cursor because I think it might’ve been the strongest example of the iPad rethinking and outdoing an old Mac idea, but the new pointer fits like a comfortable old shoe.

Easy or expert?

One of Apple’s greatest challenges is its own success. It’s got millions of users across a wide spectrum of demographics, geographies, and levels of expertise. How do you create a single product that can be what it needs to be for all of them? This can lead to discoverability problems for new features, overly complex interfaces for novices, and frustratingly simplified features for experts.

The iPad is the device where this struggle has been out in the open, though I’d argue it affects the iPhone and Mac just as much. On the iPad, though, the divide is pretty stark: A lot of people really never want to do anything but use one app at a time. They never tax the processor. They don’t connect peripherals, even Apple-built ones. How do you give the people who want more what they want, without wrecking the experience for the much larger group who like it simple?

Apple’s taking another cut at this, and it seems to me that by following the Mac’s lead, they’re setting the iPad up for success. Nobody, not even power users like me, wants to see the simplicity of the basic iPad experience degraded in any way. I think they’ve done a pretty good job of adding pro features without breaking it for everyone else. We’ll see how it goes over the summer and into the fall.


by Jason Snell

20 years of ‘Stay Hungry, Stay Foolish’

The Steve Jobs Archive has remastered his classic Stanford commencement address for its 20th anniversary:

It’s not an obvious candidate for a classic. A commencement address by a college dropout. A talk aimed at 22-year-olds that warns “You will gradually become the old and be cleared away.” A text as shadowed by reality as soaring with inspiration: “Your time is limited, so don’t waste it living someone else’s life.”

There is also a remarkable set of emails that Jobs sent himself with notes about what he wanted to say in his speech, and a lot of details (like his nerves, and Apple PR’s attempts to suggest things for him to say) that I hadn’t heard before.

It remains one of the most remarkable speeches you will ever hear. And certainly one of the best things to ever happen at Stanford Stadium.


by Jason Snell

Canal+ to debut Immersive doc produced with new Blackmagic camera

French media company Canal+ announced this week that it’s working on a documentary about French motorcycle racer Johann Zarco at the French Grand Prix, which would probably not be notable for most people were it not for how it was made:

Produced in collaboration with Apple and MotoGP, a competition organized by Dorna, this new documentary event is the first Apple Immersive Video production filmed entirely with the new Blackmagic URSA Cine Immersive camera. CANAL+ will be the first global studio to publish content in this exciting new storytelling format for Vision Pro.

MacBreak Weekly listeners/viewers will know that we’ve been discussing the potential for Immersive Video on the Vision Pro to expand rapidly once non-Apple filmmakers have access to Blackmagic’s new Immersive camera. This documentary is the first, but hopefully it is the beginning of a bigger trend.

I had a chance to watch the immersive teaser for this documentary, and it looked great, with a quality very much in line with other Immersive videos on the Vision Pro. I don’t know if I care a lot about the French Grand Prix, but I am a bit of a sucker for a good sports doc—and an Immersive one? Sign me up.

Canal+ says the documentary will be available in September.
