Each image taken by Rubin’s camera consists of 3.2 billion pixels that may contain previously undiscovered asteroids, dwarf planets, supernovas and galaxies. And each pixel records one of 65,536 shades of gray. That’s 6.4 billion bytes of information in just one picture…. Rubin will capture about 1,000 images each night.
Although Rubin will take a thousand images a night, those are not what will be sent out into the world at first. Rather, the computers at SLAC will create small snapshots of what has changed compared with what the telescope saw previously… Just one image will contain about 10,000 highlighted changes. An alert will be generated for each change — some 10 million alerts a night.
Storing, transmitting, and disseminating that much data leads to some interesting problems, like having enough storage onsite in case of outages, stringing fiber-optic cable across the Atacama Desert, and processing the images to provide manageable data for astronomers to access remotely.
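For a sense of scale, the numbers in the quoted passage can be sanity-checked with a few lines of Python (raw pixel data only; this ignores compression, calibration frames, and metadata):

```python
# Back-of-envelope math for one night of Rubin observations,
# using only the figures quoted above.
PIXELS_PER_IMAGE = 3_200_000_000  # 3.2 billion pixels
SHADES_OF_GRAY = 65_536           # 2**16 values -> 16 bits per pixel

# Bits needed to represent values 0..65535, converted to bytes.
BYTES_PER_PIXEL = (SHADES_OF_GRAY - 1).bit_length() // 8  # 16 bits -> 2 bytes

BYTES_PER_IMAGE = PIXELS_PER_IMAGE * BYTES_PER_PIXEL      # 6.4 billion bytes
IMAGES_PER_NIGHT = 1_000
bytes_per_night = BYTES_PER_IMAGE * IMAGES_PER_NIGHT

print(f"{BYTES_PER_IMAGE / 1e9:.1f} GB per image")    # 6.4 GB per image
print(f"{bytes_per_night / 1e12:.1f} TB per night")   # 6.4 TB per night
```

That raw 6.4 TB a night is before any of the alert snapshots and derived products are generated, which is exactly why the onsite storage and fiber runs become interesting problems.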
Last week’s Six Colors podcast was recorded entirely on iPads running iPadOS 26, mine in California and Dan’s in Massachusetts. The podcast is usually just for Six Colors members, but you can listen to it here if you want.
You’ll be disappointed if you expect to hear anything special about it, though. We both recorded it on our usual Shure MV7 USB microphones, and it just doesn’t sound any different at all. (For the full iPad extravaganza, I should’ve edited it in Ferrite on my iPad, but for expediency’s sake, I didn’t at the time. I’ve since done that just for kicks, and that’s the image at the top of this story.)
It’s probably worth explaining why this feature has so many podcasters and other creators in a bit of a tizzy. Many podcasts record remotely, with people all over the world, and they usually use some sort of app to have that real-time conversation. It was Skype back in the day, and these days it’s often Zoom or a web-based recording program like Riverside. Because those apps prioritize real-time delivery over fidelity, the quality is frequently bad by necessity.
To ensure that the very best audio and video is used in the final product, we tend to use a technique called a “multi-ender.” In addition to the lower-quality call that’s going on, we all record ourselves on our local device at full quality, and upload those files when we’re done. The result is a final product that isn’t plagued by the dropouts and other quirks of the call itself. I’ve had podcasts where one of my panelists was connected to us via a plain old phone line—but they recorded themselves locally and the finished product sounded completely pristine.
The problem has been iPadOS and iOS, which won’t let you run a videoconferencing app and simultaneously run a second app to capture your microphone and video locally. One app at a time is the rule, especially when it comes to using cameras and microphones. Individual iPhone and iPad videoconferencing apps can choose to build in local-recording features if they want, but in practice… they just don’t.
Apple has solved this in an interesting way. What it’s not doing is allowing multiple apps access to the microphone (so far as I can tell, I just tried it and the moment I started a FaceTime call, my local recording app stopped). Instead, Apple has just built in a system feature, found in Control Center, that will capture local audio and video when you’re on a call. It doesn’t work when another app is not currently using the microphone and camera, so it can’t be set to surreptitiously record stuff, and it displays a recording symbol at the top of the screen when it’s running. When you’re done, you can tap that symbol and it’ll save the file to the Files app.
The file it saves is marked as an mp4 file, but it’s really a container featuring two separate content streams: full-quality video saved in HEVC (H.265) format¹, and lossless audio in the FLAC² compression format. Regardless, I haven’t run into a single format conversion issue. My audio-sync automations on my Mac accept the file just fine, and Ferrite had no problem importing it, either. (The only quirk was that it captured audio at 24-bit, 48kHz and I generally work at 16-bit, 44.1kHz. I have no idea if that’s because of my microphone or because of the iPad, but it doesn’t really matter since converting sample rates and dithering bit depths is easy.)
Even in Developer Beta 1, this feature is pretty solid. What’s missing is a better preview of the audio levels and the ability to adjust audio gain, since different microphones have different gain levels and not all of them are easily adjustable. Beyond that, though, this feature is a winner. Podcasters should be rejoicing—I know I am.
¹ Since we had video off for the Six Colors podcast, the video track was blank and took up no space.
² Why FLAC and not Apple’s own lossless format? My guess is that it’s being done for compatibility and simplicity reasons.
My thanks to Turbulence Forecast for sponsoring Six Colors this week. Whether you want to keep your nerves in check, are flying with kids, or even want to know the best time to get out of your airline seat for a dash to the bathroom, Turbulence Forecast is the easiest way to know in advance just how smooth or bumpy your next flight is going to be.
This is a weather app of a different kind, charting your flight routes and sending you a detailed PDF with multiple maps, with updates pushed right to your phone. You’ll see exactly where it might get bumpy and where it should stay nice and smooth, up to five days ahead, with regular updates as the weather changes.
Turbulence Forecast can see out as far as five days, making it easier to pick a calmer day if your plans are flexible. They’re even tracking their forecast stats so you can see how steady their predictions have been over time.
And if you want a personal touch? Turbulence Forecast offers a Handcrafted Forecast — where the founder, who’s been doing this for 20 years, personally looks at your flight, answers your questions, and even suggests smoother options if you’ve got some wiggle room. (Bonus, he’s a huge Six Colors fan.)
This year, John Voorhees and I returned to the scene of the crime—the place where we got a demo in 2024 of Swift Assist, a feature that never shipped, even though we could’ve sworn we saw it demoed live—to see the updated Xcode with AI assistance. Same room, same people, but this time the feature wasn’t just promised, it was shipping in Developer Beta 1.
More to the point, as John writes on MacStories, Apple had entirely rearchitected the tool so that developers can use any AI system they want and update to new models as they become available:
I’m not a developer, so I’m not going to review Swift Assist (a name that is conspicuously absent from Apple’s developer tool press release, by the way), but the changes are so substantial that the feature I was shown this year hardly resembles what I saw in 2024. Unlike last year’s demo, this version can revise multiple project files and includes support for multiple large language models, including OpenAI’s ChatGPT, which has been tuned to work with Swift and Xcode. Getting started with ChatGPT doesn’t require an OpenAI account, but developers can choose to use their account credentials from OpenAI or another provider, like Anthropic. Swift Assist also supports local model integration. If your chosen AI model takes you down a dead end, code changes can be rolled back incrementally at any time.
This is perhaps the best sign that Apple’s attitude toward AI and Apple’s role in the world has changed dramatically since 2024. If, in late 2025 or early 2026, a new coding model becomes all the rage with developers, Xcode will be able to use that model. That’s a big step forward for Apple.
Another Worldwide Developers Conference is in the books, and after a week of keynotes, briefings, and travel, I’ve finally had a chance to sit and zoom out to the 35,000-foot view of the company’s latest announcements.
The Apple of 2025 has definitely learned some lessons.
In hindsight, last year’s event seems even rockier, with the company hustling to unveil Apple Intelligence, including showing off features that still have yet to ship. To its credit, it avoided doubling down on those mistakes with this year’s announcements without fully repudiating its previous steps. Instead, the company went back to focusing on the assets that make it the best at what it does. In other words, the ones that let Apple be Apple.
They should have given it a theatrical run.
Apple has largely tied major revisions of tvOS to the launch of new Apple TV hardware over the years. Since the introduction of Apple TV+, WWDC’s tvOS “features” have largely focused on showcasing sizzle reels of Apple TV+ shows, and very little about tvOS itself. This WWDC gave us a trickle of announcements that don’t seem to align with what I would consider to be the rough spots in the tvOS user experience.
It is possible that Apple is holding back meaningful revisions until they launch an updated Apple TV box this fall. Maybe they’ll even mention the 10th anniversary of tvOS itself, which was unveiled in September of 2015 at the iPhone 6S launch event. Until then, I guess we should reflect on what was announced, instead of on wish lists of what could be.
Through a glass, darkly
I’m not going to rip into the design in beta 1. On tvOS, it’s mostly a conservative evolution of what came before, with highlights on edges. However, Apple has singled out one very specific part of the interface as working as intended, and that’s where I’ll push back.
Apple has two kinds of Liquid Glass (Regular and Clear) and Clear is supposed to be used over rich media, like video. The only things that define the existence of the controls are the highlights and brighter/blurry refractions visible through the clear elements.
Well, gee whiz, aren’t clear glass playback controls going to be difficult to see over video, especially when it’s playing through the controls?
To make the controls easier to discern, Apple applies a dimming layer on everything around the controls, but not on the video visible through the controls. It’s like someone stenciled out aftermarket window tinting.
This is bananas. How is this getting out of the way of the content? You can barely discern the playback timeline and playhead while motion is occurring through the element, which causes it to pulse in a thin strip. What is being achieved here? The playback controls and timeline should be flat. No one is going to feel sad that there’s no glass effect in this one spot, where it serves no practical or artistic purpose other than being a wicked smart shader demo.
Poster through it
Another notable change in the interface is the pivot from horizontal thumbnails to portrait-orientation posters. Apple says that this means more tiles can fit on the screen, but that’s only more tiles visible in one row, and it’s only one additional tile over the smallest scale thumbnails (6 posters instead of 5 thumbnails). The older design had thumbnails that matched the aspect ratio of the TV in various sizes so you’d get more rows with fewer titles visible on screen in each row.
To compensate for this difference in aspect ratio, the text that was below or next to the thumbnails is now on top of them. I’ll let readers debate which is more legible, and whether or not the text is always helpful.
tvOS 18.5 (left) versus tvOS 26 beta (right).
This decision pushes content downward. If you want to see what kind of category you’re in the mood for, you will do more scrolling down, which means it will take you longer to count the number of times the TV app recommends you watch “Stick.” Unless you really want to flip through one particular row of the interface one title faster, it’s not really an improvement.
Used any good profiles lately?
I’m unclear about Apple’s continued push to get developers to adopt its user profile system. It really doesn’t provide any benefit to the developers of these large streaming services, which need to have their own multi-platform profile systems with personalized content recommendations, and it doesn’t provide substantial benefit to households with shared viewing.
Someone had the forethought to include this button in beta 1.
I have no animosity towards user profile improvements whatsoever, and I do appreciate that on your first boot of tvOS 26 you can say you never want to see the profile switcher. However, system-level user profiles just don’t feel like the area of the TV viewing experience that needs this much attention when compared to other aspects.
If I were being generous, I could hypothesize that this emphasis on user profiles is because there will be some genuine effort put into personalizing the TV app based on the active user profile.
Unfortunately, you still can’t express any kind of preference in “personalized” areas of the interface to mark a recommended show as watched (without first adding each title to your Watch List and then marking it there) nor can you express that you have no interest in a title.
Even if increased personalization is on the horizon, there’s no reason to expect that to work as well as the personalization offered in each streaming app’s own recommendation systems. Such a thing requires developer participation and cooperation with Apple.
Speaking of developer participation…
Just keep adding single sign-ons until one of them works
The 10th anniversary of Single Sign-On is next year, so we’ll be celebrating this latest attempt a little early. That first attempt used a convoluted system that recognized your cable provider and authenticated all the individual apps that worked with your existing subscriptions, so you wouldn’t have to sign in to each one. Just 18 months later, Apple announced zero sign-on, where if you were on a qualifying provider’s internet network, the apps would authenticate on their own.
It’s safe to say that these systems almost immediately became obsolete because they were centered on a business relationship between customers and service providers that was in quick decline. Apple’s blind spot here was believing that anything not subscribed to via a cable provider would be subscribed to via Apple. Due to Apple’s App Store policies on subscriptions, many streamers have left the App Store behind. That means people have to do little sign-on dances that make using Apple products as frustrating as cheap streaming hardware.
Instead of repairing its relationships with streamers, Apple is providing this latest sign-on feature, which links accounts via your Apple Account email address… but it requires streamers to want to implement it. I hope they do, and I hope it works to make everyone happier.
Sing out loud, sing out long
I find myself scratching my head at the announcement about using iPhones as microphones for Apple TV-mediated karaoke.
Look, this feature won’t hurt me, or cause harm to the world—with the possible exception of those within earshot—but it’s such a niche thing to do. I have to imagine that someone took a look at the collection of technologies that Apple had built and realized they could put them together, you know, for fun!
I hope people who use this feature do have fun. But it’s a strangely specific thing to use as a selling point, when there are other use cases for the Apple TV, such as watching television, that might be better places to focus.
Give me more
I want tvOS to improve, and am frustrated when another WWDC comes along and the changes are as minor as they were this year. I hold out some hope that there’s more to announce, and that it’s being held back for a new Apple TV hardware announcement. But for now, we’ve got tvOS 26… and it cuts down on information density and makes see-through timelines.
tvOS needs to sort out the dichotomy between the home screen and the TV app. The current TV app is a mess and needs to be upgraded to support features that Apple has never taken a single pass at, like a universal live guide. I don’t expect them to be perfect, but it would be nice if we could see that Apple is making an effort. Change is long overdue for a platform that many take for granted. Apple needs to try harder at the TV part of tvOS.
[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]
Our interest in drones and how we’d use them, household standards for wearing Bluetooth headphones, whether we install OS betas and on which devices, and the rare app or service where we don’t mind seeing ads.
Federico Viticci of MacStories got to sit down with Apple’s Craig Federighi to discuss iPadOS last week, which is a tremendous thing to see, and you can read all about it:
After all the talk over the past 15 years about the “post-PC era”, why have we come full circle to reusing features and UI metaphors that the Mac got right decades ago? I ask Federighi about this. “When you’re designing in a new space with a new set of constraints with a different kind of user in mind, you do guard yourself against whether it would be too easy to just pull the old thing off the shelf and put it here because maybe that feels right, because we’ve lived with it since 1984”, he begins, acknowledging the Mac’s key role in democratizing graphical user interfaces and freeform windowing. “And you ask yourself”, Federighi continues “’Well, but what is the essence of iPad? And if that other world had never existed and one had designed from first principles for a touch-first device…what would a cursor be like? What would windowing be like?’”…
“At the same time, you have to not be allergic to learning from the past”, Federighi adds. “I think the balance we’ve landed on now is saying, ‘Listen, in the case that the right answer for iPad is a consistent one with another device, the Mac, then, of course, let’s use it. But let’s not reach for something on the Mac reflexively, just because it’s there’”.
The Federighi quotes are interesting and Viticci adds a lot of useful context, but don’t miss the fact that this article is also Viticci’s in-depth post-WWDC brain dump of his reaction to the massive changes in iPadOS 26. It’s basically two articles in one, either of which would be a must-read.
A couple of years ago, I recalled that in the early days of Mac OS X, I built up an entire array of utilities that allowed me to use my Mac just how I wanted it. I felt utterly naked on a Mac without LaunchBar, for example. But in the intervening two decades since OS X’s early days, Apple has just kept improving the base features of macOS to the point where most of my old “must-have Mac utilities” had become ones I kept around more out of habit than necessity. And in some cases, I’ve stopped using old favorites entirely because Apple’s built-in tools did the job. That’s good, because a new Mac user shouldn’t need to install a half-dozen utilities to get down to the business of being productive.
Many apps can act as clipboard managers—I’ve been using the one in LaunchBar for years, and Pastebot is a popular favorite—and once you use a clipboard manager, it’s hard to go back to Apple’s concept, unchanged in nearly 40 years, that there’s a single clipboard and once you copy something new, the old clipboard is gone forever. I now reflexively copy multiple items in one app and then paste those items into a different app rather than doing the old back-and-forth. I rely on the clipboard history to dig out an item from half an hour ago without having to look it up again…
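The concept is simple enough to sketch in a few lines of Python. This is just an illustration of the history idea, not how LaunchBar, Pastebot, or Apple’s new manager actually implement it:

```python
from collections import deque

class ClipboardHistory:
    """Keep the last `capacity` copied items instead of just one."""

    def __init__(self, capacity: int = 50):
        self._items = deque(maxlen=capacity)  # oldest entries fall off the end

    def copy(self, item: str) -> None:
        self._items.appendleft(item)

    def paste(self, steps_back: int = 0) -> str:
        """steps_back=0 is the most recent copy, 1 is the one before, etc."""
        return self._items[steps_back]

history = ClipboardHistory()
for text in ["first snippet", "second snippet", "third snippet"]:
    history.copy(text)

print(history.paste())              # "third snippet" -- the ordinary clipboard
print(history.paste(steps_back=2))  # "first snippet" -- still retrievable
```

The whole trick is that nothing is ever “gone forever” until it falls off the end of the history, which is what makes the back-and-forth copying dance unnecessary.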
Another area of interest is file management and automation. I recently wrote about how Folder Actions is somehow still a thing in macOS. Imagine offering users the ability to select a folder in the Finder or Files and build actions that occur when those folders change. Folder Actions enabled some of that, and utilities like Hazel have taken it to the extreme. Sure, power users can run wild with features like this, but I think regular users might appreciate being able to say, “When a file in this folder is older than 60 days, file it away somewhere else,” or “Delete all the disk image files in my downloads folder older than 60 days.” There’s something there.
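That kind of rule is easy to express in code, which is part of why it feels like table stakes for the system. Here’s a hedged sketch; the Downloads folder, `.dmg` suffix, and 60-day cutoff are just the article’s example, and the actual deletion step is left commented out:

```python
import time
from pathlib import Path

def stale_files(folder: Path, suffix: str, max_age_days: int) -> list[Path]:
    """Return files in `folder` with `suffix` not modified in `max_age_days` days."""
    cutoff = time.time() - max_age_days * 86_400  # 86,400 seconds in a day
    return [
        p for p in folder.glob(f"*{suffix}")
        if p.is_file() and p.stat().st_mtime < cutoff
    ]

# The article's "delete all the disk image files in my downloads folder
# older than 60 days" rule would then be (uncomment to actually delete):
# for p in stale_files(Path.home() / "Downloads", ".dmg", 60):
#     p.unlink()
```

A utility like Hazel layers a rule-builder UI over essentially this loop, plus a watcher that re-runs it when the folder changes.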
In macOS 26, there’s a built-in clipboard manager that can be accessed from the Spotlight interface, and a new set of Shortcuts triggers let you run automations when events occur on your Mac or at specific intervals.
I’m sure there are still corners of macOS that could benefit from new features from Apple—there are always new frontiers—but I’m struck by the fact that two of the most glaring areas for improvement have been directly addressed in macOS 26 Tahoe. I can’t wait to spend more time with it in beta this summer.
It’s still early days for these technologies, but I’m here to tell you that their speed alone is a game changer for anyone who uses voice transcription to create text from lectures, podcasts, YouTube videos, and more. That’s something I do multiple times every week for AppStories, NPC, and Unwind, generating transcripts that I upload to YouTube because the site’s built-in transcription isn’t very good.
I’ve been using OpenAI’s open-source Whisper system (mostly whisper.cpp) for a couple of years, and while it seems to be more accurate than Apple’s model, it’s also half the speed, even with the large-v3-turbo model I’ve defaulted to lately.
It’s great to see that Apple is in this game, and even better, that it’s handing the power of this model to app developers so they can build speech-to-text transcription features directly into their apps.
Zaz splits WBD, the neverending Paramount sale, listener letters, and TV picks! [Downstream+ subscribers also get: Apple TV stasis, Clooney on CNN, and the Great Netflix User Migration.]
The Summer of Fun begins with loads of WWDC follow-up, including some big-picture reflections on last week and a discussion of some of our favorite features in Apple’s beta OS releases.
iCloud Photos requires mysterious files and processes and often consumes huge amounts of storage space. Six Colors subscriber John writes in with a question about one aspect of this:
I have photos set to optimize storage, and it’s currently (according to DaisyDisk) using 60GB of my 500GB MacBook Air. … But mediaanalysisd is also using 60GB – I understand that’s one of the processes that run when idle, but should it be using that much space, and can I restrict it?
The tl;dr answer is: make sure you’re running macOS Sequoia 15.3 or later. A bug in 15.2 apparently caused this problem for many people. If you’re already on 15.3 or later, at the end of the article I explain how to delete the cache and what the consequences are.
Let’s break down how iCloud Photos manages your storage, particularly when you don’t want to store full-resolution images and videos on your Mac.
All those moments will be lost in time, like tears in rain
Photos stores nearly all its data inside a macOS package named Photos Library by default, with a .photoslibrary extension. This package contains a number of folders that allow Photos to perform tasks like retaining your original image and recording modifications that can be reverted later. This structure isn’t designed for humans, but as an efficient way for the app to manage, display, search, and organize media.
If you enable iCloud Photos in Photos > Settings > iCloud, you’re presented with two choices:
Download Originals to this Mac
Optimize Mac storage
You can choose whether to store full-resolution or optimized media on your Mac with Photos iCloud settings.
(Please ignore Apple’s capitalization. It drives me bonkers.)
I have advocated for a long time for people to devote enough storage on their Mac—or one of their Macs if they have multiple—to store the entire downloaded Photos library. Because iCloud storage of Photos is a black box, there’s no good way to interact with your files backed up there. Without owning a full-resolution local copy that you can archive and update via Time Machine, another local backup option, or cloud backup (and preferably a combination of those), you could find yourself reliant on the iCloud copy.
While Apple has been rock solid with iCloud storage for years, you can have other problems with relying on it: you could be locked out of your account for a reason Apple won’t inform you of. Or, you could have a catastrophic set of system failures or equipment loss—such as in a fire or natural disaster—or even an issue with password and security code record keeping that leaves you unable to prove ownership. If someone hijacked your account, they could also delete your media using iCloud.com or through a locally synced copy, although that’s less likely.
With my warnings noted, optimizing iCloud Photos storage can be quite effective when you have an enormous library relative to your local storage. I have a 2 TB SSD attached to a Mac Studio to keep my 800 GB Photos Library from filling the internal 1 TB drive. However, I have optimization enabled on my MacBook Pro, because that laptop’s 1 TB internal drive means I lack the storage to keep the whole thing. Question-asker John’s library occupies about 1 TB on iCloud.
Let’s dig into optimization.
Uncontrollable purging
Photos optimizes storage by retaining a thumbnail and other metadata about an image or video but dumping the full-resolution media file, which is retained in iCloud. If you double-click, edit, export, or otherwise preview an image or video, the file is downloaded, offering a circular progress completion graphic in the lower-right corner to let you know something is happening. On sufficiently fast networks, you rarely have to wait except for large videos.
Apple uses optimization in a few places, including iCloud Drive (System Settings > iCloud > iCloud Drive), and uses the same philosophy in each case. macOS has some background monitoring to avoid your drive filling up to 100% and rendering it unusable. (This doesn’t always work.)
When some unspecified threshold is reached, various background daemons that are set to optimize can kick in. With iCloud Drive, the least-used files are dumped from local storage first.
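Apple doesn’t document the exact policy, but “least-used files are dumped first” describes classic least-recently-used eviction. Here’s a sketch of how such a policy might behave; the data model, file names, and storage budget are illustrative assumptions, not Apple’s implementation:

```python
def evict_lru(files: dict[str, tuple[int, float]], budget_bytes: int) -> list[str]:
    """Pick files to offload, least recently used first, until the rest fit.

    `files` maps a name to (size_in_bytes, last_access_timestamp).
    Returns the names whose local copies would be dumped.
    """
    total = sum(size for size, _ in files.values())
    evicted = []
    # Oldest access time first = least recently used first.
    for name, (size, _) in sorted(files.items(), key=lambda kv: kv[1][1]):
        if total <= budget_bytes:
            break
        evicted.append(name)
        total -= size
    return evicted

# A 10 GB budget: the stale archive goes first, the active project stays.
files = {
    "old-archive.zip": (8_000_000_000, 1_600_000_000.0),  # untouched for years
    "current-project": (5_000_000_000, 1_750_000_000.0),  # used recently
}
print(evict_lru(files, budget_bytes=10_000_000_000))  # ['old-archive.zip']
```

The key property is that eviction stops as soon as the remaining files fit the budget, so recently used files are the last to lose their local copies.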
With iCloud Photos, however, Apple appears to be extremely aggressive. I can’t think of a single case in which I’ve had optimization enabled and the optimized Photos Library took up more than 10 to 20 percent of the full library size. On the above-mentioned MacBook Pro, my Photos Library is under 40 GB of local storage.
You can see the storage that iCloud says your synced Photos Library takes up as part of your iCloud or iCloud+ storage.
(Photos and iCloud are often very erratic about reporting storage and quantities of media. For instance, while the Photos Library is about 800 GB with full-resolution downloads configured on my Mac Studio, iCloud reports it takes up 670 GB online. I’m unclear what extra material makes up a 130 GB difference.)
Having that data in mind, we can circle back to John’s primary question—I am a master of expositional sidetracking—about why the daemon mediaanalysisd and its associated storage are sucking up so much data.
The mediaanalysisd agent’s job is to process images in the background during low-activity periods to perform facial (person and pet) and object recognition. (Apple notes, “Face recognition and scene and object detection are done completely on your device rather than in the cloud.”)
You can see where this daemon stores its thinking by going to the Finder, choosing Go > Go to Folder, and entering: ~/Library/Containers/com.apple.mediaanalysisd/Data/Library/Caches/com.apple.mediaanalysisd
On my full-resolution-storing Mac Studio, the folder there takes up 205 MB; on the MacBook Pro with optimization on, a bit more at 386 MB. As I mentioned at the outset, people have reported for a few releases that this folder will grow uncontrollably, but Sequoia 15.3 and later appear to have stanched that.
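If you’d like to check the cache size yourself without a tool like DaisyDisk, a few lines of Python will total it up (the path is the one given above; this only reads and never deletes anything):

```python
from pathlib import Path

def folder_size_bytes(folder: Path) -> int:
    """Sum the sizes of all regular files under `folder` (0 if missing)."""
    if not folder.exists():
        return 0
    return sum(p.stat().st_size for p in folder.rglob("*") if p.is_file())

cache = (Path.home() / "Library/Containers/com.apple.mediaanalysisd"
         / "Data/Library/Caches/com.apple.mediaanalysisd")
print(f"{folder_size_bytes(cache) / 1e6:.0f} MB")
```

Run it in Terminal with `python3` before and after an upgrade or a cache deletion to see whether the folder is growing back.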
You cannot turn this agent off without going through some command-line hoops, many of which are reversed when you install the next macOS update. However, because this is a cache, you can opt to delete it if the storage is causing problems.
Causing amnesia through file deletion
If your com.apple.mediaanalysisd folder is bulging like a failed lithium-ion battery, here’s what you can do:
Quit Photos.
Drag com.apple.mediaanalysisd to the Trash.
Empty the Trash.
Restart your Mac.
Deletion may cause Photos to start over with analyzing people, pets, and things. It might result in the folder swelling back to its original size! But given the reports online from people starting in Ventura, and particularly from people after upgrading to macOS 15.3, this is the best path forward.
For further reading
Our very own chief, Jason Snell, has the definitive book on the topic, Take Control of Photos, where you can find oodles of information and step-by-step instructions on working with the enormous beast that is Photos.
[Got a question for the column? You can email glenn@sixcolors.com or use /glenn in our subscriber-only Discord community.]
You may not know it, but there’s an app built right into your iPhone, iPad, and Mac that can supercharge your experience using all those products — and it’s about to get even more powerful.
That app is Shortcuts, and it lets you automate tasks on your devices, even across apps. You can have it do everything from a simple job of creating a GIF of a Live Photo to a complex system that lets you annotate a podcast as you’re recording it, and way, way more.
You build these workflows in the Shortcuts app by selecting actions and stringing them together: passing in information from, say, the clipboard, having the system scan any text in an image, and then overlaying that text on an image you specify in order to create a meme. With this year’s updates to Apple’s platforms previewed at WWDC 2025, Shortcuts is able to do even more, including leveraging Apple Intelligence features and automating actions on the Mac.
That will make this automation tool even more versatile and able to handle more complex tasks than ever before—even if it still has some flaws that could stand addressing.
Apple gives them something to talk about, enhanced Siri is definitely maybe coming, and congratulations sickos, you can now look through windows on the iPad.
Your ass is Liquid Glass
Congratulations! We have something new to argue about other than the intersection of politics and tech or AI or App Store rules or all of the other things we’re so very tired of arguing about! Yes, almost as if the company wanted to change the subject, Apple announced a new look and feel to all its operating systems.
Turns out, some people hate it! And the beta is so buggy!
Yeah, that’s kinda how early betas work.
There is undeniably a mess of messy UI problems right now with Liquid Glass, but as someone who has been to more than one rodeo (two, I’ll have you know), I can give you the 411 on what’s going to happen: many of them will get fixed before iOS 26 ships and some… will not.…
This is surely one of the first podcasts recorded entirely using the local recording feature in iPadOS 26. We discuss new OS stuff and design stuff. Big week!
With over 5,000 five-star reviews, Magic Lasso Adblock is simply the best Safari ad blocker for your iPhone, iPad and Mac.
As an efficient, high performance and native Safari ad blocker, Magic Lasso blocks all intrusive ads, trackers and annoyances – delivering a faster, cleaner and more secure web browsing experience.
The import/export feature, which Apple demonstrated at this week’s Worldwide Developers Conference, will be available in the next major releases of iOS, macOS, iPadOS, and visionOS. It aims to solve one of the biggest shortcomings of passkeys as they have existed to date.
Yep, I’m back on the passkey beat! This is Apple’s implementation of the standard developed by the FIDO Alliance, which handles the specification for passkeys. The goal is to create a system more secure than just outputting a plaintext file full of your sensitive cryptographic keys and allow easy migration between password managers.
Nowhere to go.
On the macOS Tahoe beta running on my MacBook Air, I can start the export process in Passwords, which requires first re-authenticating with Touch ID. You can choose to export either a single item or all your items; in the latter case, you can’t export accounts created with Sign in with Apple or those that were shared to a group by someone else, and exporting will not delete the items from Passwords itself.¹
In order to complete the export, you need to select an app to send it to, but as most password managers have not yet implemented the standard, I don’t currently have any options available. 1Password said last fall that it intends to adopt the standard; they’ll likely be joined by other apps, and I wouldn’t be surprised if support rolls out more broadly right around the time macOS Tahoe is released this fall.
¹ Standard passwords can of course still be exported as a file, though the app warns you that they’ll be unencrypted.