Left to right: HOMERUN (not part of season 2), Dig Dig Dino, Fulcrum Defender.
Last week, Panic rolled out Season Two of games for its Playdate handheld game console. I use my Playdate in bursts and then go a while without playing it, but I’m back in now that there are two new games being released every week for six weeks:
If you’ve pre-ordered the new season already, just grab your Playdate, head to Settings > Games, and download the two new titles. Check them out, and see what your Playdate-owning friends think about them. And if you haven’t pre-ordered Season Two yet, no problem—just buy it now and jump right in. You won’t miss anything.
The season is $39—that’s for twelve different games, plus some surprises, delivered delightfully. It felt like a no-brainer to me. I’ve been playing the first week’s games all week, and they’re great. Dig Dig Dino is a clicky no-stress game where you dig up weird dinosaur bones and find gems and stuff. Fulcrum Defender is a space shoot-em-up where you sit at the center of the screen and use the Playdate’s crank to rotate around and shoot at enemies while trying to stay alive for 10 minutes, and despite all that it’s actually kind of relaxing? And two new games, The Whiteout and Wheelsprung, just showed up.
I should also mention that there are now hundreds of games available for the Playdate, between Panic’s own Catalog store and stuff you can just get on the web and sideload. Lately I’ve been really enjoying the $5 HOMERUN, in which the mechanic of using the Playdate’s unique crank input to swing a baseball bat just keeps iterating in funny and clever ways as a narrative story unfolds in the background. It’s just so much fun.
After years of the Playdate itself being incredibly backordered, Panic has apparently made enough of them, because they’re in stock now for $229. It’s adorable hardware (though unfortunately not backlit, so you’ll need to play in a well-lit space), and given the number of games that have been released for it, there are probably plenty you’d enjoy. (My all-time favorite remains Season 1’s Hyper Meteor.)
While the rest of the world has already ushered in the new year, in the Apple world the year starts on Monday of WWDC week, when Apple opens its annual Worldwide Developers Conference and sets its agenda for the next year. Get the champagne and fireworks ready, because next Monday, the great cycle of Apple begins again.
Whether repairability affects our buying choices, what we’d want from a rumored HomePod with a screen, what we’re looking forward to at WWDC next week, and our thoughts on the Nintendo Switch 2 and whether standalone handhelds appeal to us.
I’m far more optimistic than I was after WWDC 2024. I don’t expect AI to replace our friends in the indie developer community; far from it. That’s because what sets a great app apart from the pack on the App Store is the care and humanity that’s poured into it. I’ve yet to see a vibe-coded app that comes anywhere close. Those apps will simply join the vast sea of mediocrity that has always made up a big part of the App Store. Instead, I expect AI will help solo developers and small teams tackle bigger problems that were once the exclusive domain of bigger teams with more resources.
I share John’s optimism and hope Apple gives app developers even more powers next week. (Dan and I will also be in Cupertino, and like John and Federico, we are looking forward to seeing old friends and meeting new ones!)
One of the features I’m hoping will be introduced next week at WWDC is Apple giving app developers access to its AI models. Yes, Apple’s models need to improve, and hopefully we’ll hear something about that. But right now, only Apple really gets to use them (or package them up in features like Writing Tools and let apps access them in a one-size-fits-all way).
If app developers got full access to the models, though, it would allow them to get creative in applying AI features inside their individual apps. Yes, app developers can add AI functionality to their apps today, but it would be a lot easier and more economical if they could rely on an Apple-approved set of models that run entirely for free.
There are a lot of possibilities. I keep dreaming about one: podcast apps could use on-device transcription models to generate podcast transcripts, then upload the results to build a shared cloud database of transcripts that competes with the catalogs built by deep-pocketed companies like Apple. Social-media apps could automatically generate image descriptions for uploaded images.
Another benefit would be the ability for apps to quickly generate AI summaries. I know there are limits to AI summaries—just ask the BBC—but there are a lot of interface elements that could be helped by a quick one-line summary of content.
What Apple should do instead is make its models — both local and in Private Cloud Compute — fully accessible to developers to make whatever they want. Don’t limit them to cutesy-yet-annoying frameworks like Genmoji or sanitized-yet-buggy image generators, and don’t assume that the only entity that can create something compelling using developer data is the developer of Siri; instead return to the romanticism of platforms: enabling users and developers to make things completely unforeseen. This is something only Apple could do, and, frankly, it’s something the entire AI industry needs.
Six Colors subscriber Ampsonic—an excellent source of questions—asks:
Why do some names turn red in Messages when a non-iOS user is added?
It’s all about hegemony! OK, not quite, but it is about the tricky issue of how Apple deals with email addresses connected or not connected to iCloud accounts.
Outside looking in
When you enter an email address into the address field in Messages, whether typed, pasted, or picked from a contact, the software does a quick behind-the-scenes check: Messages tries to determine whether the address is associated with an active iCloud account. As you accept or paste an address, it briefly appears as black text in a blue lozenge. If the address isn’t connected, the lozenge’s background changes from blue to red, sometimes so quickly that you never see the blue at all.
The lifecycle of an address (from top): typing in (black text), initial confusion (red lozenge), matching iCloud (blue lozenge), but turns out it’s not really connected (green text).
Red doesn’t mean you can’t send a message to someone. Rather, it reveals that iMessage can’t be used. If you’re adding addresses to a group conversation, you might notice that if you start with a contact or an email address with iMessage active, these all appear as black-on-blue. The moment you add someone who doesn’t have a valid iMessage account, all the addresses switch to black on red.
There’s a variant, too: if you choose a contact with only a phone number connected or enter a phone number, and that number isn’t connected to an iCloud account, that entry appears as black on green.
This color confusion is even more peculiar because Apple can show you how it’s checking live. Start typing a name, phone number, or email address into the To field, and as autocomplete shows matches, it color-codes the text: gray for not yet determined, blue for iCloud, and green for SMS. The grays usually change to blue or green within a second or two.
The moment you click in the message field, the red switches to the appropriate color, too: green text (no lozenge), and the field shows Text Message • SMS or Text Message • RCS. (RCS is the standard Google backs; it’s enabled by default starting with iOS 18, and in related operating systems when they use your iPhone for text messaging. The text “SMS” always appears whether or not the richer multimedia MMS option is available.)
As shown in the lifecycle figure above, sometimes you will see a blue lozenge for a match—cached? preliminary?—but once you click in the message field, it shifts to green text!
Better red than dead
More confusingly, if you type in a non-iCloud-connected email address for a contact who also has an iCloud-connected address on file, Messages forces the use of the iCloud-linked address! I haven’t found a way to force the use of a non-iCloud address without removing the iCloud-linked email from the contact card.
When you start typing in the To field, autocomplete shows matching contacts while Messages checks on their iCloud-connected status.
There’s one edge case you might encounter: if you use Screen Time to limit your own communication with others (Settings > Screen Time > Communication Limits, then either During Screen Time or During Downtime), recipients you can’t message at the moment will appear in red. You can override Screen Time to bypass that self-imposed limitation.
The upshot is that red addresses are a gap in Apple’s color-coding schema. Ideally, unless there’s a legitimate problem with the addressee, Apple should use the information it already has to show a green lozenge.
For further perusal
Did you know I wrote a desperately long book that can help? Take Control of FaceTime and Messages (also covering Phone and telephony) is quite lengthy because Apple under-documents many features found in this flagship app, leaving that job to me.
What I found frustrating was the sheer number of things that lacked information at support.apple.com, or that had “drug interaction”-like problems where features conflicted with each other. (I didn’t take out my frustration on the reader!) If you’ve ever been baffled by how to get something done in any of those apps, I expect I have covered it in the book. If not, tell me what’s missing!
[Got a question for the column? You can email glenn@sixcolors.com or use /glenn in our subscriber-only Discord community.]
Last week I got an advance peek at D-Day: The Camera Soldier, a $5 interactive documentary (delivered via the App Store) for the Vision Pro from Time Studios and TARGO. It’s the story of Richard Taylor, an American who stormed the beaches of Normandy armed with nothing but his own cameras, risking his life to document the momentous Allied landing on mainland Europe.
It’s a fascinating combination of different storytelling techniques fused together. There’s 3-D video footage of Taylor’s daughter returning to where her father experienced D-Day; there are 3-D objects such as Taylor’s camera and dog tags, as well as photos and letters; and there are some immersive locations based on Taylor’s photographs (including on and just off Omaha Beach).
Directed by Chloé Rochereuil, the app brought back to me the very best vibes of the old days of multimedia CD-ROMs. It’s an unapologetic mixed-media documentary that tells the story of the man who helped tell the story of all the other men on those beaches on D-Day. I’m not sure I’ve experienced as affecting an immersive environment on the Vision Pro as being in the landing craft with the gate down and the beach looming in the distance, 81 years ago.
In previous years, Apple would announce finalists for its annual Apple Design Awards and then roll out the winners at WWDC. There used to be a whole ceremony, back in the day. But now Apple has compressed the entire process, and on Tuesday it announced both the finalists and the winners all at once.
Honorees were chosen across the categories Delight and Fun, Inclusivity, Innovation, Interaction, Social Impact, and Visuals and Graphics, and include addictive card game Balatro (a winner), Panic’s charming game Thank Goodness You’re Here, and (a double nominee) the amazing visionOS tower defense game Gears & Goo.
It’s time for our 10th annual competition regarding what will happen at Apple’s WWDC keynote! What will be announced? Will there be a major redesign? What will the AI story be? We predict it all!
Six Colors reader Len Dintzer wrote in with an Apple Watch daily tracking question—and then the answer!
Here’s my problem: I have a Move Streak message saying I interrupted my streak and a Move Activity record that shows that I did not miss a day.…
He included two screenshots:
The first is from my Apple Watch, which says my Move Streak ended on 4/29/25, but that I have a current streak of 32 days. Problem 1: Today is May 20, and it hasn’t been 32 days since April 29. Problem 2: I haven’t missed any days.
There’s no reason for this to happen, and I couldn’t find an answer in my own device research or online. Fortunately, Len consulted a higher authority and received an accurate set of directions to fix it, which I’m sharing here with some additional detail on how to carry out each task. Thanks, Len!
This Apple Watch screen capture shows that Len’s Move Streak is broken—that’s not the right date count based on the date shown.
Reset a broken Move streak
Apple told Len to follow these steps:
1. Perform an iCloud backup on the iPhone:
   - Go to Settings > Account Name > iCloud.
   - Tap iCloud Backup.
   - Tap Back Up Now and wait for it to complete.
2. Turn off Health data on the iPhone:
   - Go to Settings > Account Name > See All > Health.
   - Turn off “Sync this iPhone.”
   - Tap Turn Off.
   - When prompted with “Keep or Delete Health Data?”, tap Keep on My iPhone. This is vital!
3. Power down your iPhone and your Apple Watch.
4. Start up your iPhone and your Apple Watch.
Apple apparently omitted the final step, but make sure to return to Settings > Account Name > See All > Health on your iPhone and turn “Sync this iPhone” back on.
Len said this resolved the streak problem just as promised.
For further reading
If you’re looking for more information about how iCloud works, consult Joe Kissell’s extensive Take Control of iCloud, updated earlier this month.
For more general information about Apple Watch, take a look at Jeff Carlson’s Take Control of Apple Watch, up to date with watchOS 11.
[Got a question for the column? You can email glenn@sixcolors.com or use /glenn in our subscriber-only Discord community.]
Jason Snell has been covering Apple since all the Macs it shipped were beige boxes. This week, he joins Stephen and David to discuss the company’s range of legal and technological issues that seem to be adding up rapidly.
Yes, Apple is hurtling over the integers from 19 to 25 in favor of having release numbers match the last two digits of the year. It’s only been a quarter of a century and we have already forgotten the lessons of the Y2K bug. Sucks to be you, iOS users 75 years from now.
This is, of course, a bit of a hassle for developers, but in the long run it makes some sense. Also, it’d be nice if they went back to animal names instead of places in California but I’m not holding my breath.…
This week I got an advance peek at “Bono: Stories of Surrender,” which is out today (it’s a beautiful day) on Apple TV+. It’s a movie version of the U2 frontman’s one-man show based on his book of the same name1. It’s available in a standard TV format, but also as an immersive video on the Vision Pro, which is what I watched.
I’ve been a U2 fan since “The Joshua Tree,” which dropped at exactly the right moment in my high school days, so of course I loved the content. Bono tells a version of his life story, occasionally breaking into short versions of his songs, backed by a mostly-string trio. It’s shot in black and white, but embellished with bright white-and-yellow animated drawings. I thought the presentation was quite effective, enhancing already-excellent live-action stagecraft.
(Chairs on stage represent the members of U2; a pair of armchairs and a table represent a place at the pub favored by Bono’s father. But his father appears in the chair as an animated line drawing. It’s clever and affecting.)
At 85 minutes, “Stories of Surrender” is also the longest immersive video Apple has ever released. To be clear, though, this isn’t 85 minutes of immersive video; it’s 85 minutes of video, with maybe 15 or 20 minutes of it containing immersive stage or concert footage. Several musical numbers are shot in immersive, and they’re great. The switch between immersive footage and a standard widescreen flat image wasn’t too jarring, either—it reminded me of seeing a film with “selected scenes in IMAX”, where the frame size changes, then changes back. It’s noticeable, but didn’t break me out of the experience.
I also really admire the work the filmmakers did in creating immersive versions of a lot of the animated white-and-yellow annotations that are the hallmark of the film. At several points, the annotations spring out of the traditional movie frame, or appear in front of the frame, or even mingle with items in the frame. It felt like a creative solution to the issue of being unable to shoot the entire film in a fully immersive setting, while still offering Vision Pro viewers something more spectacular than the standard version.
I have to admit, I’m still a little frustrated by Apple’s pace here. The company’s mysterious ways make it feel like it’s running to stand still, having not presented an immersive video that offers sustained immersive content at greater length. Nine minutes into “Stories of Surrender,” the credits start rolling—the opening credits. But Apple’s immersive content has so trained me to expect only bite-sized chunks, I legitimately thought for a moment that they were the end credits.
Clearly, producing this stuff is technically difficult, but this film just makes me want to see more. More immersive video, a full immersive concert experience, a full immersive theatrical experience, an immersive sports experience—something that’s even better than the real thing. I have a burning desire for that ultimate, long-form immersive video, and while “Stories of Surrender” is excellent, I still haven’t found what I’m looking for. I hope I don’t have to wait until the end of the world.
As well as a sort of homecoming, since Bono and other members of U2 have performed live at two different Apple Events. ↩
With over 5,000 five-star reviews, Magic Lasso Adblock is simply the best Safari ad blocker for your iPhone, iPad, and Mac.
As an efficient, high-performance, native Safari ad blocker, Magic Lasso blocks all intrusive ads, trackers, and annoyances, delivering a faster, cleaner, and more secure web browsing experience.
About four weeks out from a manuscript deadline—already a month or two behind schedule—I broke my arm.
Well, technically, someone else broke it. A Muay Thai fighter, during a sparring session in a martial arts class I definitely should not have been in. The next morning, I had to call my editor and fess up: instead of hammering away at the keyboard, I’d been getting my forearm snapped like a dry twig. Now I wasn’t going to be hammering away at anything except painkillers and regret.
Did he have any suggestions?
After some colorful swearing—we’re both Australian—it turned out he did. Half an hour later, I’d bought a copy of Dragon Dictate, or whatever it was called back then. It’s gone through a few names and versions over the years. But back then, it was my only option.
I don’t think I would’ve stuck with it if I hadn’t had 120,000 words due and no other way to deliver them. The learning curve was steep. And the Mac version of the software I was using was widely regarded as inferior to the Windows version. But I was desperate. So I persisted.
And I finished the book.
Surprisingly, I even got a bit of a productivity bump out of it.1 Even with all the bugs and weirdness of early speech recognition software, it was still better than my typing, and I stuck with it.
I’ve been using dictation software in one form or another for well over a decade now. I’ve seen a lot of development, not all of it good. For a while there, I dreaded the release of new Dragon updates, because they always seemed to break more than they fixed.
I also found that while Dragon was great for fiction, where you can get into a storytelling flow, it was less helpful for writing magazine features (back when magazines were a thing that existed). Maybe magazine copy demands more precision. My agent tells a great story about Robert E. Howard, Conan the Barbarian’s creator, standing at the mantelpiece in his home, where he’d rigged up an early standing-desk arrangement, roaring and gesticulating as he told himself the story while she typed it up.
That’s how I dictate novels. It’s fun, and good cardio, but it doesn’t work so well when you’re trying to finely craft a heartbreaking work of narrative genius for the super-picky subeditors at The New Yorker.
Anyway, long story short, I’ve always been a dictation nerd. Constantly hunting down new software. Always hoping that the next version will shave a few more seconds off the process or give me a bit more accuracy.
Recently, I switched from Dragon, which had been baked into Microsoft Word, to MacWhisper Pro, an LLM-based app for macOS. I was already trying out a writing experiment, switching from apocalyptic novels to, er, spy romances, so I figured it was a good time to try some experimental dictation, too.
I was stunned by the results.
From a roar to a Whisper
MacWhisper Pro
AI-powered dictation—at least for me—has turned out to be significantly faster and more accurate than even the best, most expensive versions of the previous generation of software.
I think of it as taking a leap from a Mechanical Turk to a probability engine.
Older systems like early versions of Dragon Dictate relied on pattern matching and statistical models like Hidden Markov Models (HMMs). You had to train the software to your specific voice, accent, and vocabulary. Over time, it would “learn” your patterns and improve.
But the actual recognition process was pretty rigid: matching sound waves to a limited set of templates, then mapping those to words using your trained vocabulary. These systems did run some probability calculations, trying to work out the most likely sequence of words from the sounds you made, but their contextual awareness was limited. They focused on individual words or short phrases. Not the broader meaning of the sounds.
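To make that concrete, here’s a minimal, self-contained sketch of Viterbi decoding over a toy HMM, the kind of machinery old-school recognizers were built on. Every word, sound symbol, and probability here is invented for illustration; real systems modeled phonemes and vast vocabularies, but the rigidity is the same: the best path through fixed tables, with no sense of meaning.

```python
# Toy sketch of the old-school approach: hidden states are words,
# observations are (vastly simplified) acoustic symbols, and Viterbi
# picks the single most likely word sequence from fixed probability
# tables -- no awareness of what the sentence means.

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most probable sequence of states for an observation list."""
    # Each cell holds (probability of best path ending here, that path).
    V = [{s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}]
    for o in obs[1:]:
        row = {}
        for s in states:
            prob, path = max(
                (V[-1][prev][0] * trans_p[prev][s] * emit_p[s][o], V[-1][prev][1])
                for prev in states
            )
            row[s] = (prob, path + [s])
        V.append(row)
    best_prob, best_path = max(V[-1].values())
    return best_path

# Made-up numbers: two similar-sounding words and two fuzzy "sounds".
states = ["two", "to"]
start_p = {"two": 0.5, "to": 0.5}
trans_p = {"two": {"two": 0.1, "to": 0.9}, "to": {"two": 0.9, "to": 0.1}}
emit_p = {"two": {"tu": 0.6, "tuh": 0.4}, "to": {"tu": 0.4, "tuh": 0.6}}

print(viterbi(["tu", "tuh"], states, start_p, trans_p, emit_p))
```

Note how brittle this is: any sound or word outside those hand-built tables simply can’t be recognized, which is why Dragon needed so much per-user training.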
Modern AI-powered tools like Whisper (the engine behind MacWhisper Pro) are a different beast. They use large neural networks—often Transformer architectures—trained on hundreds of thousands of hours of diverse audio and text. (Some of it stolen from me. You’re welcome.) They don’t need training like Dragon did. They just work, straight out of the box, for a wide range of accents, languages, and speaking styles.
As best I understand it, these models predict the most likely sequence of words from the audio input, based on the sound, but also the context of the whole sentence or even the paragraph. That allows them to handle ambiguity, background noise, and weird phrasing far better than older systems ever could.
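Here’s a toy illustration of that difference (this is not Whisper’s actual architecture, just the idea): given the same ambiguous sounds, several candidate transcripts are scored against sentence-level context, so the words around an ambiguity, not just the acoustics, pick the winner. The bigram probabilities are made up for the example.

```python
# Sketch of context-based disambiguation: a tiny "language model" scores
# whole candidate transcripts, so "ate" beats the acoustically identical
# "eight" because of the surrounding words. All numbers are invented.

bigram_p = {
    ("i", "ate"): 0.4, ("i", "eight"): 0.01,
    ("ate", "breakfast"): 0.5, ("eight", "breakfast"): 0.05,
}

def sentence_score(words, default=0.001):
    """Multiply bigram probabilities across the whole sentence."""
    score = 1.0
    for a, b in zip(words, words[1:]):
        score *= bigram_p.get((a, b), default)
    return score

candidates = [["i", "ate", "breakfast"], ["i", "eight", "breakfast"]]
best = max(candidates, key=sentence_score)
print(best)  # context prefers "i ate breakfast"
```

A real Transformer does this continuously and over far more context, but the principle is the same: the most likely sequence given everything heard so far, not a word-by-word template match.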
They’re not just listening—they’re calculating. Continuously.
Which is why I think of the old systems as Mechanical Turks. They were rule-bound, brittle, and prone to failure outside their narrow training. Today’s probability engines use immense context to serve up the next most likely word. And no, it’s still not “thinking,” but the change in how the software feels to use is dramatic.
For me, there are two immediate and critical differences between the old and new systems.
First, when I was writing using Dragon, I had to dictate everything—including the punctuation. Over the course of a 100,000-word novel, that adds up to thousands, maybe tens of thousands, of spoken commands: quotation marks, question marks, line breaks, paragraph breaks, ellipses, em dashes, and so on.
Every one of those was an opportunity for the software to mishear what I said and transcribe something else. A lot of the time I saved not hunting and pecking at the keyboard, I lost fixing the errors the program introduced by misinterpreting spoken punctuation.
Second, and even more powerful, is the freedom to correct yourself as you speak and just let the AI clean it up. It’s not unusual for me to get halfway through a sentence, realize I’ve butchered it, and say something like: “Ugh, that’s terrible—delete that, let’s try again.” Then I start over just like I would if I were dictating to a human taking notes.
Between those two improvements—out of what are probably a dozen major UX differences between the old-school dictation models and these newer, Whisper-based tools—the boost to my daily productivity has been astounding. What Tim Cook would call ‘blowaway.’
I used to aim for 1,500 words a day (the Antony Johnston-approved benchmark for a solid writing day), and by the end of it, I’d be wiped out. With these new tools, I regularly hit between 4,000 and 5,000 words daily, and the cognitive load feels much lighter.
One obvious sign of this: I’m taking fewer naps in the afternoon. I’m just not as wrecked by the day as I used to be.
I’ve been using MacWhisper Pro for about six months now, so I feel confident saying the changes I’ve seen are deep and structural. This isn’t a novelty bump. It’s a genuine shift in how I work and how much I can produce without burning out.
It means I’m likely to be more productive in the next twelve months than I have been in the last four or five years.
Bots off my words
I’m really looking forward to writing more books.
I’m not, however, looking forward to the holy war that feels like it’s coming.
Because there’s a second step to getting a good, clean copy out of a dictation rig like MacWhisper Pro: You have to feed the transcript to an AI like Claude or ChatGPT and ask it to clean it up for you.
The prompt I used for this piece, for instance, was: “I recorded this blog post using a speech recognition AI, so it rambles around a bit and is full of transcription errors and artifacts. Can you clean it up while keeping as close to my intended tone and content as possible?”
It’s not generative writing. It’s not even close. But for a lot of writers, it’s too much. The fear and loathing of AI is already so profound that any touch of the bots on your copy is anathema.
I wrote tech columns for ten years before I wrote novels, so I guess I’m less given to fear and loathing of our silicon friends. But I understand why my fellow writers feel that way. Part of the reason these models are so good at accurately interpreting not just what I said, but what I meant, is that they have consumed every word I ever published. They did it without my permission, and the billionaires who own and run these companies say they can’t possibly afford to pay for any of it.
So I understand the fear and loathing.
But to me, these things feel like jet engines. They’re incredibly fast and powerful, and they do amazing things. It’s just best not to think about where they came from.
Unlike Jason and Dan, I’m no world-class touch typist. My typing skills are, shall we say, newspaper-native: a two-fingered hunt-and-peck style at best. ↩
Good morning and welcome once again to Apple Park! We’re so pleased to have you with us to celebrate our annual Worldwide Conference. Did I forget a word there? Oh well, it probably wasn’t one of the important ones.
Today’s announcements mark the beginnings of a big week for Apple. We’re delighted to share with you our latest updates for iOS, macOS, watchOS, and some other OSes that I’ve probably forgotten about but still see once a year at the holidays.
First up, we’ve got a brand new design language that stretches across all our product lines, inspired by our blockbuster Apple Vision Pro—a device so popular its sales made this last quarter a tough compare—and we think you’re going to love it. We’re calling it Solarium, because it’s like the heat of a thousand fiery suns are searing right into your retinas. Dark mode? There is no dark mode. ALL IS LIGHT.…