As Apple hits its half-century milestone, it seems like we’re all of us waxing a bit rhapsodic about the company, its products, and their effects on our lives. So who am I to skip out on a trip down memory lane?1
Portrait of the author as a young man.
Weirdly, I was born almost perfectly in between the founding of Apple on April 1, 1976, and the release of the first Macintosh on January 24, 1984. But the former was only one of two events that occurred around that time that would go on to have a profound impact on my life. Because just over a year after Apple was founded, on May 25, 1977, came the release of the original Star Wars.
Oddly, those two events are intertwined at various points, not only with my life, but with each other. That’s true both in time and in space, where ultimately, these two influences would effectively bracket the San Francisco Bay Area, with Lucas’s Skywalker Ranch just north of the city and Cupertino to its south.
And the connection extends even further—the interplay between the rise of computer technology and its effect on modern moviemaking. John Knoll, a co-creator of Photoshop, worked for Lucas's groundbreaking visual effects firm, Industrial Light and Magic. A group within Lucasfilm would later evolve, with funding from Steve Jobs, into the animation studio Pixar (which, along with Lucasfilm, would eventually be acquired by Disney). I definitely had a wallpaper on my Mac in college photoshopped with Steve Jobs and George Lucas in it—what can I say, I know who I am.2
There are thematic ties, too. I wasn’t the only Mac fan amongst my friend group, but in the 1990s we were engaged in pitched battle with the behemoth that was Windows. It lent something to our identity, then—we were no less scrappy underdogs than the Rebel Alliance fighting back against the evil Empire.
(I can admit, from this later date, that I cast envious glances at my friends’ PCs, able to run games like TIE Fighter and Might and Magic, while I had to wait for those to come to my platform—if they ever did. As the years went on, I persevered, reading my monthly issues of Macworld cover to cover, devouring books like the Macintosh Bible and digging up weird shareware, as though I could keep the company going through my sheer persistence.)
For a large part of my childhood, both Apple and Star Wars struggled, falling upon hard times. After 1983’s Return of the Jedi, there were no more Star Wars movies. Meanwhile, Apple nearly tumbled into oblivion.
I vividly remember sitting in our kitchen one morning, listening to the news on the radio while my dad made his coffee, and hearing a dire story about Apple. My dad, knowing my enthusiasm for the company, asked if I thought it would survive—maybe the first time I felt like he’d ever asked me a real opinion on something happening in the world.
I won’t say that it had never occurred to me that it was possible Apple would cease to exist, but it was something I didn’t really have the tools to process. So, naturally, I assumed it would survive somehow, as unlikely as that seemed—as sure as there would be new Star Wars movies someday. The narrative’s stronger when you’re a kid, when you don’t really understand how the world works and your only real templates are stories.
A talk by now-Lucasfilm chief creative officer Dave Filoni at WWDC 2014.
So I closely followed all the developments of those dark times: the transition to the Power Macs, the attempts to create a modern successor to Mac OS, devouring every tidbit of information with no less fervor than I digested every new Star Wars novel. Any port in a storm.
And then in another close coincidence that is too strange for fiction, dual lights at the end of the tunnel: just as Steve Jobs returned to the company he’d founded, George Lucas announced that a new trilogy of Star Wars movies was coming. It seemed that faith had been rewarded and hope was once again on the horizon.3
Staying foolish
My life has always been kind of a push and pull between these two influences—forces, if you will4—of technology and storytelling: Venn diagram circles whose overlap is sometimes larger, sometimes smaller. As a teenager, I both wrote and distributed some really terrible shareware on local BBSes and, for several years, collaborated with one of my best friends to publish an online magazine for sci-fi and fantasy.5
In college, I majored in English because I loved writing stories, but almost all my work experience, starting in late high school, was in tech: a nascent web company, IT work at a university library during summers and vacations, teaching fellow students about technology at my college. Freshman year, I got a reputation as the English major who would fix all the computers of the engineers on our floor—even though I was only one of a handful who had brought a Mac to college amidst the sea of beige—or, increasingly, translucent blue plastic6—PCs.
The Force is strong with this one?
Even after college, I worked in IT and web development while toiling away on my first novel. The first piece I ever had published was about Star Wars and it led to the conviction that I could get a job writing—and it just so happened that job was writing about Apple. The rest, as they say, is history.
Always in motion is the future
As this milestone has approached, I’ve wrestled with my own feelings about Apple. Last year, as I wrapped up my ten-year stretch as a columnist at Macworld, I wondered whether we should even be fans of a company. A year on, I feel even more confident in my conclusion that it’s probably unwise to let your identity be dictated, even in small part, by a for-profit corporation whose needs will not ultimately be aligned with yours.
Frankly, it’s a conversation I’ve had to have about Star Wars over the years—more than once.
The truth is I still view myself as an enthusiast of Apple and of Star Wars, even today. Without the former, I wouldn’t be here talking to you. I’m not sure I could have devoted this many years of my life to writing and talking about something for which I don’t have strong feelings. And without the latter, I don’t think I would constantly be writing stories that try to capture the way Star Wars enthralled me as a kid.
Hopefully this stormtrooper at WWDC 2014 wasn’t an omen.
But being an enthusiast certainly doesn’t mean being uncritical—honestly, none are so critical as those who view themselves as the true enthusiasts. Amidst the recent years’ resurgence of both Star Wars and Apple, there’s been no end of criticism—some certainly less well-founded than others—from those who profess themselves the most ardent enthusiasts.
However, if I can trot out another old trope, you either die the hero or live long enough to see yourself become the villain. That’s the knife edge Apple is poised at now; some might argue that it’s too late, that Apple has already tipped itself over onto the side of full-blown villainy.
But maybe there’s one more lesson to take away from Star Wars here: even Darth Vader managed to redeem himself in the end. You don’t have to be the scrappy underdog to make the right decision. It’s never too late to hoist the pirate flag and think different.
Although, have you seen RAM prices? Memory lane is pretty expensive real estate these days… ↩
I assume the two of them must have met at some point, but I’m frankly shocked that I can’t find any direct evidence of it. As far as I can tell, not a single photo of the two of them together exists. And isn’t that suspic—no, no it’s not. ↩
Unfortunately, sometimes the light at the tunnel is a Death Star superlaser firing. ↩
[Dan Moren is the East Coast Bureau Chief of Six Colors, as well as an author, podcaster, and two-time Jeopardy! champion. You can find him on Mastodon at @dmoren@zeppelin.flights or reach him by email at dan@sixcolors.com. His next novel, the sci-fi adventure Eternity's Tomb, will be released in November 2026.]
Museum piece. Photo: Alejandro Linares Garcia, CC BY-SA 3.0.
I am usually so focused on Apple’s present and future that I don’t spend a lot of time ruminating about its past. And yet, as its 50th birthday has approached, it’s been impossible not to think Big Thoughts about the Big Picture.
So here’s one: Apple has been remarkably consistent — across 50 years and numerous CEOs and the vast sweep of late-20th- and early-21st-century history — in a few key areas. The people change (except Chris Espinosa!), but some of the ideas have managed to stay the same. And I think that’s meaningful.
Here’s what it boils down to: Apple is a company that chooses to build the whole product, while controlling its own destiny. That was true in the 1970s, it’s still true today, and it’s perhaps the company’s definitive trait.
In the olden days…
The early personal computer market was a hodgepodge. Different companies rose and fell, all offering different devices that were essentially self-contained and proprietary—compatibility across devices was almost nonexistent. Even programs written in the same language might not run across different systems, since each might implement the language differently.
During those days, Apple was playing the game that pretty much everyone else does. Sure, there were some computers using the standardized CP/M operating system—you could install a card on an Apple II to let it run CP/M, even!—but mostly you got what you got when you bought the box. Apple IIs ran Apple stuff, TRS-80s ran TRS-80 stuff, the Atari 400 ran Atari stuff, Commodore PETs ran Commodore stuff… that was it.
But in the early 80s, almost the entire computer industry got flattened, and the reason was the IBM PC. Not that IBM did the flattening itself, but it had that effect: Since the IBM PC had been created using standard computer parts in order to get it out quickly, it became relatively easy for any other company to build equivalents. Its operating system was not actually owned by IBM, but was created by an upstart software company called Microsoft.
What happened next changed the entire computer market: Dozens of companies began making IBM PC compatible computers running MS-DOS from Microsoft. The generic Microsoft/Intel PC was born, and almost every other competitor was ruined. Atari and Commodore hung on for a while, but by the early ’90s, there were only pretty much two kinds of personal computers anyone would seriously consider buying: IBM PC compatibles running Microsoft software, or the Mac.
That was it. The rest of the market had capitulated. Only Apple hung on. And as someone who started writing about Apple during that time, I can tell you that nobody expected Apple to make it. Analysts either wrote that Apple should become like the other PC makers and just license Microsoft Windows, or that Apple should become like Microsoft and just license Mac OS to PC makers. Those were the choices.
Portrait of the author as a college editor. Super Grover’s crimes are redacted.
To me, this is the core of what Apple is as a company: It makes the whole product. It is not a licensee adding value, like so many of its competitors. This is an attitude that started with Woz designing the hardware and software to work together, leaving a deep impression on Steve Jobs. That impression combined with Jobs’s innate focus on creating a complete product (in an era where most computers were still sold as assemble-it-yourself “kits”) and created an enduring legacy.
People often call Apple’s obsession with owning and controlling the primary technologies behind its products the Cook Doctrine, after current CEO Tim Cook, but that’s a value that goes back to Steve Jobs. Among the more modern examples of this approach:
Safari came to be because, as the Web rose to prominence, the Mac was increasingly judged based on its performance at Web browsing, and the default Mac browser was Microsoft’s Internet Explorer. Microsoft’s allocation of Mac development resources helped determine the success of Apple’s key product. That was a no-go.
iWork (Pages, Numbers, and Keynote) exists because it means that every Mac, iPhone, and iPad can work with Microsoft Office apps and documents right out of the box, without any extra purchase required. In releasing its own productivity suite, Apple provided instant Office compatibility and no longer needed to rely on Microsoft to do the right thing with its Mac software releases.
Apple silicon itself is Apple’s reaction to being held hostage by the long-term plans of chip suppliers who didn’t have Apple’s interests at heart. Every Intel chip that appeared in a Mac came from an Intel road map that was built based on the overall needs of the computer market, of which Apple was a tiny part. Every Apple silicon chip in a Mac comes from Apple’s own product road map, and the chip improvements are based entirely on Apple’s needs and synchronized with Apple’s software-development road map.
The C1/C1X chips that serve as the cellular connection in the iPhone 16e, iPhone 17e, iPhone Air, M4 iPad Air, and M5 iPad Pro—and will eventually power every new Apple device with cellular connectivity—are a reaction to Apple’s frustration with the dominant cellular radio provider, Qualcomm. Apple can now tune its own cellular chips to its own specific needs rather than relying on the parts Qualcomm builds for the entire market.
(Are AI models a primary technology? Who knows. Apple tried to build some, failed, and has decided to pivot to use Google’s AI models… for now. But if Apple ever feels that it absolutely has to have its own AI models running on its devices and in its data centers, I have no doubt that it will spend whatever it costs to make that happen. It’s just in the company’s DNA.)
You may have your own favorite examples of Apple going its own way, and counter-examples of Apple going with the crowd. Certainly, Apple has chosen to pick its battles. The G3 iMac, for example, dumped all the proprietary connectivity that Macs used to have and supported industry-standard USB instead. Compatibility can be valuable to Apple, to a point. But beyond that point, the company knows it must go it alone—or it’ll end up being just another face in the crowd.
Over 50 years, that’s one thing that has remained true about Apple: You never forget that you’re using an Apple product. It doesn’t do generic—not in 1976, and not in 2026.
The author (far right) at a certain Apple event 25 years ago.
It’s Apple’s 50th anniversary — you might have read something about that lately. And I’ve been writing about the company for more than half of that time, roughly 27 years if my math is correct. Companies may last a good long while, particularly when they have a track record of great products, but the writers who report on them invariably crumble to dust.
Still, my bones haven’t entirely blown away in the lightest of breezes just yet, so I figured I would weigh in with a few insights gleaned from chronicling Cupertino’s comings and goings for half my existence on this planet. Honestly, I might as well get something out of the deal.
The challenge is, you’ve probably had your fill of listicles chronicling Apple’s Best Products of All Time or the Most Memorable TV Commercials or Steve Jobs’s Most Viral Moments or what have you. I know that I have. Besides, while I know my onions when it comes to Apple, my opinion on the most significant Apple product (the iPhone 3G) or the best commercial (the sage iMac G3 serenaded by Kermit the Frog, naturally) or the most memorable thing Steve Jobs ever said (“Just avoid holding it that way”) carries no more weight than anyone else’s. In fact, there are folks whose Apple knowledge is far more encyclopedic than my own who are better equipped to weigh in on all that.
But what I can do is empty out my reporter’s notebook, with some random stories, stray observations and items I’ve largely kept to myself over the last 27 years. With tech reporting seemingly done with me, there’s no reason to keep this stuff under my hat any longer.
The occasion may call for 50 of these — one for each year of Apple’s existence — but let’s be honest: you’d stop reading after around 17, and I’d be scraping the bottom of the tank long before we got to the last item or two. (“No. 33: Didja ever notice that Apple employed both a guy called Woz and a guy called Joz? That’s pretty weird, huh?”) So let’s stick with 10 random thoughts about Apple as the company celebrates its golden anniversary.
My Most Awkward Encounter with Apple
Back in 2001, I was handed an original iPod, not long after Apple’s press event to show off its new music player. It’s probably forgotten with time, but the MP3 players of that era weren’t very durable, and if you were foolhardy enough to take one on a run, you ran the risk of skips caused by mechanical shock. And heaven help you if you accidentally dropped one of those things.
The iPod was going to be different, Apple told us. Not only would Apple’s music player have more storage, it was going to be durable enough to survive real-world use in a way that rival devices simply could not. So I decided to put that to the test, probably ill-advisedly.
I commissioned a more physically active colleague to go work out with that iPod in tow, along with one very specific instruction: be especially brutal with the device. “Let’s find out just what kind of a licking this thing can take,” I remember saying at the time.
It turns out the iPod was pretty durable, though not indestructible. We did manage to damage the device, but only after deliberately tossing it from a moving bicycle. Otherwise, for a 2001-era piece of tech, it withstood a fair amount of abuse before finally succumbing to our more violent impulses. I patted myself on the back for conceiving of a handy piece of consumer tech journalism that would give readers insight into just what they could expect from an iPod in terms of durability and went about my business without giving the story another thought.
At least until Apple asked us to return the iPod.
Companies don’t always do that, as they’re happy to leave review units in the hands of publications for use as reference devices when subsequent updates come along. But occasionally, you do get asked to return the equipment, Q-from-James-Bond-style, and this was one of the occasions. But I held out hope that Apple would agree that proving just how much punishment an iPod could take was enough of a service to more than make up for the non-operable loaner.
Apple did not agree. I don’t remember the poor soul who was tasked with explaining to Apple why their once-pristine iPod was coming back in such a decidedly scuffed-up state, but whoever it was made certain to let the company know the name of the dastard who so recklessly ordered the iPod beaten to a pulp. It would be many years before Apple ever trusted me with a loaner device again, and even on those occasions, the hand-off was made with decidedly sideways glances.
The part of the Apple campus I’ve never seen
I’m not a frequent visitor to worldwide Apple HQ, but I’ve been around the place a bit. I’ve even gone inside a building or two, though never uninvited, I hasten to add. I’ve had lunch at one of Apple’s on-campus cafeterias, and let me tell you, after also dining at Google’s campus: your tech industry workers are being fed very well.
I have not, however, been inside the Steve Jobs Theater, which seems odd since Apple has been holding events there for the better part of a decade. Part of that’s the nature of my role in covering Apple events — I’m usually coordinating coverage and editing people’s work, and it’s easier for me to do that watching the live stream from the comfort of my office.
The closest I’ve come was in 2017, the very first time in fact that the Steve Jobs Theater hosted any product launch. I was a late addition to the coverage team on hand to look at the iPhone 8 models and the new iPhone X, and as a consequence, I was directed to watch the event from an outdoor overflow area on a nearby TV. Which is how I normally cover such product launches, only without the 90-minute commute.
I don’t know what you remember about that 2017 event — the Apple Watch Series 3 maybe or the Apple TV 4K or one of the trio of aforementioned phones. For me, it’s the smell of fertilizer baking in the warm Bay Area sun on the freshly landscaped area surrounding the Steve Jobs Theater. On the bright side, at tech events for other companies, the smell of manure typically originates from the stage, so Apple has that going for it at least.
Watching an event on a TV outside of the closed doors where the products in question are actually being launched is hardly my most traumatic Apple press event experience, though. That’s a close tie between the iPhone 6s launch, held inside the kiln-like Bill Graham Civic Auditorium, and the 2014 Apple event where I covered the iPhone 6 and Apple Watch preview announcements only to be laid off from my job 24 hours later. Good times.
My favorite Apple launch event
Look, over the course of 27 years, Apple events are going to blend together, particularly when you’ve stopped attending them in person. Nevertheless, a few stand out, especially since I was in the room where it happened.
At my very first Macworld Expo, in January 2000, Steve Jobs announced he was dropping the “i” from his iCEO title — basically, no longer an interim title, which seemed like a big deal at the time. I was also at the WWDC keynote where Apple held a funeral for Mac OS 9, marking the complete transition to OS X.
But c’mon — there’s only one logical choice here, and it’s the iPhone’s unveiling in 2007. Seeing Apple take the wraps off a completely new product is going to stick in the brain pan, especially since it’s one that’s subsequently stood the test of time. (Folks who were there for the Apple Vision Pro unveiling: I do not think time will be as kind to that moment.) Jobs’ pitch of a combination communication device/music player/mobile phone still resonates. Even AT&T’s Stan Sigman reading his contribution to the presentation off of index cards couldn’t dull the occasion.
My favorite Apple-inspired road trip
If you weren’t around for Apple’s pre-OS X era, it’s easy to forget what a significant shift it was away from the old Mac operating systems to the more modern design and capabilities of OS X — especially after previous efforts to update the OS went nowhere. (For us old timers, “Copland” is more than just a 1997 Sylvester Stallone vehicle or the misspelled last name of The Police’s drummer.) Apple had been working on a new OS for a while, and finally, in the fall of 2000, Mac users were going to get a chance to give it a try.
In fact, the public beta of Mac OS X was going to be revealed at that year’s Apple Expo in Paris, and I jokingly suggested to Macworld’s then-editor that it would be a hoot to send me to cover it.
“I don’t speak a lick of French,” I told him. “I don’t even have a passport. Wouldn’t it be hilarious to fly me over there and watch me flail my way through covering the event?”
“It would be hilarious,” the editor unexpectedly agreed. And that’s how I wound up getting an expedited passport, hopping on a flight to Paris and wandering about an indifferent metropolis without anything resembling a concrete game plan.
The turn-of-the-century tech boom was a hell of a time, kids.
Anyhow, I managed, covering both the OS X news and the surprise launch of the key lime iBook. That said, there was one moment of pure jet lag-induced panic that occurred moments before Steve Jobs stepped on stage to make his assorted announcements: What if, I thought, he delivers this entire speech in French, and I’ve come all this way only to not understand a blessed word he’s saying? Fortunately, whatever multilingual capabilities Apple’s CEO possessed were not on display that day, and I was able to fulfill my journalistic obligations.
My least favorite Apple keynote
Jason Snell and I used to have a running gag back in the days when print, not online, was king and we would reserve a sizable chunk of Macworld’s print edition for last-second coverage of all the Macworld Expo keynote announcements Apple was sure to make. But what would happen, we wondered, if Apple didn’t announce much of anything, leaving us with all those pages to fill and very little to write about?
Our joking Plan B: Run an article called “What Went Wrong?” featuring a picture of various Apple executives shrugging.
We came dangerously close to having to do that at the New York edition of Macworld Expo 2001 where Apple announced… well, some stuff. We got a recap of the recent Apple Store openings — hey, they were new at the time — and a lot of talk from developers showing off OS X native apps for the still-nascent operating system. The lone hardware announcement centered around new Power Mac G4 towers, punctuated by a lengthy discussion of what Apple called the “megahertz myth” to address differences in performance between Macs and PCs. Put another way, Apple’s big product announcement at that Expo was punctuated by an 8-minute deep dive on processor pipelines.
We managed to produce the necessary copy to fill those empty magazine pages that night. But it took some doing.
Apple event celebrity sightings
Attend enough Apple-hosted or -adjacent events, and you’re going to run into famous people. For example, if you walked the show floor of a Macworld Expo in San Francisco any time between 2000 and 2009 and didn’t see comedian Sinbad at some point, I’m guessing you were just popping into Moscone Center to use the restroom.
I’m notoriously bad at recognizing people, but even I can recount a couple celebrity encounters. Once, I waited in line to get in for an Expo keynote standing directly behind Adam Savage of Mythbusters fame. And during the iPhone 6s launch held in the hotbox that was the Bill Graham Civic Auditorium, I stood patiently waiting for a demo of one announcement or another — memory tells me it was gameplay on the Apple TV — when Charlie Rose big-footed his way in front of me and took my turn. Definitely the worst thing Charlie Rose has ever been accused of.
I’m told Gwen Stefani was at the 2014 iPhone launch, though I never ran into her or her apparently sizable entourage. But while U2 was busy surreptitiously downloading their Songs of Innocence album to the rest of your iPhones, they were also blowing out my ears at the same event.
Most awkward encounter with an Apple executive
Celebrity encounters are all well and good, but who’s a bigger name star than the men and women who run Apple? I don’t often rate face time with the higher-ups at the company, but there was one time where Tim Cook and I had the briefest of interactions. You will be surprised to learn it did not reflect well on me.
I was leaning against a wall in San Jose’s McEnery Convention Center, waiting for a colleague to wrap up a product briefing, when a gaggle of people strolled by, with Tim Cook at the center of the throng. For some reason, he looked over in my general direction at the same time I was watching him pass by, and that’s how I found myself in a staring contest with Apple’s CEO.
I don’t exactly have the friendliest appearance. My resting face makes it appear as if I’m trying to recall how you’ve wronged me, and if ever I try smiling, it looks like I’ve suddenly remembered. So I decided to offer some sort of gesture to convey a spirit of collegiality — I gave Tim Cook what I hoped passed for an amiable nod of acknowledgement. Judging by the mix of confusion and apprehension that flashed across his face, I don’t think I was entirely successful.
So, Tim Cook, if you’re reading this, and you’re still wondering why that glaring fellow nodded at you at that one WWDC many years ago, rest assured that there’s no ill will on my part.
My favorite portrayal of Apple in a movie
I saw 2013’s Jobs twice, which is probably two times more than anyone outside of Ashton Kutcher saw it. Both times were press screenings for a review I was commissioned to write about the movie. The first screening happened well before the movie’s release and Act Three of the picture felt so haphazard to me that I thought for sure that Jobs would be recut prior to arriving in theaters. Hence, the second screening right before the premiere, in which I discovered, nope, the movie was going to wind up exactly the same.
So Jobs isn’t my favorite picture about Apple, and I have to confess that the 2015 Steve Jobs biopic didn’t resonate with me either. No, for big-screen Apple thrills, I suggest turning to the small screen in the form of 1999’s Pirates of Silicon Valley, a made-for-TV movie starring Noah Wyle as Steve Jobs and Anthony Michael Hall as Bill Gates. (John DiMaggio — TV’s Bender — plays Steve Ballmer, and sadly, we do not get to hear “Developers, developers, developers” in the Bender voice.) Pirates of Silicon Valley isn’t the least bit accurate, but it’s a good character study that has something to say about ambition and our impulses to create.
If there’s a runner-up, I’d steer you toward Golden Dreams, a short video that used to run in the part of Disney’s California Adventure that now houses the Little Mermaid ride. There, you can look in as two seemingly random guys named Steve assemble a rudimentary computer while Whoopi Goldberg looks on, pointedly taking a bite out of an apple.
Goofiest Apple product of the last half-century
By this point, it’s probably clear that I find the off-beat aspects of a company’s history to be just as vital as the landmark hits that everyone talks about. I think we all should be serious about our work without being too serious about ourselves, so the things that are going to stand out to me about Apple’s first 50 years are going to reflect that. And occasionally, Apple has had some fun, too.
How else to explain the moment in 2004 when Steve Jobs — co-founder of the company, lauded visionary, subject of many a profile attesting to his business savvy — stood up in front of a packed house and introduced the world to iPod Socks? Jobs is fully committed to the bit, hailing the socks as a “revolutionary new product.” A hint of a smile flashes on his face as he tries to convince the world that, yes indeed, they need to swaddle their music players in brightly colored socks. “They keep your iPod warm,” Jobs insists, and you might for a moment feel like he actually means it.
We can talk about great Apple products and shake our heads at the few missteps. But life is about fun, and there’s no other way to describe iPod socks.
Most symbolic photo of my time covering Apple
Let’s end by circling back to the original iPod — the launch event, specifically. There’s a photo that makes the rounds in my circle of associates, pulled from the launch event video where the cameras have cut to the crowd. And there, you can clearly see Jason Snell watching as the iPod is unveiled. Seated next to him is Rick LePage, Macworld’s editor in chief at the time, and Jon Seff, another Macworld editor.
I’m there, too, though you wouldn’t know it from that shot. For a long time, I assumed I had been sitting next to Jason and had simply been cropped out of the photo — kind of like a real-life version of that Nathan Fielder meme (“Out on the town having the time of my life with a bunch of friends. They’re all just out of frame, laughing too.”), only in reverse. Here, it’s just me who’s been cropped out of the shot, having the time of my life.
And that seemed like a fitting way to sum up my time covering Apple. The company announces something significant, and I’m right there, if only slightly out of the shot.
Of course, that’s actually not the case. In fact-checking this article, we discovered that I am not seated next to Jason, but rather in the row behind him. And yes, we have the photos to prove it.
Jonathan Seff, Rick LePage, Jason Snell, Kristina De Nike, and Philip Michaels, among others, at the iPod launch event in 2001.
So as it turns out, I’m not as peripheral to this Big Moment in Apple History as memory had once dictated. Apple, it seems, can still surprise us after 50 years, even those of us who’ve seen it all.
[Philip Michaels has been writing about technology since 1999, most notably for Macworld and Tom’s Guide. He currently finds himself between jobs, so if you need someone who can string a few sentences together (or make your sentences read a lot better), drop him a line.]
When I saw my friend Antony Johnston’s post on Six Colors, I instantly thought, “yeah, me too.” And as it happens, the very Mac model that changed Antony’s life put me on an entirely new road, too.
Just before I got my journalism degree in 1984, a professor named Jim Haynes sat me down and warned me that I would have more trouble finding a job than almost anyone in my class because I have low vision. I choose to believe that he meant it kindly, a warning to get ahead of any potential employers’ doubts, rather than as a pessimistic prediction about my future.
But he was right. My job search was painfully long, and I realized that at least part of the struggle had to do with the expectation that young communications specialists working for non-profits or government – a niche I thought I could play in – needed to physically paste up newsletters, brochures and other typeset publications. I’d already learned how unsuited I was for that during a college internship, what with the need to cut straight lines of galley copy and wield an X-acto knife on rubylith. I simply wasn’t equipped to do that sort of visual work.
Somewhere along the way, I went to an Apple demo of something called “desktop publishing.” With a Macintosh computer and a high-resolution printer called a LaserWriter, you could design, lay out and print a complete publication — no knives required. When I arrived for the demo, I was intrigued. By the time I left, I would have sold a kidney for a Mac-LaserWriter combo.
In my unemployed state, the only available source of funds was my parents. Ever the practical sort, they suggested that I learn more about what I now knew as DTP before they would be willing to hand over more than $6,000 for my pipe dream.
So I rented my first Mac (a 512Ke), a copy of PageMaker 1.2, and an external floppy drive. The guy I rented it from, Robert Jagitsch, would go on to found PowerLogix, a company that sold Mac processor accelerators. I used to run into him at Macworld Expo in the 90s. But just then, his stock of Mac stuff for sale or rent appeared to live in the trunk of his car.
Without a LaserWriter, I couldn’t do much more than teach myself PageMaker. But my local AlphaGraphics offered laser prints for $1 a page. It didn’t take me long to realize I might be able to make desktop publishing work as a freelance business.
Pretty soon, my mom – who had given my sister a used VW Rabbit during college – agreed to fund a brand-new Mac Plus. It was my equivalent “welcome to adulthood” gift. I added PageMaker and a SuperMac DataFrame hard drive that cost an eyewatering $625 for 20 megabytes.
I launched the publishing business, creating everything from brochures to fancy reports for graduate students to newsletters for a city council member. AlphaGraphics was still my source for laser prints, but I quickly fell in with a group of interlocking businesses that offered scanning, full-service printing and access to Linotype typesetters that offered 1200 dpi output, versus the LaserWriter’s 300 dpi.
Eventually – four years out of college – I landed my first full-time professional job. With a Mac Plus on my desk, I edited and laid out monthly trade magazines for enthusiasts of supercomputers, DEC minicomputers and various UNIX systems. Despite a solid portfolio of published writing, I could never have talked my way into that gig without my Apple desktop publishing skills. Those years I spent at home cranking out newsletters had also made me a pretty good Mac system administrator and troubleshooter – skills that have followed me throughout my career.
Yes, here in our universe, Apple is celebrating its 50th anniversary. A milestone! The company is looking back on its success, its technology prowess, and the way it’s made us all willing to just say “AirPods” like that’s a set of words that makes any kind of sense.
But our universe is only one of many, and while it may be the 50th anniversary of Apple in several of those as well, the company hasn’t always been as successful—or at least as successful in quite the same way—as it has been here.
For example, did you know that on Earth 1208⍺-X, Apple never abandoned cat names for its operating system? They’re currently on Mac OS X 10.21 Norwegian Forest Cat. Meanwhile, on Earth 9876t-♉︎, the Pippin is the number two console, right after the Intellivision. And on Earth 632r-⍴ everybody wears iPod Socks. Nobody’s quite sure if it’s ironic or not.
All of these worlds are like ours, but ever so slightly different. And just in case you think the grass is always greener on the other side of the quantum fence, well, be careful what you wish for. As much as some people might deride Liquid Glass, be glad you don’t live on Earth 9w4598-Ω, where Apple really ran with that whole “lickable” interface thing. Computing has never been so sticky.
So let’s take this opportunity to fire up the old multiversal radio and see if we can’t catch some dispatches from nearby universes to find out how Apple is doing there.
[static sounds]
Earth 0101010-λ
To celebrate its 50th anniversary, Apple today released its most groundbreaking product in decades, the Orb.
“Nothing is more iconic than the shape of the sphere,” said Apple CEO Jony Ive, appearing via towering hologram. “It has no beginning, no end, and speaks to where we all first issued from.”
“We think the Orb will be a big hit,” said Apple senior vice president of worldwide marketing Greg Joswiak, visibly sweating. “Our customers see whatever they want to see in it which means it can truly be any…”
[static sounds]
Earth Performis-18173U
…Apple today celebrated its 50th anniversary with the release of its most powerful computer yet, the Macintosh Quadra 3700X/II. Powered by an amazing Motorola 69050 processor running at speeds of up to 700MHz, with an astounding 1GB of RAM and a 200GB Western Digital hard drive, the 3700X/II will be the workstation of choice for high-end graphics applications. Its sturdy tower comes in a fetching beige, features 17 SCSI ports, and begins at just $8,999…
[static sounds]
Earth 1293857L-Γ
…and Apple CEO for Life Steve Wozniak today kicked off the 27th annual Segway Polo World Cup in Cupertino’s Steve Jobs Memorial stadium, as teams from across the globe vie to become the latest champions of the vaunted sport that has become a Silicon Valley phenomenon…
[static sounds]
Earth #000000-Δ
…would have been the 50th anniversary of Apple Computer. The now defunct company was acquired in 1997 by Dell Computer and shut down, the money returned to its shareholders. Dell, meanwhile, continues its innovative sales strategy of selling laptops by the pound…
[click]
Annnnd that’s about enough of that. Look, I won’t say that all of those universes are unquestionably worse than ours. Just as a random example, in not a single one of those other universes did Apple gift anybody odious a big golden trophy. I mean, you can only imagine what the rest of those universes think of us.
Anyway, with half a century under its belt, it’s time to start thinking about what the next 50 years might hold. I don’t want to spoil anything, but, well, better stock up on iPod Socks.
[Dan Moren is the East Coast Bureau Chief of Six Colors, as well as an author, podcaster, and two-time Jeopardy! champion. You can find him on Mastodon at @dmoren@zeppelin.flights or reach him by email at dan@sixcolors.com. His next novel, the sci-fi adventure Eternity's Tomb, will be released in November 2026.]
In addition to my two pieces on The Verge this week, I’m also on the Vergecast talking to David Pierce about Apple’s past, present and future:
On this episode of The Vergecast, we begin by stepping back a bit to ask a big question: How is Apple doing right now? Obviously, by many measures, Apple’s doing great — trillion-dollar company and whatnot — but this is a company that has long taken pride in building better software, better hardware, better everything, and doing it in a better and cooler and more responsible way. Jason Snell, a longtime chronicler of all things Apple, joins the show to do a modified version of the annual Six Colors report card about where Apple stands right now.
It was a great conversation, and nice to talk about where Apple is going, given all the history that I’ve been writing about for the last few weeks.
It’s a famous story on its way to becoming legendary: Apple cofounder Steve Jobs was pushed out of Apple in 1985, spent more than a decade in the wilderness, and then returned to Apple in 1997 to save it from bankruptcy and transform it into one of the world’s most valuable companies.
That’s true, so far as it goes, but this interregnum is too often simplified as when Apple CEO John Sculley got rid of Steve and ruined the company. And that’s really not true. Not only was the Jobs who was ejected from Apple completely unprepared to run the company (as his disastrous but educational years at NeXT would prove), but the Apple of this period had some real accomplishments.
From making necessary changes to the Mac to the creation of the PowerBook, Apple didn’t simply weather the 12 years without Jobs. The company made shifts, adaptations, and decisions that would become foundational to its future. Were there missteps? Most definitely. But ignoring Apple’s successes over those dozen years undermines the truer, deeper story of how Apple survived to become the behemoth it is today.
In 1988 my high school form tutor, who was also head of the art department, got a Mac Plus. It was the only one in the school, as the computer room was all BBC Micros. In fact, so he said, it was one of the only school-owned Macs in England. It was kept in a locked office room, annexed off his classroom.
I loved playing computer games, and like all kids, I’d messed around with typing in BASIC programs from magazines. But whenever I strayed beyond the simple commands – LOAD, SAVE, PRINT, GOTO – I was out of my depth. I’ve never been able to get my head around DOS-like command line interfaces, let alone programming languages. They just don’t make sense to me; I’m all at sea.
(I’ve sometimes wondered if it’s because I always looked at computers as a tool, a way to do something, rather than a thing to do.)
So I don’t know why my tutor showed off that Mac to me, of all people. But I was gobsmacked by the visual interface and the tangibility of its spatial permanence model. ‘This icon here is your file. This window represents the space inside a folder. If you move the file into the folder, it will still be there, in that same visually-defined place, when you look inside again later.’
I know that sounds like the simplest, most obvious thing now, but in the 1980s it really wasn’t. Crucially, unlike a command line, it made sense to me.
So I was sold on the interface. But then what really blew my mind were the programs you could run on this thing. MacPaint. MacWrite. PageMaker. And the fonts! 12 different fonts you could place anywhere, change their size, make (some of) them bold or italic… again, this is simple and obvious stuff now, but not then.
For some reason, I don’t think any other pupils really took to that Mac. But I was hooked, and spent a lot of time in that cramped office room. I proceeded to use the Mac Plus’s tiny mono bitmap screen, paltry RAM, and single floppy drive to design and lay out two school magazines, one edition of the sixth-form ‘zine, and several judges’ pamphlets for the annual music and drama festivals1 – plus a bunch of, um, extracurricular stuff for my regular RPG gaming group: character sheets, combat resolution tables, equipment lists…
The ironic thing is, at no point did anyone tell me that what I was doing with this Mac could be a career. My work experience at the local newspaper had shown me that ‘layout’ was something done by chain-smoking men using bromides, cow gum, and rubylith – not computers. The very thought! So after flunking my A-levels (too much partying, not to mention fooling around on that Mac), I was a little unmoored and took the first office job I saw that sounded vaguely interesting: selling stationery.
I was an OK office drone, but my creative bent was obvious to everyone. My free time back then was dominated by games, music, and art. So, encouraged by my boss to go back to school and do something creative, I flicked through the local art college brochure… and found a course called ‘graphic design’. It even mentioned using Macs. Suddenly, I was back in that annexed room, designing a school magazine, and I knew what I wanted to do.
Perhaps the most amazing thing is how small the window of time and opportunity was where all of this could happen. Much earlier, and Macs barely existed; much later, and they were already in professional use everywhere. I was lucky enough to be right in that sweet spot.
I’ve been a professional writer for 30 years now, full-time for 24. That’s how most everyone knows me. But for almost a decade prior to that, I was a graphic designer at various agencies and publishers, eventually specialising in magazines. It was working in those places that gave me access to the net, and an online community that encouraged me to take fiction writing seriously. (Shout-out to alt.cyberpunk.chatsubo!)
There’s a whole chain of happenstance and chance events, too long to go into here, that led to me eventually being published. But if you follow it back far enough, that chain started with my form tutor introducing me to a strange new computer, which changed my life.
[Antony Johnston is a multi-award-winning, New York Times bestselling author of books, videogames, graphic novels, and more. Can You Solve the Murder? is available now in all good bookstores and online.]
Jason and Myke tell the story of Apple’s origin. It emerged from the unique environment of the Santa Clara valley suburbs of the ’70s thanks to the particular genius of its two co-founders and some surprising help they got along the way.
After I wrote my Wall Street Journal review of David Pogue’s excellent Apple: The First 50 Years (Kindle, Kobo, Apple Books) my editor asked for a sidebar recommending other books about Apple. I consulted my own collection and also asked a few of my friends.
If the 50th anniversary celebrations and talk have made you curious about Apple history, there are a lot of books out there. Here are some recommendations:
West of Eden (1989) by Frank Rose. A recommendation from Stephen Hackett, this book focuses on Steve Jobs hiring John Sculley, which in turn led to Steve Jobs’s own ejection from Apple. (Amazon, used.)
Insanely Great (1994) by Steven Levy. This is the definitive story of the original Mac, placed in the context of the 1980s personal computing revolution. Levy, whose 1984 book Hackers is an astounding history of the early days of computing, gets at the heart of what made that original Mac, and the original Mac team, special. (Amazon, Kobo, Apple Books, used.)
Infinite Loop (1999) by Michael S. Malone. If the year of publication doesn’t tell you what this is about, the subtitle will: “How the World’s Most Insanely Great Computer Company Went Insane.” Recommended by John Siracusa, this is the story of Apple falling apart in the 1990s. (Amazon, used.)
On the Firing Line: My 500 Days at Apple (1999) by Gil Amelio and William L. Simon. Of course Gil Amelio’s tell-all about his brief tenure as Apple CEO is self-serving. And yet I enjoyed reading it, because I believe that late-90s Apple was just as messed up as he describes it, especially when it came to the utter failure to replace classic Mac OS that led to Apple buying NeXT and bringing back Steve Jobs. Was Amelio a bozo, like Jobs apparently claimed? Maybe, but you can’t deny that he was there at a pivotal moment and made the single most important decision in Apple’s history. (Used.)
Apple Confidential 2.0 (2004) by Owen W. Linzmayer. Prior to the publication of David Pogue’s book, this was probably the best collection of stories about the history of Apple. It’s still an entertaining read. (PDF, used.)
Revolution in the Valley (2011) by Andy Hertzfeld. One of the core members of the original Macintosh team has a lot of amazing stories to tell. We think of the tech industry today as being corporate, but the original Mac was almost a countercultural object. (Amazon, Kobo, Apple Books, used.)
The Perfect Thing (2006) by Steven Levy. Levy does his “Insanely Great” thing again, but this time about the creation of the iPod. You may think, well, the iPod’s pretty dated technology now, why does it matter? But this book gives you some clear insight into the entire product development process in the early days of Steve Jobs’s return to Apple. (Amazon, Kobo, Apple Books, used.)
Creative Selection (2019) by Ken Kocienda. I’m not convinced that the definitive insider history of the creation of the iPhone has been written yet. But between Pogue’s book and this account from one of the creators of the original iPhone keyboard, we’ve got at least some good tales from that vital period. Here’s my original review. (Amazon, Kobo, Apple Books, used.)
Apple in China (2025) by Patrick McGee. This is the definitive book of the Tim Cook era, at least so far, but it also covers as far back as engineering decisions made right after Steve Jobs came back to Apple. Even if you’re not interested in the Chinese angle, this book is worth reading because it reveals how Apple became and remains a titan of manufacturing, which is why it seems capable of building products nobody else can build. (Amazon, Kobo, Apple Books, used.)
Steve Jobs in Exile (coming May 2026) by Geoffrey Cain. A detailed look at Steve Jobs after he left Apple, including everything that went wrong at NeXT—and how it made Jobs a better CEO when he returned to Apple. This book isn’t out yet, but I’ve read it and it’s quite good. (Pre-order: Amazon, Kobo, Apple Books.)
(Pro tip: The used books are really cheap, and it’s kind of fun to read an old, beat-up book when thinking about Apple’s history.)
I have a mostly “love/not-hate” relationship with the Medications feature in the iPhone Health app. Having accumulated (and been treated for) a variety of conditions over the years, I found Medications a welcome addition in 2022. You can add drugs you take, the frequency (or as needed), and set them to a schedule. Then you receive a notification at the time you set, plus a reminder.
While I’m generally good at “medication adherence,” I’m not perfect. For many drugs, clinical research is based on regular administration and staying on a schedule. In some cases, you can injure yourself or reduce the effectiveness of a medication if you take it erratically, sometimes even missing a few doses, as with antibiotics or antivirals.
Medications is an oddball feature, though, as it’s kind of shoehorned into Health, and doesn’t use the normal Notifications system for alerts. I am sure that is in part because of the unique elements of ensuring reminders occur and recur. But also, it’s because your medication schedule is akin to time-of-day reminders: they should always occur at the requested time.
When you travel across time zones, that’s where confusion can emerge. While on a flight, you may have seen a notification that says “Time Zone Changed,” which suggests you need to check your medication schedule. You may see this for each time zone you pass through. Tap it, and you’re taken to the Medications view, where you can tap to shift your schedule to the local time zone—that is, 8 am PDT becomes 8 am MDT, 8 am GMT, and so on.
This alert should appear on your iPhone (left) and Apple Watch to let you know you need to adjust your schedule. Tapping takes you to Medications.
But I had the opposite problem: traveling west to east the other week, I experienced the failure of negative knowledge—I wasn’t alerted about the time zone change and wound up missing a dose of meds.1 I haven’t had this happen since I started using Medications and traveling, so I don’t know what failed.
Here’s the sequence of what happened (or didn’t):
I flew across three time zones, from Pacific to Eastern. I was not alerted by Medications about the time zone change.
I arrived in Boston, and with Settings > General > Date & Time’s Set Automatically option enabled, my iPhone and Apple Watch updated to EDT.
The next morning, I forgot for the first time in seemingly years to take my morning meds.
Later that morning, at 11 am EDT (8 am PDT), I must have received an alert that I missed. Medications alerts aren’t persistent in quite the same way as other notifications.
It was only late that night that I realized what had happened. Looking in Health > Medications and swiping way down to Options, I checked that Time Zone Change was enabled. It was. However, my whole schedule was three hours off. There’s no manual “reset to current time zone” button.
The workaround is to go to Settings > General > Date & Time, disable Set Automatically, switch to the old time zone, then to the new one, and then re-enable Set Automatically. At that point, I received the alert from Medications and was able to visit the app to approve changing the absolute time (8 am PDT/11 am EDT) to the relative time (8 am EDT).
Clearly, Medications has room to grow in its time zone support. Because of our body clocks, we may want to keep our medications on the absolute time: if you travel 12 time zones, you probably want to be sure you take your doses of daily meds about 24 hours apart. But there’s no good way to adjust Medications while traveling unless the alert is triggered. Calendar added an option for Floating events years ago, where they were fixed to a time of day rather than a time zone. Some kind of opposite-to-floating option or time slider needs to be added to make Medications more travel friendly.
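The fixed-versus-floating distinction is easy to see in code. Here’s a minimal Python sketch (my own illustration, not anything from Apple’s actual implementation) of why an unadjusted 8 am Pacific schedule fires at 11 am in Boston, and what a “floating” schedule would do instead:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# A dose scheduled for 8 am Pacific, stored as an absolute moment in time.
dose = datetime(2025, 10, 6, 8, 0, tzinfo=ZoneInfo("America/Los_Angeles"))

# Viewed from Boston, that same absolute moment lands at 11 am local time,
# which is why an unadjusted schedule alerts three hours "late."
print(dose.astimezone(ZoneInfo("America/New_York")).strftime("%H:%M"))  # 11:00

# A "floating" schedule instead re-anchors the clock time in the new zone.
floating = dose.replace(tzinfo=ZoneInfo("America/New_York"))
print(floating.strftime("%H:%M %Z"))  # 08:00 EDT
```

Which behavior you want depends on the drug: for meds that must stay roughly 24 hours apart, the absolute time is correct; for “take with breakfast” meds, floating is what you actually mean.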
[Got a question for the column? You can email glenn@sixcolors.com or use /glenn in our subscriber-only Discord community.]
I define “negative knowledge” as information provided to you about something that doesn’t happen. Most alerts tell you something did or should happen; I often find knowing that something that should have happened, didn’t, is as or more important. Cf., Sherlock Holmes’s famous “curious incident of the dog in the night-time.” ↩
When you think of Apple, you probably think of the iPhone, or maybe the Mac, or perhaps you’ve got fond memories of the iPod. But Apple’s 50-year run of creating tech products that people fall in love with — sometimes a lot of people, sometimes just a hardy few — would never have happened if it weren’t for a product and platform that’s been gone for decades.
Apple would never have made it if it weren’t for the Apple II, the company’s first hit product and the first one to generate the amount of devotion we’ve now come to expect from fans of Apple’s products. Those fans’ slogan was, and still is, “Apple II Forever!”
Look at The New York Times putting periods into AI. You fancy.
After some pretty big hoopla about the service that let you generate dancing penguins on the moon or other works destined to be cinematic classics, shuttering it is more than a little embarrassing and not just for OpenAI.
Just three months ago, OpenAI and Disney signed a three-year licensing deal allowing Sora users to generate videos with Disney characters like Mickey Mouse, Cinderella and Yoda.
That deal was for $1 billion. I feel like I put more thought into the longevity of a $2 app before I click “Buy” than Disney did here.…
My thanks to Unite Pro for sponsoring Six Colors this week.
Safari web apps and PWAs are a nice start, but they’re limited. Browser tabs are messy. And most tools for turning websites into apps still feel more like wrappers than real Mac software.
Unite Pro takes a different approach. It turns any website into a fast, isolated Mac app built specifically for macOS — with support for Window, Sidebar, and Menu Bar modes, deep visual customization, smart link forwarding, and native enhancements like dock badges, meeting alerts for Google Calendar and Outlook, AI overlays for ChatGPT, Gemini, Grok, and Claude, and more.
What makes Unite Pro special is how much control it gives you. You can remove distractions, force dark mode on sites that don’t natively support it, apply custom scripts and styles, and shape each app around the way you actually work — while keeping sessions, cookies, and permissions separate from your browser.
Six Colors readers can get 20% off Unite Pro this week with the code SIXCOLORS. Learn more and download at bzgapps.com/unite.
It’s the end of an era: Apple has confirmed to 9to5Mac that the Mac Pro is being discontinued. It has been removed from Apple’s website as of Thursday afternoon. The “buy” page on Apple’s website for the Mac Pro now redirects to the Mac’s homepage, where all references have been removed.
Apple has also confirmed to 9to5Mac that it has no plans to offer future Mac Pro hardware.
A quiet end to what was once the flagship of the Mac product line. But time comes for us all.
Over the years, as laptops rose in prominence and other Mac desktops added power, the Mac Pro increasingly became a niche, high-end device. After the disastrous trash-can Mac Pro design, Apple made good on a promise to return the Mac Pro, and shipped a new take on the “cheese grater” enclosure. But the move to Apple silicon really killed the product dead, since Apple’s modern chip architecture doesn’t support external GPUs, which was one of the last reasons to buy a Mac Pro.
In the interim, the Mac Studio has become the top-of-the-line desktop. It’s great. RIP to a real one, but it’s time for us all to move on.
Public spaces like Cosm might be a good content fit with Vision Pro.
The Apple Vision Pro feels like a product that’s waiting for the world to catch up, but the reality is closer to the opposite. The world is waiting for a reason to use it, and that reason hasn’t quite shown up yet.
There’s very little wrong with the hardware. Apple built something that works in a way first-generation devices rarely do (says the guy old enough to have bought a Newton at launch) with displays that feel natural rather than novel and an interface that disappears quickly enough to let you focus on what you’re seeing.
The problem comes the moment you take it off. There isn’t a strong pull to put it back on. It’s impressive, even remarkable in bursts, but it doesn’t yet fit into a daily rhythm. That’s not a hardware problem. It’s a content problem, and more specifically, a cadence problem. Apple has treated immersive content like a prestige release schedule, carefully curated and spaced out, which works for television but not for behavior. If you want people to build a habit around something, you need volume and consistency, not occasional brilliance. Right now, Vision Pro feels like something you check in on rather than something you live inside, and that distinction matters more than anything on the spec sheet.
Neal Stephenson’s skepticism lands because it recognizes that gap. If the content never reaches a point where it becomes necessary, the headset remains optional, and optional devices rarely scale. What’s interesting is that the missing piece isn’t hypothetical. It already exists in a different form, outside of Apple’s ecosystem, and it’s showing up in a place that Apple understands better than most companies: people paying for experience.
Cosm is the cleanest example of that. It’s easy to dismiss it as a high-end gimmick, a giant dome with a better screen, but that misses what’s actually happening inside those venues. People are buying tickets, planning nights around it, treating it as something closer to attending a game than watching one. The technology matters, but the behavior matters more.
Cosm is already generating meaningful revenue and drawing repeat customers, which tells you this isn’t just novelty value. It’s tapping into something real, the idea that proximity, or at least the feeling of it, has value even when the event is happening somewhere else.
The challenge for Cosm is that scaling that experience is difficult. These are expensive builds that require the right locations, the right partnerships, and enough capital to expand without diluting the quality that makes them work in the first place.
That is exactly the kind of problem Apple has solved before. It’s not just about having the cash, though Apple certainly has that. It’s about having the discipline to build a system that can expand without losing its identity and the distribution to make it visible at scale. If Apple owned something like Cosm, it wouldn’t just be a set of venues. It would be a front door. You could put an Apple Store in the lobby and it wouldn’t feel forced. It would feel like a natural extension of the experience, a place where people encounter the hardware in the context of something they already understand.
From there, the path to the home becomes clearer. Vision Pro, or whatever lower-cost version follows, doesn’t need to stand on its own as a category. It becomes an extension of something people have already bought into. The idea of watching a game “from somewhere else” is no longer abstract because they’ve already felt it in a room with other people. At home, it becomes a different version of the same experience, missing the crowd and the waiter, but gaining convenience and access.
The critical shift is in how Apple approaches rights. Trying to own sports outright is a losing strategy. The costs are too high, the competition too entrenched, and the fragmentation too deep. Apple has made smart moves with MLS, F1, and selective partnerships, but doubling down on exclusivity won’t unlock this. The better path is to work alongside the existing ecosystem. Install Cosm camera systems at major events, not as replacements for the broadcast but as an additional layer. Let networks and leagues sell that immersive feed as a premium product, with Apple taking a share for the technology and distribution. It’s additive rather than competitive, which makes it easier to scale and harder for partners to resist.
Apple has always been at its best when it connects behavior to technology in a way that feels inevitable in hindsight. Right now, Vision Pro still feels like a solution looking for a problem. The problem, or more accurately the opportunity, is already there in how people respond to immersive sports experiences. Cosm has shown that people will pay for that feeling. The hardware is close enough to deliver it at home. The gap is building the bridge between those two things in a way that feels continuous rather than experimental.
If Apple gets that right, the conversation around Vision Pro changes quickly. It stops being about whether people want to wear a headset and starts being about what they’re missing when they don’t. That’s the point where adoption tends to take care of itself.
[Will Carroll was an early writer at Baseball Prospectus who covers injuries at Under the Knife and talks about them on Injury Territory. He frequently co-hosts Downstream with Jason Snell on Relay.]
Harry McCracken has put together an amazing oral history of Apple’s earliest days. You should read the whole thing, but this anecdote from Chris Espinosa, who still works at Apple after all these years, is the part that made me laugh the most:
I was sitting there in the Byte Shop in Palo Alto on an Apple-1 writing BASIC programs, and this guy with a scraggly beard and no shoes came in and looked at me and conducted what I later understood to be the standard interview, which was “Who are you?” I said, “I’m Chris.” … Steve Jobs’s idea back then of recruiting was to grab a random-ass 14-year-old off the streets.
The fifth season of Apple TV’s “For All Mankind” premieres March 27—really, the evening of March 26 for those of us on the West Coast. For the last few years, Dan and I have been reviewing episodes on our “NASA Vending Machine” podcast and I’m excited to have the show back.
As always, “For All Mankind” is about taking big swings. There’s always a dramatic, history-changing moment or shocking twist that’s not too far away. Set in an alternate past where the Space Race kept going after the Soviets landed on the moon (yep!), season four took us to a 2003 where Mars colonists sought more autonomy by hijacking an asteroid.
This season, which takes place in 2012, is still primarily set on Mars, though there’s also some space adventure in the offing. Apple tech fans will enjoy that we’ve finally reached the iPhone era, though the iPhones on “For All Mankind” are a little thicker than the ones we remember, and they might actually be Newtons. There are also a lot of early-2010s iMacs on display.
While the first episode has to do a lot of work reminding you of what’s happened recently and setting up the new power dynamics at play this season, subsequent episodes get pretty intense, pretty fast. At times the show plays with police procedural, mystery story, even car-chase adventure… familiar TV genre stuff, except it’s all on Mars! Mireille Enos of “The Killing” plays an important new role as an investigator for the Mars Peacekeeping force who is suspicious that several different crimes might have been committed out on the surface. There are also a bunch of returning faces, some expected and some quite surprising. (And also, yes, Joel Kinnaman is still in the show even though Ed is now basically in his eighties.)
I’ve seen the first six episodes thus far, so I don’t know where it’s all going, but I’ve sure enjoyed the ride. “For All Mankind” continues to use its alt-history setting to tell dramatic, almost operatic stories that can also have disturbing relevance to current events in our own world.
Claude Code takes advantage of a real development environment.
I’m pretty skeptical that Apple’s new Siri-wrapped Gemini will be able to accurately and reliably assist with automation. Gemini will be the foundation of Apple’s foundation models, but there’s no there there. Apple has no well-documented, debuggable, inspectable system to execute automation with, unless you count ancient and inscrutable AppleScript, and you shouldn’t.
Sure, LLM chatbots will spit out code (even AppleScript!) if you ask them to, but it might not work. It gets substantially worse when you’re asking LLMs questions about Shortcuts.
Go ahead and ask any chatbot to describe how to make a Shortcut to perform some automation that you’ve been wanting to do and then try to assemble what it suggests. It’s extremely tedious, prone to user error, and isn’t in any way guaranteed to work even when it’s all put together.
Agents that hook into development environments are much better than a bare chatbot because they can inspect, run, and debug the code they are generating. They aren’t perfect, but if you have an agent like Claude Code hooked up to a development tool like VS Code and start describing some Python script you want, it’ll execute and iterate until the output is what you asked for.
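That execute-and-iterate loop is worth spelling out, because it’s the whole reason an agent in an IDE outperforms a bare chatbot. Here’s a minimal, hypothetical sketch of it in Python; `ask_model` is a stub standing in for a real model call, and the task and names are invented for illustration:

```python
# A toy sketch of the generate-run-check loop an agent performs.
# `ask_model` is a stub; a real agent would call an LLM API here.
import subprocess
import sys
import tempfile

def ask_model(prompt: str) -> str:
    # Stub: always returns one candidate script.
    return "print(sum(range(1, 11)))"

def run_snippet(code: str) -> tuple[int, str]:
    """Execute a candidate script and capture its exit status and output."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    result = subprocess.run([sys.executable, path], capture_output=True, text=True)
    return result.returncode, result.stdout.strip()

def agent(task: str, expected: str, max_tries: int = 3) -> str:
    prompt = task
    for _ in range(max_tries):
        code = ask_model(prompt)
        status, output = run_snippet(code)
        if status == 0 and output == expected:
            return code  # keep the version that demonstrably works
        # Feed the failure back so the next attempt can improve on it.
        prompt = f"{task}\nPrevious attempt printed {output!r}; fix it."
    raise RuntimeError("could not produce working code")

print(agent("Print the sum of 1 through 10", "55"))
```

The point isn’t the stub, it’s the loop: the agent can only iterate like this because it has a real environment to run code in and inspectable output to check. That’s exactly what Shortcuts doesn’t give anyone, human or AI.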
If humans don’t have access to documentation, actionable debug output, logging, the ability to bypass or ignore actions as part of testing, and the ability to copy and paste snippets of code, then how can the new Siri do it?
Right now, Shortcuts works with AI models by passing some input and then receiving the output. When something goes to the model, the model transforms the data, and delivers a result back to Shortcuts. That’s a non-deterministic workflow, so any change to the model, or even just randomness in general, can produce different output. This means you can’t reliably troubleshoot or adjust it without introducing uncertainty in what new outputs you’ll get.
When working with an agent to assemble automation in an IDE, the code it builds is deterministic, so it will keep working even if the model changes. Not everything you want to automate requires LLM functionality when it runs, but building the deterministic version of a workflow also shouldn’t take hours of labor.
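The difference can be sketched in a few lines of Python. The function names here are made up, and `random.shuffle` merely stands in for model nondeterminism, but the contrast is the real one:

```python
import random

def model_reorder(items):
    # Stand-in for a Shortcuts "send it to the model" step: for identical
    # input, the output can vary between runs (or between model versions).
    shuffled = list(items)
    random.shuffle(shuffled)
    return shuffled

def generated_reorder(items):
    # Code an agent wrote and verified once: plain Python, no model at
    # runtime, so the same input always produces the same output.
    return sorted(items, key=str.lower)

data = ["banana", "Apple", "cherry"]
print(model_reorder(data))      # may differ from run to run
print(generated_reorder(data))  # always ['Apple', 'banana', 'cherry']
```

Once the model has done its job of writing `generated_reorder`, it can leave the room; the automation no longer depends on it behaving the same way tomorrow.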
I really hope that the magic of new Siri isn’t going to be that it will just do things with bare actions and App Intents, magically, without any user-accessible process, or as a blob inside of a Shortcut you need to make. If I ask Siri to reorder a list, and it doesn’t do it correctly, I want to be able to access the scaffolding it created to see what went wrong, not just keep asking Siri to do it again in slightly different ways until I get output I like.
If Siri doesn’t produce anything inspectable, or it produces a Shortcut, then there’s not much work humans or AI can do to fix things.
AI cut below the rest
The problem the Shortcuts app is supposed to solve has never been solved, because no one really knows how to use Shortcuts unless they become a Shortcuts expert. Shortcuts is user-friendly in appearance, but not in practice. It’s meant to welcome people who don’t know anything about programming with its friendly drag-and-drop interface and searchable actions panel.
Unfortunately, the names for actions don’t always say what they do, and the documentation is often a vague piece of filler that’s frequently reused for more than one Shortcut action. Even experienced programmers can get flummoxed when they try to search the available actions for seemingly standard functions, like reversing a list.
Magic connections are magic, until your script gets any longer than the length of your screen and you need to start dragging actions around, inevitably breaking connections and making unintended ones. With a text-based script you’d have to keep track of the names and spelling of your variables, but they don’t change out from under you if you add more lines of code above or below them.
You can’t do one of the simplest and most useful things in scripting, which is commenting out (ignoring/bypassing) something to test or evaluate alternatives.
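In any text-based script, bypassing a step for a test run is a one-character change. A toy example (the file names are invented):

```python
records = ["b.txt", "a.txt", "c.txt"]

records.sort()        # the step under test
# records.reverse()   # commented out: bypassed for this run, kept for the next

print(records)        # → ['a.txt', 'b.txt', 'c.txt']
```

Uncomment the line and the alternative behavior is back, with no dragging, no broken connections, and no rebuilding. Shortcuts offers no equivalent of that `#`.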
A lot of the time, when people are using Shortcuts, they’re relying heavily on the Run Shell Script action to do actual programming that lets them write normal, vanilla code, or ssh’ing into a server from iOS to do the same thing. It’s nice that Shortcuts can do that, but shell scripts aren’t cross-platform, and ssh’ing into a server is in no way accomplishing Shortcuts’ mission.
Without logging, you can’t ask Siri why your automation that was supposed to run in the middle of the night didn’t run. Maybe it was a permissions issue that was never raised when the shortcut was created. You, and Siri, just don’t know.
AI rising tide lifts all boats
Again, Apple doesn’t have to do these things just for humans, or just for Siri. They are in no way mutually exclusive.
If the concern is that Shortcuts shouldn’t be like a programming language, with tracebacks and logs that would put off “normal people,” then just remember that “normal people” don’t really use Shortcuts. They ask a chatbot to just do it, and Siri, as Apple’s chatbot, could take advantage of those fiddly programming bits and perform its role better, in a way that was auditable.
I have seen people make frantic posts on Mastodon about how AI is deskilling programmers, but the beauty of Shortcuts is that Apple already applies the deskilling at the factory.
[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]