Is it age? Obliviousness? Some delayed dulling of mental acuity from my heart surgery last November? Or could it be a Shazaam/Berenstain Bears scenario? All I know is that the first version of this article took Apple to task for not including a menu in the Phone app for macOS 26 Tahoe that, in fact, was there all along.
The Phone app introduced in Tahoe for the Mac (and on iPadOS 26) made it much easier to deal with telephony across your devices. Instead of cramming phone features relayed through your iPhone into FaceTime, you can use a full-fledged Phone app. I have been a big fan of it, since I spend most of my working hours standing in front of a desktop Mac. I far prefer using my hard-wired USB headset for calls over AirPods or, gasp, holding my iPhone to my ear!2
After months of using the Phone app successfully, I suddenly couldn’t get audio input to work. The Phone app has minimal controls and—I thought—no option to select audio input or output, which, in other apps, means the system selection rules apply.
I use several Rogue Amoeba apps and installed the latest version of SoundSource to see if that helped. Maybe there was an audio routing problem? But no: no settings were active and quitting the app didn’t change the input problem.
I can see myself talking.
Checking the Sound pane in System Settings, I could see that the correct microphone was selected for input and, crucially, that the “Input level” meter registered my speech as I tested it. I could also be heard on Zoom and Google Meet, and could record audio locally.
Clearly, something else was at work! I called up my old pal Jeff to do some testing. (He’s also a technology writer, so we trade off troubleshooting.) After trying several things, I launched FaceTime by clicking the camera icon on the Phone lozenge that appears by default in the upper-right corner of your display during a call.
Jeff still couldn’t hear me! Interesting. FaceTime’s input and output controls are ancient and mysterious. Audio and video are both controlled from the (inappropriately named) Video menu, and it turned out that my audio input was set to a screen-sharing program I no longer use. I changed it to the microphone that was set as my system default, and suddenly my voice rang out.
The ridiculous FaceTime Video menu, which contains audio settings for FaceTime and Phone (undocumented), is so long I’m showing it sideways.
“Ah ha!” I thought to myself—and said to Jeff and to editor Jason Snell—“I am a very clever chap and should document this as an article for Six Colors,” which Jason agreed to. In editing the article, Jason said, “Isn’t there a Video menu in the Phone app?” Despite my recollection that there was none, and my recent checking for such a menu, there it was on my Mac. It must have been there all along, and, Westworld-like, it didn’t look like anything to me.
In my defense: Why, why, is there a Video menu in an app called Phone that doesn’t use video?3 Apple doesn’t document this in the Phone part of its Tahoe manual. And I wasn’t the only one unaware of it: Jason and Jeff didn’t know it existed either, until Jason was pushed past the Somebody Else’s Problem Field level of awareness.
I guess this is how I keep humble. Despite decades of using a Mac, I can still miss a Video menu in an audio app.
No, I went back and checked 26.0 release videos and screen captures. It was there. ↩
Let us not even consider the possibility of using speakerphone mode, despite how well the iPhone handles audio input and noise cancellation. ↩
You can launch FaceTime Video calls from the Phone app, but the Phone app has no video features. ↩
When you open Transit, you’re presented with a list of nearby routes. You can pin your favorites to the top (left). A map view and handy slider let you know how far your bus is from the stop you’ve selected (middle). When your stop is near, Transit zooms in and gives you time to signal that you’d like to get off (right).
I’ve been a fan of the Transit app for a long time. Apple and Google Maps can provide similar information about how and when to catch a bus or a train, but Transit has always focused more tightly on those modes, with lots of real-time data, and a social component, if you’re into that sort of thing.
Tap on a route you’re interested in to see how long you’ll need to wait for the next bus or train. Scroll right to see more departures. You can also select your destination stop, and use the Go button to plot your trip.
It had been a long time since I’d used Transit, so updates hadn’t been on my radar until I found myself in Northern California recently. I needed to use BART, the AC Transit bus system, and San Francisco’s MUNI, all in the course of a weeklong trip. And when I opened Transit, I discovered that a lot of things about using the app got better with the release of version 6.0 last year.
Transit has always been best as a “live” app, the kind you want by your side when you need to know if you’ve missed the bus, or how long it will be before the next one comes. It works great for route-planning, too, but so do the “big two” mapping apps. Transit also excels when you’re on a train or bus, watching for a stop.
What’s new are the big, bright boxes that tell you how long you need to wait for your transit vehicle, offered with real-time data, when available. It’s also easier to scroll a list of stops your vehicle will make, because the list is bigger and bolder onscreen. As before, you can use the Go feature to plot your route, live, and have Transit tell you where and when to get off, with any combination of phone and Apple Watch notifications.
If you’re planning a trip, Transit offers a lot of preferences you can adjust, whether it’s limiting the amount of walking you need to do, or getting there quickly, whatever the mix of modes. These have been beefed up, but they’re a little hard to find for the beginner.
There’s always been a social component to Transit, from usage badges to aggregated data that gives the app more information about the routes its users frequent. In the most recent version, there are also poll questions, meant to gather information about vehicles, stops, and safety. Multiple-choice questions pop up when you’re on a bus or train, and it’s easy to either ignore them or participate. If you answer two or three, Transit will ask if it can send you more, or if you’d rather not. It might be annoying to some riders, but it’s a way to pass the time while you ride, and the questions all appear on the app’s screen rather than being pushed to your phone as notifications, which feels like a nice balance.
Apple lets you store payment cards in two places on your devices. Apple Pay is for point-of-sale (POS) transactions at a payment terminal or the like, as well as for payments in apps or in Safari. AutoFill Cards, as the menu item is labeled, lets you, er, automatically fill in card details within Safari and in any browser that supports Apple’s autofill framework.
Why does Apple have two places to store credit and debit cards? Why is the way you set them up similar, but not identical? When can you use one and not the other?
I thought I knew the full answer prior to researching my new book, Take Control of Wallet, but it took a lot of careful reading and testing to understand what I had been missing.
(Yes, I wrote an entire book about Wallet. You may laugh! But Take Control Books publisher Joe Kissell and I agreed on it after I realized how many pain points I kept finding as an ostensibly experienced user. Wallet has a lot of unexplored territory for many people—including me!)
Yes, Apple can still learn from Google and Samsung. (Shutterstock)
For the past decade and change, I’ve tested and reviewed a large number of Android phones, augmenting the iPhone expertise I’ve built up since seeing the very first iPhone live and in person at Macworld Expo 2007. And while that parade of phones has included some winning devices, my overall impression of the Android experience falls somewhere along the lines of “How do people live like this?”
That’s largely a reflection of the haphazard way Android phones receive their software updates. Some, like Google’s own Pixel devices, get new features right away, while others see updates once phone makers and wireless carriers are good and ready to release them. As someone used to downloading iOS updates the moment they’re available, that throws me. Android partisans tell me I’m being silly — sometimes politely, sometimes less so — and they may well have a point.
But even if some elements of the Android experience don’t land with me, I’d have to be a pretty narrow-minded person not to appreciate the features that do deliver. Android phones get a lot of things right — and some of those are missing in action when it comes to the iPhone.
Look, Apple didn’t sell more than $201 billion in iPhones during its 2024 fiscal year by listening to my advice, and I certainly don’t expect the company to start casting sideways glances toward Android phone makers to surreptitiously gather ideas on how to spruce up the next iPhones. But I do think there’s some merit in looking at areas where Android phones excel and how adopting something similar might give the iPhone a boost.
After all, we have some evidence that this already happens to some degree. Google added a rather distinctive horizontal camera bar to the Pixel 6 back in 2021. And while I don’t think the iPhone 17 Pro’s extended camera array is a direct copy, it certainly seems to draw some inspiration from what the Pixel has offered for years. Bringing the new look to the iPhone also allowed Apple to shift around internal components so that the current iPhone can benefit from a bigger battery and a vapor chamber, so there are benefits to adopting, adapting and improving.
So here’s what I’d flag from my time looking at Android phones: features I’d like to see come to the iPhone.
Despite some bumps in the road for its AI-driven features on the consumer side, Apple’s not slowing down on integrating the technology into its products. Today the company announced the latest update to its Xcode developer tool, which brings support for agentic AI coding.
Adding the agentic model gives these AI tools deeper access to and more power over your projects. For example, an agent can look at and parse your project’s file structure to gather more context, or even build and test the project on its own. Agents also have access to the latest documentation, allowing them to take advantage of the most recent APIs. Perhaps most impressively, these tools continue to iterate, repeatedly testing, verifying, and fixing errors until the project builds successfully.
This feature builds on top of the existing intelligence-powered tools and integrations that Apple introduced in Xcode 26 last year. Out of the box, Xcode 26.3 has built-in support for Claude Agent and ChatGPT’s Codex, allowing users to log in with their accounts or API tokens. But because this system is underpinned by Model Context Protocol (MCP), any other agent that supports the open standard can be integrated as well.
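To give a rough sense of what that open standard looks like on the other side of the connection, here’s a minimal sketch of an MCP tool server in Python, using the official Model Context Protocol SDK (the mcp package). The server name and tool here are hypothetical, and Apple hasn’t documented exactly how Xcode discovers or launches such servers, so treat this as an illustration of the protocol rather than of Xcode’s integration:

```python
# A hypothetical MCP tool server, built with the official Model
# Context Protocol Python SDK (pip install "mcp").
from pathlib import Path

from mcp.server.fastmcp import FastMCP

# The server name is arbitrary; an MCP host (an agent harness,
# an editor, etc.) lists this server's tools and invokes them.
mcp = FastMCP("project-helper")

@mcp.tool()
def count_swift_files(project_dir: str) -> int:
    """Count the Swift source files beneath a project directory."""
    return sum(1 for _ in Path(project_dir).rglob("*.swift"))

if __name__ == "__main__":
    # By default this speaks MCP over stdio, the transport most
    # desktop hosts use to launch and talk to local servers.
    mcp.run()
```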
Xcode 26.3’s release candidate is available for download today.
[Dan Moren is the East Coast Bureau Chief of Six Colors, as well as an author, podcaster, and two-time Jeopardy! champion. You can find him on Mastodon at @dmoren@zeppelin.flights or reach him by email at dan@sixcolors.com. His latest novel, the sci-fi spy thriller The Armageddon Protocol, is out now.]
It’s Super Bowl week and the start of the Olympics, so Will Carroll joins Jason to discuss Peacock’s almost-make-or-break moment, streaming fights and wrestling, and the fate of a clutch of Regional Sports Networks and other cable channels.
Six Colors subscriber Mihir writes in with a Photos question:
How do I delete just the RAW file in a RAW+JPEG pair from my photos library on my iPad or my iPhone?
The short answer: you can’t. Not directly, anyway. And it’s not just an iOS or iPadOS limitation—macOS won’t let you do it either.
I can understand why Mihir asks. An image in RAW format can occupy several times the storage of its JPEG equivalent. This has to do with the nature of the image data being stored, as I explain below.
There are good reasons to capture as RAW and good reasons to discard those formats later. I’ll go through the background of RAW, and then provide a workaround to Apple’s missing piece.
Pre-post-processed camera sensor data
The RAW format used by digital cameras is often written in all caps, probably to distinguish it from the adjective form, but it’s not an acronym, nor is it a format in the traditional sense.
RAW means the file contains “raw,” or unprocessed, sensor data from your camera. To produce a JPEG, TIFF, or other format, a digital camera—including your iPhone—performs post-processing to produce an image that’s immediately usable. This can involve making significant changes to dynamic range and white balance, or even combining multiple images as a form of computational photography, as Apple does with iPhone photos.
RAW+JPEG is a common way to get a high-resolution processed version and the editable original sensor data in a single package in Photos.
This makes RAW the digital equivalent of a film negative: it’s typically larger than a post-processed file, and contains information that hasn’t yet been shaved down or squeezed into a presentable output. This gives you more flexibility when editing, but it requires processing to be usable for design, printing, or sharing.
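To make that processing step concrete, here’s a minimal sketch in Python using the third-party rawpy and imageio libraries (my choice for illustration; your camera and Photos do the equivalent work internally). It runs the default pipeline of demosaicing, white balance, and conversion to 8-bit sRGB, then saves a viewable JPEG:

```python
# Minimal raw-to-JPEG conversion sketch using the third-party
# rawpy (a LibRaw wrapper) and imageio packages.
import rawpy
import imageio.v3 as iio

# Open a camera raw file (Nikon's .nef here; .cr3, .arw, .dng,
# and the rest work the same way) and run the default processing
# pipeline: demosaic, white balance, convert to 8-bit sRGB.
with rawpy.imread("photo.nef") as raw:
    rgb = raw.postprocess()

# The result is an ordinary height-by-width-by-3 pixel array,
# ready to save in any presentable format.
iio.imwrite("photo.jpg", rgb)
```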
Many cameras let you set a RAW export that includes a JPEG preview usable on its own. The JPEG is the camera’s best post-processed output from the RAW, and was originally provided because desktop (and later mobile) software either didn’t support RAW or didn’t keep up with new variants. Without the JPEG, importing the RAW file by itself would have been much less useful.
There’s no single RAW standard—Canon, Nikon, Sony, and others each have their own proprietary versions.
Because the information comes more or less directly from the sensor without intermediate steps, it contains much more data that looks like noise, as the variation between adjacent sensor elements is retained rather than smoothed away. So even for RAW formats that compress data—not all do—the files will be larger than final images intended for viewing or printing. RAW will always be much larger than a corresponding JPEG file, as JPEG is lossy by nature and discards some information even at the maximum quality setting.
After camera makers began supplying RAW output, at first often readable only in apps they released themselves, photo-management and image-editing tools added RAW processing filters to meet the needs of digital photographers. Every professional app supports importing RAW, including Photoshop, Lightroom, Pixelmator Pro, Capture One, and DxO PhotoLab. And Photos!
Image-editing apps generally treat RAW as an import format: you view a preview, then apply changes before it is imported into an editing environment where you can work on the resulting image. Photo-management apps with built-in editing tools, like Photos and Lightroom, typically retain the original RAW image, and allow you to apply modifications on top. This provides much more flexibility in achieving your desired outcome.
One image unit, indivisible
When you import RAW+JPEG pairs into Photos, Apple treats them as a single, indivisible unit. You can choose which version to use as the basis for editing (Image: Use RAW as Original or Use JPEG as Original), but you cannot discard one half of the pair while keeping the other. Delete the image, and both files are thrown away.
Apple built Photos around a lossless workflow. This means the original imported file is never modified: changes are layered on top and previewed, and can be reverted to the source image. You’d think Apple might engineer an override in a case like this, but apparently not.
If you need to reclaim the storage space those RAW files occupy, it’s only possible on a Mac, and it requires exporting, deleting, and re-importing.
Follow these steps if you haven’t made any modifications that you want to keep for any or all of your RAW+JPEG pairs:
Make sure to check Export IPTC as XMP to create a sidecar file with metadata you’ve added to an image.
Select the images you want to retain in JPEG format.
Choose File: Export: Export Unmodified Originals. In the export dialog, enable IPTC as XMP—this creates a sidecar file containing your metadata (titles, keywords, locations, descriptions, etc.). Without that, you’ll lose any metadata you added.
Choose a destination and click Export.
In the resulting folder, each RAW+JPEG pair is represented by three files: the RAW file, the JPEG image, and an XMP sidecar.
Delete the RAW file or files from that folder.1 (If you don’t, Photos treats the two as a pair and merges them when re-importing. For a large batch, the script sketched after these steps can help.)
Back in Photos, delete the original RAW+JPEG pairs. These are moved to the Recently Deleted album (see below).
Reimport the folder containing just the JPEG and XMP sidecar. Photos will apply metadata from the sidecar file automatically.
Delete the folder to free up space.
The export files are split into three parts. You discard the RAW files before re-importing.
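If you have a lot of pairs, deleting the RAW halves by hand gets tedious. Here’s a minimal Python sketch that sweeps an export folder (the path is a placeholder) and removes files with common RAW extensions, keeping the JPEGs and XMP sidecars; adjust the extension list for your camera:

```python
# Remove RAW files from a Photos export folder, keeping JPEG + XMP.
# The folder path is a placeholder; extend RAW_EXTENSIONS as needed.
from pathlib import Path

RAW_EXTENSIONS = {".cr2", ".cr3", ".nef", ".raf", ".arw", ".dng"}
export_folder = Path.home() / "Desktop" / "Photos Export"

for file in export_folder.iterdir():
    if file.suffix.lower() in RAW_EXTENSIONS:
        print(f"Deleting {file.name}")
        file.unlink()  # permanent; this bypasses the Trash
```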
Of course, you can use the same process to jettison the JPEG and retain the RAW-formatted file.
When you delete files, if you’re sure that you have all the backups you need, you can click the Recently Deleted album in the Photos sidebar, authenticate if prompted, and click Delete All. (Or select images and click Delete X Items.) This removes the images from your Mac, iCloud Photos, and all linked devices immediately and forever. Use wisely!
Now, I noted above that this works for images that you haven’t modified in Photos. As part of its lossless workflow, exporting unmodified originals means you lose any changes unless you follow these steps:
Before step 5 above, return to each modified image in Photos.
The re-imported JPEG image should appear next to the RAW+JPEG file, because of the timestamp, which is preserved from the XMP data. Select the RAW+JPEG file, and press Command-Shift-C (Image: Copy Edits). This copies any modifications.
Now select the re-imported JPEG, and press Command-Shift-V (Image: Paste Edits).2 This applies those changes.
Proceed to delete the original RAW+JPEG file.
Because of how iCloud Photos syncs images, you may want to delete all the images you intended to first, and make sure those images have moved to the Recently Deleted folder on your devices before you re-import them.
While Mihir specifically asked about iOS and iPadOS, the export-delete-reimport workflow requires the Finder and file management capabilities that only macOS provides.
For more expert advice on Photos, you should obtain a copy of Jason Snell’s Take Control of Photos, which addresses all of the app’s features and vagaries.
A feature request, not a bug
People have been asking Apple to add a “split RAW+JPEG pair” or “delete RAW only” feature for many years, and the company hasn’t budged, likely because of its focus on lossless workflows.
In the meantime, if you find storing RAW+JPEG is taking you too close to a full volume, you could shoot RAW only on your camera and let Photos generate a JPEG preview. If you later want a JPEG, you can export one from the RAW file and re-import it. Or you might switch among RAW+JPEG, RAW, and JPEG shooting profiles on your camera, as many support user-defined modes that include output formats.
[Got a question for the column? You can email glenn@sixcolors.com or use /glenn in our subscriber-only Discord community.]
While camera makers use several different RAW file extensions, these files should appear as “raw” under the Kind column in the Finder. If not, common extensions include .cr2, .cr3, .nef, .raf, .arw, and .dng. Failing that, look for any file that doesn’t end with .jpg/.jpeg or .xmp. ↩
You may have Command-Shift-V assigned to another shortcut (I use it with PasteBot), in which case you can use the Edit: Paste Edits menu item or create a distinct shortcut for it. ↩
We break down Apple’s latest financial results (including the potential supply-chain storm brewing on the horizon) and then discuss the difficult roll-out of Apple’s new Creator Studio bundle.
Tim Cook enjoys a night out, Apple ships new AirTags, and other things happened.
The real “Triumph of the Will” was sitting through the whole movie
In his continuing audition for the role of poster child for a public campaign against tone deafness, Tim Cook attended the premiere of a hagiography of the First Lady to hobnob with the president on the same day ICE agents murdered a protester in Minneapolis.
Remember a decade ago, when our biggest concern about Apple was that it made crappy laptop keyboards?
Ah, yes, the good old days.
Here in 2026, hardware design seems to be the one thing that the company is doing right. I’ve got an M4 MacBook Air, and it’s honestly hard to find anything wrong with it. That iPhone 17 Pro in orange? A thing of beauty. Not to get ahead of Jason’s annual report card results, but it feels like we’ve kind of run aground after that category.
Software quality? Ehhhh. Social impact? Hoo boy.
Look, I don’t want to be a bummer. I’m dropping the bit here for a hot minute, because it’s hard to crack jokes about Tim Cook being a robot or the dread god Glog-Raggopth waking from his slumber when they both kind of feel a bit too close to home.
To my mind, the company hasn’t been headed in this wrong of a direction since they started driving with Apple Maps…
With over 5,000 five-star reviews, Magic Lasso Adblock is simply the best Safari ad blocker for your iPhone, iPad, and Mac.
And with the new App Ad Blocking feature in v5.0, it extends the powerful Safari and YouTube ad blocking protection to all apps, including news apps, social media, games, and other browsers like Chrome and Firefox.
As was foretold (in last quarter’s corporate guidance), on Thursday Apple reported its biggest quarter ever. The holiday quarters are always Apple’s biggest, and this was no exception. It offered the most revenue ($143.8B) and most iPhone revenue ($85.3B) of any financial quarter in Apple’s history.
Suffice it to say that the iPhone 17 family is a hit.
“This is the strongest iPhone lineup we’ve ever had, and by far the most popular,” Apple CEO Tim Cook said during his conference call with analysts. As for the quarter itself? “It exceeded our expectations, to say the least.” Spoken like a man whose most popular product, the one vital to his company’s existence, grew 23% from the year-ago quarter.
Even more interesting, though, is Apple’s suggestion that it’s still selling the iPhone 17 about as fast as it can make them—or to be more specific, about as fast as TSMC can make cutting-edge 3nm chips to power them, per Cook:
We exited the December quarter with very lean channel inventory due to that staggering level of demand, and based on that, we’re in a supply chase mode to meet the very high levels of customer demand. We are currently constrained, and at this point, it’s difficult to predict when supply and demand will balance. The constraints that we have are driven by the availability of the advanced nodes that our SOCs are produced on, and at this time, we’re seeing less flexibility in the supply chain than normal, partly because of our increased demand that I just spoke about.
Those details are really interesting. Back during the height of the pandemic, sales were constrained because Apple lacked access to “legacy nodes”—chips made on older processes for stuff like Wi-Fi and Bluetooth. That is definitely not the case now, when it’s the “advanced nodes” of 3nm chips at TSMC that are just not being built fast enough because demand was much higher than Apple expected.
This also extends a long-standing story that the Chinese market really likes a new-looking iPhone. Overall, Apple’s revenue in China was up 38%. Cook said that traffic in Chinese Apple Stores grew by “strong double digits,” and cited surveys saying that, in urban China last quarter, the iPad was the top-selling tablet and the MacBook Air and Mac mini were the top-selling laptop and desktop. Cook, a longtime proponent of Apple’s business in China, seems thrilled.
Department of the Tough Compare
Mac revenue was down 7% in the quarter, the poorest performance of all Apple’s categories. But it’s hard to be that down about the results, because not only did the Mac still generate $8.4B in revenue and reach an all-time high in its overall installed base, but this was all happening in a quarter that is the proverbial “tough compare”—since Apple released the M4 MacBook Pro, Mac mini, and iMac in the year-ago quarter, and only the low-end M5 MacBook Pro in this quarter.
Full credit to analyst Michael Ng of Goldman Sachs for the most creative way possible of trying to get Apple to reveal its future product strategies: Ng asked Apple CFO Kevan Parekh if there would be any tough comparisons due in the upcoming quarter, to which Parekh replied, “There’s nothing that rises to that kind of color that we’d outline in the outlook.”
Let me translate this for you: Ng is wondering if, perhaps, Apple is going to release some nice new Macs this quarter that will mean that it’s not a “tough compare” versus Q2 of 2025. Parekh replied by essentially pointing at his previous statement and saying that the dog did not, in fact, bark.
Look, we know there will be new MacBook Pros eventually, and probably pretty soon. Maybe they’ll help with Q2 Mac sales, though at this point they’d only be able to contribute for about half of the quarter. Still, a gold star to Ng for trying to logic his way into getting Parekh to reveal things about future product releases.
The storm clouds of financial headwinds… are called off
Apple posted a company gross margin of 48.2%, based on a 40.7% products margin and an astounding 76.5% margin on services. This was actually above the high end of Apple’s previous guidance on margin. This fact left several analysts on the call flabbergasted, none more so than Ben Reitzes of Mellius:
You know, I’m pretty shocked. I got to hand it to you, Tim, that you’re able to do 48% to 49%. What’s really going on there? How are you doing that with… the [memory] prices?
Parekh’s answer was basically that Apple tended to sell more of its high-margin products than its lower-margin ones during the quarter, which pushed margin up. What really impressed the analysts was his insistence that even in this upcoming quarter, when memory price issues are expected to become even more serious, Apple feels “pretty good” about its guidance of another 48% to 49% margin quarter.
Looking more broadly at Apple’s forecast, the company says the second quarter should offer 13% to 16% growth versus the year-ago quarter. Considering that the Q2 2025 revenue number was $95.4B, this means Apple expects to generate somewhere between $108B and $111B in revenue next quarter. That’s just a staggering number, because it suggests that even Apple’s boring quarters are going to routinely generate more than $100B in revenue. (For the record, last year’s fourth quarter was the first non-holiday quarter with more than $100B in Apple revenue. This would be the second. There may be no going back.)
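For those checking my math, that range is just the guidance applied to last year’s number, rounded to the nearest billion:

\[
\$95.4\,\mathrm{B} \times 1.13 \approx \$107.8\,\mathrm{B},
\qquad
\$95.4\,\mathrm{B} \times 1.16 \approx \$110.7\,\mathrm{B}.
\]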
Odds and ends
A few other notes about the numbers and call before I wrap it up:
The iPad, bolstered by the A16 base iPad and the M5 iPad Pro, was up 6%.
Wearables was down 2%, marking 10 straight quarters of year-over-year decline. (I suspect the softness in this category is why Apple is reportedly planning on launching several new home-based products, including a screen-based controller and a security camera.) However, here’s an interesting tidbit: Apple said it couldn’t make AirPods Pro 3 fast enough to meet demand, and that it believes the category would have grown had it not been for that supply constraint.
Forget profits and revenues. “This quarter set an all-time record for operating cash flow, coming in at $53.9 billion,” Parekh reported. Accounting nerds, this is your stand-up-and-cheer moment. The cash must flow!
Everyone wants to know more about Apple’s AI deal with Google, but Apple’s not talking. “We aren’t going to provide any details on our arrangement and collaboration with Google,” Parekh said. Cook emphasized that it should be thought of “as a collaboration,” rather than Google just riding in and saving Apple’s bacon. When Ben Reitzes of Mellius tried to get more out of Cook, only to be stonewalled, he replied, “Bummer. Okay, I tried.” “You did,” the CEO said through a squall of laughter.
Time for a somewhat vertiginous push through a row of poodles.
I got a chance to watch Apple’s new “Top Dogs” immersive documentary this week before its release Friday. It’s about 30 minutes total split into two 15-minute episodes, and takes you behind the scenes (and out on the main floor) at the world-famous Crufts dog show.
It’s a pretty good example of all the issues that creators of immersive video are still working out. There are some amazing moments in “Top Dogs,” mostly when you’re watching a dog and their handler close up, or when you’re in the arena in Birmingham, England, watching the dog show. Unfortunately, there are also a bunch of pretty shaky moments: distracting quick cuts, some vertigo-generating dramatic camera moves, and a reliance (albeit understandable) on non-immersive footage in order to make the narrative make sense despite the lack of the right immersive camera angle.
The more I watch immersive content, the more I realize that it requires patience to help immerse you in the scene. “Top Dogs” lacks patience, even when it pads the main dog-show narrative with side quests to Flyball and agility competitions. I found myself wanting to watch Flyball or agility for a while, just to understand how it worked, but the documentary isn’t really interested in lingering on anything.
So, does “Top Dogs” have some fun fluffy dog action? Yes! I enjoyed watching some remarkable specimens of various dog breeds, even if there was not a single Boxer in sight. But as an immersive project, I found it more representative of an approach that’s probably not the right way forward for this style of video.
Every quarter after releasing financial results, Apple CEO Tim Cook and CFO Kevan Parekh hop on a conference call with analysts to detail the quarter gone by, give a peek at what’s to come, and maybe brag a little about setting an all-time record or two. This is Six Colors’s transcript of the call for January 29, 2026.
Company revenue was up 16% versus the year-ago quarter. iPhone and Services revenue also set all-time records. China growth was up 38% after four years of flat-to-down growth. iPad revenue was up 6% while Mac revenue was down 7% in a quiet quarter.
At 2pm Pacific/5pm Eastern, Apple will spend an hour on the phone with financial industry analysts. We’ll have our usual live transcript, followed at 5pm Pacific/8pm Eastern by our own live analysis on YouTube:
And now, to help you visualize what Apple just announced, here is our traditional barrage of charts and graphs:
Apple confirmed to Reuters today that it has acquired an Israeli startup called Q.ai, which uses artificial intelligence technology to analyze audio:
Apple did not disclose terms of the deal or what Q.ai’s technology will be used for, but said the startup has worked on new applications of machine learning to help devices understand whispered speech and to enhance audio in challenging environments. In a statement, [CEO Aviad] Maizels said “joining Apple opens extraordinary possibilities for pushing boundaries and realizing the full potential of what we’ve created, and we’re thrilled to bring these experiences to people everywhere.”
Maizels was also the founder of PrimeSense, which Apple acquired back in 2013 and used as part of the basis for Face ID. Two of Q.ai’s co-founders will be joining the company as well. Though Apple didn’t confirm the price tag, the Financial Times has reported (paywalled) that the deal was worth almost $2 billion, which would make it the company’s second biggest acquisition after the 2014 purchase of Beats for $3 billion.1
There were lots of rumors in the last year that Apple might purchase an AI company to offset the challenges it’s had in the market; in the end, Apple opted to partner with Google to provide foundation models for its technology.
Overall, this feels more like a traditional Apple acquisition: a smaller company, more targeted in its use case, with talented staff that Apple can bring onboard. Apple’s already done plenty with machine learning around audio (including the translation features of the AirPods Pro and different audio modes in videos shot on the iPhone), and this would set it up well for improving everything from microphone performance to some of its live captioning features—in a future smart glasses product, for example.
The icons are new, as are the marketing phrases stuffed into the app names.
I gave some first impressions on Apple’s new Creator Studio bundle earlier this week, but one thing you don’t get to see when things are under embargo is how it all rolls out to the general public, which it did on Wednesday.
What strikes me most about it is how even Apple is stuck with the App Store and its limitations. Developers are quite familiar with how limited Apple’s back-end systems are and how they can inflict frustration on developers and customers alike. But it’s another level when the same thing happens to Apple and its own apps.
For example, yesterday the old versions of Numbers, Pages, and Keynote were updated, apparently with the only new feature being a dialog box that appears at launch saying “Use the New Version of Numbers,” with a button to Open New Version.
Why? Seems like Apple has chosen this moment to unify the iPad and Mac versions of Numbers and its fellow apps in a single entity in the App Store, and that leaves the old versions high and dry. The right thing to do here would be to gracefully migrate that Mac app and merge it with the other apps, but apparently not even Apple can convince itself to prioritize a feature that would make the launch of its new suite a little less clunky.
I’m also struck by the fact that Apple has had to do the App Store trick of attaching subtitles to the names of every app it makes, because the design of the App Store has somehow made stuffing keywords into titles a best practice. So it’s not Final Cut Pro anymore, it’s “Final Cut Pro: Create Video.” And Numbers is “Numbers: Make Spreadsheets.”
Also confusing: Double apps! Right now, I see two versions of Final Cut Pro in the Mac App Store. They’re actually the same app, but one is the original app that someone might have purchased in the past, and the other is a bundled edition that is tied to the iPad version and available by subscription. They have different icons, but otherwise seem to be identical, version 12.0.
I’ve complained a bunch about Apple converting its free iWork apps into freemium apps with paid upsells (to a bundle that’s a bad fit for many users), but that was before I saw that if you don’t subscribe to the bundle, the new versions of Pages, Numbers, and Keynote include ads attached to interface elements promoting the Creator Studio. The first-launch screen and two prominent menu items under the application menu push the Creator Studio. There’s also a prominent toolbar button for the suite-included Content Hub so that you can browse all that great premium clip media—but if you insert it, it’s watermarked, because you haven’t subscribed.
Making a new file? You’ll see an upsell for Creator Studio.
Finally, a word about the suite’s new app icons, which I specifically did not address in my review. It’s very easy and fun to complain about icons because art is entirely subjective. You can like what you like. I will say this: The early days of OS X icons were a reaction to the severely limited icon palette of classic Mac OS, so apps often offered incredibly detailed, photorealistic icons to represent themselves. It felt so modern. And many of those icons were beautiful.
But let’s forget about art for a moment and consider utility. Where do we interact with icons? For me, it’s the home screen on my iPad and iPhone, and the Dock and Spotlight launcher on my Mac. In every context, these icons are very small, far too small for a gorgeous skeuomorphic icon.
Clearly, the brief given to the designers of the new Creator Studio icon set was to make them all differentiated by color and shape. While I don’t love a lot of the choices they made—there are a lot of metaphors that seem to have drifted so far from reality that they make no sense anymore—I have to admit that they are all different silhouettes and colors, which means I know that the green bar chart that looks like it’s giving me the finger is Numbers when it’s in my Dock.
Should Apple aspire to better than utility, even when it comes to something as far away from mission-critical as app icons? Yes, it should. And I don’t think the people who are complaining about app icons are really complaining about app icons. They’re pointing out a symptom of a larger disease, which is Apple losing its way when it comes to usability and software design.
But if you’ll forgive me, I find it hard to get too worked up about icon designs when Apple is putting ads for a professional creative suite in its free productivity apps. Which is the greater offense to the user experience?
The original Mac arrived 42 years ago. Draw a line from that event to this week, in which credible reports suggest that Apple is finally getting close to fulfilling many of the promises it made back in 2024 regarding adding intelligent agents to its devices. Sure, it took licensing Google Gemini to get it done, but we might be on the precipice of Apple Intelligence being what Apple said it might be almost two years ago.
The more I think about it, the more I think that Apple Intelligence might actually be the latest attempt by Apple to fulfill the dream behind the original Mac. In an era where our devices are impossibly powerful and often frustratingly complicated, maybe what we need is A Computer for the Rest of Us again.