Six Colors

Apple, technology, and other stuff

This Week's Sponsor

Unite Pro: Turn websites into Mac apps with native enhancements

By Will Carroll

Vision Pro and Cosm: Two of a kind?

Basketball game streaming live in a Cosm.
Public spaces like Cosm might be a good content fit with Vision Pro.

The Apple Vision Pro feels like a product that’s waiting for the world to catch up, but the reality is closer to the opposite. The world is waiting for a reason to use it and that reason hasn’t quite shown up yet.

There’s very little wrong with the hardware. Apple built something that works in a way first-generation devices rarely do (says the guy old enough to have bought a Newton at launch) with displays that feel natural rather than novel and an interface that disappears quickly enough to let you focus on what you’re seeing.

The problem comes the moment you take it off. There isn’t a strong pull to put it back on. It’s impressive, even remarkable in bursts, but it doesn’t yet fit into a daily rhythm. That’s not a hardware problem. It’s a content problem, and more specifically, a cadence problem. Apple has treated immersive content like a prestige release schedule, carefully curated and spaced out, which works for television but not for behavior. If you want people to build a habit around something, you need volume and consistency, not occasional brilliance. Right now, Vision Pro feels like something you check in on rather than something you live inside, and that distinction matters more than anything on the spec sheet.

Neal Stephenson’s skepticism lands because it recognizes that gap. If the content never reaches a point where it becomes necessary, the headset remains optional, and optional devices rarely scale. What’s interesting is that the missing piece isn’t hypothetical. It already exists in a different form, outside of Apple’s ecosystem, and it’s showing up in a place that Apple understands better than most companies: people paying for experience.

Cosm is the cleanest example of that. It’s easy to dismiss it as a high-end gimmick, a giant dome with a better screen, but that misses what’s actually happening inside those venues. People are buying tickets, planning nights around it, treating it as something closer to attending a game than watching one. The technology matters, but the behavior matters more.

Cosm is already generating meaningful revenue and drawing repeat customers, which tells you this isn’t just novelty value. It’s tapping into something real, the idea that proximity, or at least the feeling of it, has value even when the event is happening somewhere else.

The challenge for Cosm is that scaling that experience is difficult. These are expensive builds that require the right locations, the right partnerships, and enough capital to expand without diluting the quality that makes them work in the first place.

That is exactly the kind of problem Apple has solved before. It’s not just about having the cash, though Apple certainly has that. It’s about having the discipline to build a system that can expand without losing its identity and the distribution to make it visible at scale. If Apple owned something like Cosm, it wouldn’t just be a set of venues. It would be a front door. You could put an Apple Store in the lobby and it wouldn’t feel forced. It would feel like a natural extension of the experience, a place where people encounter the hardware in the context of something they already understand.

From there, the path to the home becomes clearer. Vision Pro, or whatever lower-cost version follows, doesn’t need to stand on its own as a category. It becomes an extension of something people have already bought into. The idea of watching a game “from somewhere else” is no longer abstract because they’ve already felt it in a room with other people. At home, it becomes a different version of the same experience, missing the crowd and the waiter, but gaining convenience and access.

The critical shift is in how Apple approaches rights. Trying to own sports outright is a losing strategy. The costs are too high, the competition too entrenched, and the fragmentation too deep. Apple has made smart moves with MLS, F1, and selective partnerships, but doubling down on exclusivity won’t unlock this. The better path is to work alongside the existing ecosystem. Install Cosm camera systems at major events, not as replacements for the broadcast but as an additional layer. Let networks and leagues sell that immersive feed as a premium product, with Apple taking a share for the technology and distribution. It’s additive rather than competitive, which makes it easier to scale and harder for partners to resist.

Apple has always been at its best when it connects behavior to technology in a way that feels inevitable in hindsight. Right now, Vision Pro still feels like a solution looking for a problem. The problem, or more accurately the opportunity, is already there in how people respond to immersive sports experiences. Cosm has shown that people will pay for that feeling. The hardware is close enough to deliver it at home. The gap is building the bridge between those two things in a way that feels continuous rather than experimental.

If Apple gets that right, the conversation around Vision Pro changes quickly. It stops being about whether people want to wear a headset and starts being about what they’re missing when they don’t. That’s the point where adoption tends to take care of itself.

[Will Carroll was an early writer at Baseball Prospectus who covers injuries at Under the Knife and talks about them on Injury Territory. He frequently co-hosts Downstream with Jason Snell on Relay.]


by Jason Snell

The earliest days of Apple

Harry McCracken has put together an amazing oral history of Apple’s earliest days. You should read the whole thing, but this anecdote from Chris Espinosa, who still works at Apple after all these years, is the part that made me laugh the most:

I was sitting there in the Byte Shop in Palo Alto on an Apple-1 writing BASIC programs, and this guy with a scraggly beard and no shoes came in and looked at me and conducted what I later understood to be the standard interview, which was “Who are you?” I said, “I’m Chris.” … Steve Jobs’s idea back then of recruiting was to grab a random-ass 14-year-old off the streets.

The rest is history!


By Jason Snell

“For All Mankind” returns with more Mars drama

Mireille Enos in “For All Mankind.”

The fifth season of Apple TV’s “For All Mankind” premieres March 27—really, the evening of March 26 for those of us on the West Coast. For the last few years, Dan and I have been reviewing episodes on our “NASA Vending Machine” podcast and I’m excited to have the show back.

As always, “For All Mankind” is about taking big swings. There’s always a dramatic, history-changing moment or shocking twist that’s not too far away. Set in an alternative past where the Space Race kept going after the Soviets landed on the moon (yep!), season four took us to a 2003 where Mars colonists sought more autonomy by hijacking an asteroid.

This season, which takes place in 2012, is still primarily set on Mars, though there’s also some space adventure in the offing. Apple tech fans will enjoy that we’ve finally reached the iPhone era, though the iPhones on “For All Mankind” are a little thicker than the ones we remember, and they might actually be Newtons. There are also a lot of early-2010s iMacs on display.

While the first episode has to do a lot of work reminding you of what’s happened recently and setting up the new power dynamics at play this season, subsequent episodes get pretty intense, pretty fast. At times the show plays with police procedural, mystery story, even car-chase adventure… familiar TV genre stuff, except it’s all on Mars! Mireille Enos of “The Killing” plays an important new role as an investigator for the Mars Peacekeeping force who is suspicious that several different crimes might have been committed out on the surface. There are also a bunch of returning faces, some expected and some quite surprising. (And also, yes, Joel Kinnaman is still in the show even though Ed is now basically in his eighties.)

I’ve seen the first six episodes thus far, so I don’t know where it’s all going, but I’ve sure enjoyed the ride. “For All Mankind” continues to use its alt-history setting to tell dramatic, almost operatic stories that can also disturbingly have relevance to current events in our own world.


By Joe Rosensteel

How can Siri automate Shortcuts when it’s so opaque?

Screenshot of Python code editing software with image scaling script.
Claude Code takes advantage of a real development environment.

I’m pretty skeptical that Apple’s new Siri-wrapped Gemini will be able to accurately and reliably assist with automation. Gemini will be the foundation for Apple’s foundation models, but there’s no there there. Apple has no well-documented, debuggable, inspectable system to execute automation with, unless you count ancient and inscrutable AppleScript, and you shouldn’t.

Sure, LLM chatbots will spit out code (even AppleScript!) if you ask them to, but it might not work. It gets substantially worse when you’re asking LLMs questions about Shortcuts.

Go ahead and ask any chatbot to describe how to make a Shortcut to perform some automation that you’ve been wanting to do and then try to assemble what it suggests. It’s extremely tedious, prone to user error, and isn’t in any way guaranteed to work even when it’s all put together.

Agents that hook into development environments are much better than a bare chatbot because they can inspect, run, and debug the code they are generating. They aren’t perfect, but if you have an agent like Claude Code hooked up to a development tool like VS Code and start describing some Python script you want, it’ll execute and iterate until the output is what you asked for.
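To make that concrete, here’s a minimal sketch of the kind of self-contained, deterministic script an agent can actually run and check: a toy nearest-neighbor image scaler in plain Python. (The function name and the list-of-lists “image” are my own stand-ins, not anything from a real tool.)

```python
# A toy, dependency-free stand-in for the kind of script an agent can
# run, inspect, and iterate on: scale a 2-D grid of pixel values
# using nearest-neighbor sampling.

def scale_nearest(pixels, new_w, new_h):
    """Scale a 2-D list of pixel values to new_w x new_h."""
    old_h = len(pixels)
    old_w = len(pixels[0])
    return [
        [pixels[y * old_h // new_h][x * old_w // new_w] for x in range(new_w)]
        for y in range(new_h)
    ]

if __name__ == "__main__":
    image = [[1, 2],
             [3, 4]]
    # Doubling a 2x2 grid repeats each pixel in a 2x2 block.
    print(scale_nearest(image, 4, 4))
```

The point isn’t the scaling math; it’s that the script produces inspectable output an agent (or a human) can compare against what was asked for, then run again after every change.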

If humans don’t have access to documentation, actionable debug output, logging, the ability to bypass or ignore actions as part of testing, and the ability to copy and paste snippets of code, then how can the new Siri do it?

Right now, Shortcuts works with AI models by passing some input and then receiving the output. When something goes to the model, the model transforms the data, and delivers a result back to Shortcuts. That’s a non-deterministic workflow, so any change to the model, or even just randomness in general, can produce different output. This means you can’t reliably troubleshoot or adjust it without introducing uncertainty in what new outputs you’ll get.

When working with an agent to assemble automation in an IDE, the code it builds is deterministic, so it will keep working even if the model changes. Not everything you want to automate requires LLM functionality when it runs, but not everything you automate should require hours of labor to fabricate the deterministic workflow version of it.

I really hope that the magic of new Siri isn’t going to be that it will just do things with bare actions and App Intents, magically, without any user-accessible process, or as a blob inside of a Shortcut you need to make. If I ask Siri to reorder a list, and it doesn’t do it correctly, I want to be able to access the scaffolding it created to see what went wrong, not just keep asking Siri to do it again in slightly different ways until I get output I like.

If Siri doesn’t produce anything inspectable, or produces only an opaque Shortcut, then there’s not much work humans or AI can do to fix things.

AI cut below the rest

The problem the Shortcuts app is supposed to solve has never been solved, because no one really knows how to use Shortcuts unless they become a Shortcuts expert. Shortcuts is user-friendly in appearance, but not in practice. It’s meant to welcome people who don’t know anything about programming with its friendly drag-and-drop interface, and searchable actions panel.

Unfortunately, the names for actions don’t always say what they do, and the documentation is often a vague piece of filler that’s frequently reused for more than one Shortcut action. Even experienced programmers can get flummoxed when they try to search the available actions for seemingly standard functions, like reversing a list.

Magic connections are magic, until your script gets any longer than the length of your screen and you need to start dragging actions around, inevitably breaking connections and making unintended ones. With a text-based script you’d have to keep track of the names and spelling of your variables, but they don’t change out from under you if you add more lines of code above or below them.

You can’t do one of the simplest and most useful things in scripting: commenting out (ignoring/bypassing) something to test or evaluate alternatives.
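For anyone who hasn’t lived in a text-based script, here’s that trick sketched in Python, with hypothetical `resize` and `upload` steps standing in for Shortcuts actions:

```python
# In a text-based script, bypassing a step while testing is one keystroke.
# "resize" and "upload" are hypothetical stand-ins for Shortcuts actions.

def resize(photo):
    return photo + " (resized)"

def upload(photo):
    raise RuntimeError("network unavailable")  # the step we want to skip

photo = "IMG_0001"
photo = resize(photo)
# photo = upload(photo)   # commented out while testing the rest of the flow
print(photo)
```

One `#` and the broken step is out of the way without being deleted; Shortcuts offers nothing equivalent.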

A lot of the time, when people are using Shortcuts, they’re relying heavily on the Run Shell Script action to do actual programming in normal, vanilla code, or ssh’ing into a server from iOS to do the same thing. It’s nice that Shortcuts can do that, but shell scripts aren’t cross-platform, and ssh’ing into a server is in no way accomplishing Shortcuts’ mission.

Without logging, you can’t ask Siri why your automation that was supposed to run in the middle of the night didn’t run. Maybe it was a permissions issue that was never raised when the shortcut was created. You, and Siri, just don’t know.
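A few lines of Python’s standard logging module show the kind of breadcrumb trail that’s missing. (The nightly-backup scenario and its permission check are hypothetical.)

```python
# A sketch of the paper trail Shortcuts doesn't leave behind:
# even minimal logging lets a failed overnight run explain itself
# the next morning.

import logging

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("nightly-backup")

def run_nightly_backup(has_permission):
    log.info("starting nightly backup")
    if not has_permission:
        # This is the line you'd find in the log the next morning.
        log.error("aborted: no permission to access Files")
        return False
    log.info("backup complete")
    return True

run_nightly_backup(has_permission=False)
```

The morning after a failed run, the log says exactly which step gave up and why, which is precisely the question neither you nor Siri can currently answer about a shortcut.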

AI rising tide lifts all boats

Again, Apple doesn’t have to do these things just for humans, or just for Siri. They are in no way mutually exclusive.

If the concern is that Shortcuts shouldn’t be like a programming language, with tracebacks and logs that would put off “normal people,” then just remember that “normal people” don’t really use Shortcuts. They ask a chatbot to just do it, and Siri, as Apple’s chatbot, could take advantage of those fiddly programming bits and perform its role better, in a way that was auditable.

I have seen people make frantic posts on Mastodon about how AI is deskilling programmers, but the beauty of Shortcuts is that Apple already applies the deskilling at the factory.

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


Our latest personal tech projects, twenty-five years of macOS, our networking setups, and where we turn for up-to-date information.


Breaking down the government’s bizarre router ban

The Verge’s Sean Hollister with just an excellent article breaking down the administration’s totally nonsensical ban on consumer-level routers made outside of the country. The article’s structured as a Q&A, and here are just a couple of my favorite excerpts:

Sounds bad. But if they’re not recalling the routers, and they’re not fixing them… what the heck is the government actually doing?

It’s banning future routers that haven’t been made yet.

You’re not making a lot of sense.

I warned you this was a story about Brendan Carr, known dummy and anti-consumer FCC chairperson! Specifically, the FCC is keeping new, previously unannounced, foreign-made consumer routers out of the US… unless it decides to exempt them. For reasons. We’ll get to those.

Hollister classifies this as a shakedown to somehow force more manufacturing jobs back to the U.S. Gee, I wonder if there could possibly be any…let’s say exploitable…loopholes to this brilliantly concocted plan:

What if I buy one of those newer routers in Canada and bring it back home?

The FCC’s magic 8 ball says, “no,” but good luck enforcing that, Brendan.


Moltz goes on and on about games, Dan makes a vicarious purchase and Lex has a new app!


by Six Colors Staff

Apple announces Apple Business, ads in Maps

Apple Newsroom:

Apple today announced Apple Business, a new all-in-one platform that includes key services companies need to effortlessly manage devices, reach more customers, equip team members with essential apps and tools, and get support from experts to run and grow efficiently and securely. Apple Business features built-in mobile device management, helping businesses easily configure employee groups, device settings, security, and apps with Blueprints to quickly get started. In addition, customers can now set up business email, calendar, and directory services with their own domain name for seamless and elevated communication and collaboration.

This new offering actually consolidates three existing Apple products, Apple Business Manager, Apple Business Essentials, and Apple Business Connect, and offers mobile device management for free, which will save some existing customers money. There are also some new API functions for larger organizations, and Apple is offering businesses access to Apple-hosted email and calendaring for the first time. The new Blueprints feature will make it easier for administrators to assign configurations and apps.

Also announced today is something that has been widely expected: ads in Maps in the U.S. and Canada. We now know those will arrive this summer. Apple provides additional details further on:

Ads on Maps will appear when users search in Maps, and can appear at the top of a user’s search results based on relevance, as well as at the top of a new Suggested Places experience in Maps, which will display recommendations based on what’s trending nearby, the user’s recent searches, and more. Ads will be clearly marked to ensure transparency for Maps users.

Again, it’s not a huge surprise to see this—Apple has been working on bolstering its ad business in the past few months. But it does mean that once this feature is enabled, you’ll have to scroll past an ad to see results when you search for stuff in Apple Maps.

Ads or no, companies that use Apple Business will also be able to edit their metadata and upload pictures directly into Apple Maps.

It’ll take some time to digest these changes, but it seems like this is a simplification of Apple’s business offering, and making MDM free will be a win for smaller organizations. Unfortunately, Apple’s still only offering 5GB of free iCloud data on managed accounts, and it’s hard to think that any business should rely on Apple’s notoriously unreliable email platform.


Are orbital data centers economically viable?

From Eric Berger at Ars Technica, the first of a three-part series about orbital data centers. This first part focuses largely on economics, but also touches upon issues of the environment, the obliteration of the night sky, and more. It’s a really fascinating read.

“This is not physically impossible; it’s only a question of whether this is a rational thing to scale up economically,” [engineer Andrew] McCalip said. “The answer is it’s really close. And if you own both sides of the equation, SpaceX and xAI, it’s not a terrible place to be. I wouldn’t bet against Elon.”

Yet betting on Elon also requires a giant leap of faith.

The third part of this series will dive deeper into detailed cost estimates, but in round numbers, the bare-bones cost of deploying 1 million satellites is more than a trillion dollars. SpaceX’s two biggest projects to date, the hyper-ambitious Starlink and Starship programs, each required on the order of $10 billion up front. So in terms of scope and cost, orbital data centers are two orders of magnitude larger.

The part that has me curious, but isn’t really addressed in the story, is future-proofing. Companies like Nvidia are kicking out new chips at such a rate that the processors you send into orbit will almost certainly be outdated by the time they’re operational. Will that be enough to offset the perceived gains? Are we constantly going to be launching new satellites? What happens to the old ones? What if the AI bubble bursts? All fertile ground for a near-future sci-fi story, methinks, if not near-future non-fiction.


Myke talks to Jason about his “Jeopardy!” experience, and Jason interviews David Pogue about his book, “Apple: The First 50 Years.”


Apple sets WWDC 26 for week of June 8

Black background with 'WWDC26' text; 'C' glows with rainbow halo.

Collect your meager Kalshi and Polymarket winnings, I guess: Apple has officially announced that its 2026 Worldwide Developers Conference will kick off with an in-person event at Apple Park on Monday, June 8.

As in recent years, the event will run for the week, starting with the Keynote and Platforms State of the Union on Monday, followed by video sessions and labs. Videos will be accessible on the Apple Developer app and website, as well as YouTube.

Those who attend the in-person event will be able to watch the Keynote and State of the Union, as well as participate in other activities throughout the day. Attendees will be determined by random selection, with invitations sent out by the end of the day on April 2. You must be a member of the Apple Developer Program or Apple Developer Enterprise Program, or a winner of the 2026 Swift Student Challenge to apply. Additionally, 50 “Distinguished Winners” of the challenge will be invited to a three-day experience that includes Monday’s special event.

There are, as usual, lots of eyes on WWDC, where Apple traditionally announces all its big platform updates for the year to come. This year we’re expecting the “27” year updates, but there are big question marks hovering around some features, like the much-delayed Apple Intelligence offerings. We’ll find out that and more on June 8.


By Glenn Fleishman

Avoid (or prefer) sharing photos from Camera with your group

Glenn Fleishman, art by Shafer Brown

I recently started seeing pictures in Photos that I was sure I didn’t take, though often of locations I vaguely knew. After a few days, I realized the issue: my older child, in college on the East Coast and currently traveling during spring break, had a Camera setting enabled that caused all their images to flow into our shared family library.

Fortunately, our kid is old enough, wise enough—and communicative enough with us parents—that I didn’t see anything they didn’t want me to see, but it is the kind of thing that could be awkward in some shared groups.

The issue is that the Camera app has a tiny icon that’s easy to tap and activate without realizing it. That button’s activation is then persistent for all subsequent photos you take!

Here’s how to work with this feature intentionally, and deactivate it if you never want to use it—with purpose or not!

Share with those who care

The iCloud Shared Photo Library is an oddball: you can create or belong to one, and one only, and it can be shared with the creator plus five others, who don’t need to be in your Family Sharing group, if you have one.1 The thing that you create is called Shared Library throughout the interface.

Once you’ve created or joined a Shared Library, its availability appears in Photos on your devices with iCloud Photos enabled and, more subtly, in Camera. If you’re viewing both libraries in Photos, images from the Shared Library have the Shared Library two-person icon overlaid in the upper-right corner; videos, for some reason, do not.

Screenshot of portion of Photos for Mac showing the pop-up menu for choosing Personal Library, Shared Library, or Both Libraries.
A pop-up menu appears in Photos for Mac that lets you choose whether to see your Personal Library, Shared Library, or both.

On a Mac, Photos: Settings: Shared Library reveals participants and offers a Shared Library Suggestions checkbox. Enable this if you’d like your Mac to say, “Hey, maybe you should add this image to the Shared Library!” (I didn’t find this particularly useful, and disabled it.)

Go to Settings: Apps: Photos, and you’ll note an extra option on iPhones and iPads: Sharing from Camera. That’s the culprit in my offspring’s openness.

Keep Camera shots your own (or not)

You can tap Sharing from Camera or go directly to Settings: Camera: Shared Library: Sharing from Camera. With Sharing from Camera enabled, you see a yellow icon of two people side by side in the upper-left corner of the screen in portrait mode or the lower-left corner in landscape. Tap it to disable sharing directly from Camera. After you tap the button, a yellow label appears at the top of the Camera interface to indicate which library is in effect. When Sharing from Camera isn’t turned on, the icon appears with a line through it.

Camera app screenshot showing shelf of books with the Shared Library message overlaid at top and highlighted in a red box.
When you activate the two-person icon, the Shared Library label appears (highlighted here with an added red box).

The setting is persistent within Camera, so each time you open Camera, your previous Shared Library choice remains. You can override this via Settings or by tapping the icon again.

If you never want to enable Shared Library by accident or intentionally, disable Sharing from Camera in the Photos: Sharing from Camera settings.2

If you find you’ve put media in the Shared Library and want to return it to your own, you can fix this quite easily:

  • On a Mac, select the items in Photos, and choose Image: Move X Photo(s)/Video(s) to Personal Library. You can also Control/right-click on any item in the selection.
  • On an iPhone or iPad, go to the Library view in Photos, tap Select, and choose one or more items. Tap the More … icon at upper-right, and choose Move to Personal Library.

In a bit of turnabout, a few days after writing this column, I got a text from my youth: “Your photos seem to be going into my account. I think you pressed the share button in the app by accident.” D’oh!

For further reading

Our very own leader, Jason Snell, has a book that covers Shared Library and much more. Pick up a copy of Take Control of Photos to get up to speed on this and other quirky Photos features.

[Got a question for the column? You can email glenn@sixcolors.com or use /glenn in our subscriber-only Discord community.]


  1. Shared Library requires iOS 16.1 or later, iPadOS 16.1 or later, or macOS 13 Ventura or later. iCloud Photos must be enabled on each participating device. People under 13 can only join (or create) a Shared Library with Family Group members. 
  2. There’s a Share Automatically option that puts photos you take with Camera in the Shared Library whenever participants in the Shared Library are recognized while taking the picture. 

[Glenn Fleishman is a printing and comics historian, Jeopardy champion, and serial Kickstarterer. His latest book, which you can pre-order, is Flong Time, No See. Recent books are Six Centuries of Type & Printing and How Comics Are Made.]


by Jason Snell

Missing the point

Matt Birchler sort of doesn’t like the MacBook Neo:

Because this is a Mac, it can do effectively anything you want to throw at it. I have been editing video on it and doing web and Xcode development with very little issue. Yes, my Xcode builds take longer than they do on my M4 Pro Mac, and rendering video runs slower too, but it can do it. If this is the computer your budget allows, don’t think that you can’t.

However, you’ll notice I’m not including any benchmarks in this review. I don’t think benchmarks properly express the experience of using a computer, particularly on the lower end of products. On the high end, when everything basic is child’s play, benchmarks can be helpful to understand differences at the margins. But one of the charts I’ve been seeing go around since this was announced was Geekbench single-core performance, which showed the Neo as effectively as fast as an M4. Let me tell you, if you’re using that chart to understand the performance of this computer, you are being misled.

You certainly are, if your definition of ‘use’ is Xcode builds and video editing. Matt is not technically wrong when he suggests later in his post that, in workflows requiring lots of RAM and speedy disk access, even the oldest Apple silicon MacBook Air will perform a bit better than the MacBook Neo.

When performance is in the ballpark of a five-year-old computer, you have to consider what this machine costs compared to refurbished models from that era…. I can get a refurbished M2 MacBook Air with 512GB storage and 16GB RAM for $50 less than the maxed out MacBook Neo. That M2 Air gets you significantly better performance, a better screen, a better trackpad, better USB-C ports, MagSafe, and faster SSD speeds. I feel very confident saying that the M2 Air is a meaningfully better computer than the Neo.

Again, Matt’s not wrong. If you are a savvy shopper with a very limited budget and need that push beyond what the MacBook Neo can give you, you are probably better off searching for a used or refurbished MacBook Air. Last week on MacBreak Weekly, Christina Warren made a strong case for picking up a refurbished or used M3 or M4 MacBook Air, which would cost slightly more than a MacBook Neo but far outclass it in terms of features.

But this entire conversation misses the most important thing about the MacBook Neo: It is sold in every Apple Store, on Apple’s website, and in every Apple sales channel. Most people won’t think to cruise for a refurbished Air—they will just go down to their local store, or pop onto Amazon, and shop for a computer. That’s why the MacBook Neo is important. It’s available to everyone, everywhere, and Apple will stand behind it as a new product.


by Jason Snell

Dreaming of an ultralight Mac

12-inch MacBook
The 12-inch Retina MacBook, circa 2016.

David Sparks appreciates the MacBook Neo, but he’d like something smaller:

Think about it. Apple has covered the pro market with the MacBook Pro lineup. The Neo is about to cover the mainstream and budget-conscious buyer.

But there’s a gap at the top. A premium ultralight for people who travel constantly, who want the absolute minimum weight and footprint, and who are willing to pay for it. A MacBook that weighs two pounds or less, with a stunning display and all-day battery life. Not a compromise machine. A showcase.

I would question the premise that you can get “the absolute minimum weight” along with “all-day battery life” (depending on how you define that)—but I do not doubt that Apple could create a laptop with M-series performance and good enough battery life, but with an emphasis on compactness.

But is there enough of a market for a fourth class of MacBook? As someone who has known and loved the 12-inch PowerBook, 11-inch MacBook Air, and even the 12-inch MacBook, I am sadly not convinced that this is a big enough segment for Apple to target when the MacBook Air exists.

And here’s the biggest reason I think a smaller laptop may never happen: Over the last decade, everything in macOS has gotten a bit bigger—not just OS elements, but even fundamentals of app design. When I was still using an 11-inch Air, I would often discover apps that couldn’t be resized to fit on my screen. The same happened with the retina MacBook. I’m afraid that the 13-inch display in the MacBook is probably as small as modern macOS and today’s Apple will reasonably go.


By John Moltz

This Week in Apple: I went to the White House, but I didn’t inhale

John Moltz and his conspiracy board. Art by Shafer Brown.

Apple announces new AirPods, Tim does an interview, and it’s about App Store standards, people!

It’s the least they could do

Apparently not big enough to make Apple’s “experience” two weeks ago, this week the company announced AirPods Max 2 with the H2 chip. This updated version comes at the same price and in the same colors but with Adaptive Audio, Conversation Awareness, and Live Translation, along with some other small enhancements.

If that seems underwhelming, it could be worse.

“Harmful chemicals found in popular headphones sold across Europe, study”

But I don’t want harmful chemicals in my earholes! My earholes are right next to where I keep my thinking bits!

Fortunately:

The safest headphones were Apple’s AirPods Pro 2 and JBL’s Tune 720BT.

I look forward to the new AirPods marketing touting them as having “the fewest harmful chemicals of any of the popular brands.”

Plain, simple Tim

Tim Cook (remember him?)…

This is a post limited to Six Colors members.



By Jason Snell

Aqara UWB Smart Lock U400 Review: Beam Me Up

Black smart lock with white keypad on blue door. Keypad displays numbers 1-9, 0, and lock/unlock icons. 'Aqara' logo below keypad.

When it comes to smart locks, the goal is “Star Trek,” right? You should be able to walk up to your door and, swish!, it opens to greet you.

When I was a kid, “Star Trek” doors were amazing technology, but not too many years later, even the run-down supermarket in my hometown had automatic doors that opened when you approached them. Still, the dream lives on for the home. Most homes aren’t plausibly designed to have pocket doors that slide out of the way (and I have to think they wouldn’t be up to code), or even doors that swing open automatically.

Okay, then, a dream downgraded: When it comes to smart locks, the goal is sort of “Star Trek,” in the sense that I’d like my door to unlock itself as I approach.

A few generations into this technology, we’ve bought and reviewed a bunch of smart locks, but the ultimate dream has not been fulfilled. Bluetooth-based locks can sort of do the trick, but since Bluetooth is a non-directional technology, they relied on all sorts of workarounds (use geofencing, wait for the phone in question to leave the area, then enable lock-on-view) to make it happen. And it wasn’t very reliable.

Next came NFC locks, which work remarkably well but require you to press your phone or watch up against the lock. It works—I never carry a key for my house anymore, because my Apple Watch is my key—but it’s not exactly the Star Trek dream.

Finally, the future is here: locks with support for ultra-wideband (UWB) technology have begun to arrive. Since UWB offers precise positioning, it’s the first technology with the potential to deliver rock-solid support for walking right up to the door and having it unlock before you even reach for it.

I’m all in on the dream, so I bought the $270 Aqara UWB Smart Lock U400 (Amazon affiliate link) and installed it in my front door as my deadbolt, replacing an older NFC-focused smart lock. It’s been there for a couple of months now, and I’m happy to report that the “Star Trek” dream feels real.

After having owned a bunch of these, I’m struck by how robust Aqara’s lock motor is. It forcefully slides the deadbolt into place, which is helpful since my door can sometimes be slightly misaligned, and a little force from the bolt helps push it into place. (If it fails to mechanically lock, it makes a loud beeping noise to alert you that something’s wrong, which is very important if you’re walking out the door!)

I love the options the Aqara lock provides, too. Some smart locks I’ve used haven’t come with real keys, but Aqara’s does—the keyhole is normally hidden, but the bottom of the entry panel drops down to reveal it. There’s a fingerprint sensor on the panel that can give you the Aqara equivalent of Touch ID, but only if you use the Aqara app itself. I added my print to the lock, but have never actually needed to use it. The lock also supports NFC, like my old lock, so older devices can be tapped against the pad to unlock the door. And yes, the display lights up so you can input a multi-digit code if you like.

But the star of the show is UWB unlocking, which uses your precise position in space to unlock the door only when you approach it from the outside. (There’s a clever setting that lets you set which directions it will auto-unlock from, so that if a portion of your house is in front of your front door, it won’t unlock every time you walk toward the door in your garage.) Most of the time, the door audibly unlocks when I’m maybe a foot or less away. Occasionally, I need to stand at the door for a moment, but rarely longer than a second or two. It works just as well with my iPhone as with just my Apple Watch. It never unlocks accidentally when I approach it from other directions. It really does just work.

I also appreciate the Aqara lock’s approach to batteries. A not-so-fun fact about smart locks is that they require batteries. My previous smart locks have chewed up AA batteries over the course of a few months. Aqara has instead built a rechargeable battery into the lock that’s basically the size of an iPhone power bank. It’s been months, and my battery is still at 85%, so it is going to last a long time. But beyond that, it’s just an easy USB-C charge to top it back up.

You can either remove the battery and charge it wherever you like (though the lock stops working while it’s out), or just plug a power bank into the lock itself. Aqara cleverly suggests putting the power bank in a bag and hanging it on the interior door latch while it charges the lock back up, which worked perfectly when I tried it.

As with many other locks, once you’ve got a smart lock attached to HomeKit, there are various automations you can run, though I haven’t ever found any of them to be particularly useful. For me, a good smart lock gives me confidence that the door is locked or will lock itself automatically (and I can check on my phone to confirm this), and opens easily.

I’ve never had a smart lock that opens more easily than the Aqara UWB Smart Lock U400. It’s not “Star Trek,” exactly, but it’s probably as close as I’m ever going to get.


by Jason Snell

That time I got to touch the original iPhone

In the aftermath of an Apple product briefing for the original iPhone at Macworld Expo 2007, here’s what I wrote:

Yes, I’ve touched it.

Although the undisputed winner of the most-talked-about product award at this year’s Macworld Expo is Apple’s new iPhone, it’s actually quite a rare commodity… I don’t have an exact count, but as far as I can tell there aren’t very many real iPhones out there in the world. (And since the iPhone is still six months away from its arrival, that’s not too surprising.) And it’s too bad… let me tell you with personal experience, it’s much more impressive when it’s in your hand—or more to the point, when your finger’s running across its multi-touch screen….

I can admit that I found it quite difficult to form complete sentences while I was holding the iPhone. In terms of sheer gadget magnetism, its power can not be overstated.

I’ve always wondered what my contestant-interview anecdote would be if I were ever a contestant on “Jeopardy!”. This one would make a pretty good one, I think.
