Six Colors


By Jason Snell

I’ll pin my hopes on AI assistants

Last week, the startup Humane did a marketing blitz for its forthcoming AI Pin, a $699 wearable designed by a bunch of ex-Apple people that has been the subject of a lot of tech-industry buzz lately.

It’s a really interesting product. While it would be easy to focus on the company’s lackluster marketing video, which featured actual AI hallucinations, I’m more interested in what this product says and doesn’t say about the present and future of tech.

I don’t think the AI Pin will succeed, for numerous reasons, foremost among them that it seems to be a product designed to make your smartphone unnecessary or ancillary. It feels to me like this is the product’s point of view not for any deep philosophical reason, but because Humane is a company with investors and needs to ship and sell a hardware product. Attaching itself to the side of Apple’s or Google’s smartphone operating systems would make this thing an expensive accessory instead of a revolutionary device.

It’s not a point of view that makes sense otherwise, because it seems to posit a world where people just hate their smartphones and can’t wait to be rid of them. This is the world as seen through a funhouse mirror. People love their smartphones. That’s why we’re all staring into them for hours and hours every day! The knock on smartphones is that people use them all the time, and maybe we shouldn’t? But unless you’re going on some sort of digital fast, the results are in: people love using their smartphones. They’re the ultimate hit product of four or five decades of personal computing.

This is not to say the AI Pin doesn’t fit into some interesting niches. A personal constellation of devices—smartphone in your pocket, smartwatch on your wrist, maybe smart earbuds in your ears—gets more interesting when you consider that all of those devices are working together to collect information and communicate it back to you. And none of the devices I carry around daily look at the world around me. The AI Pin has a camera and clips on your shirt, so it’s able to see what you’re seeing and presumably do things with that information.

There’s a lot of potential here. The iPhone can do some amazing things when you hold its camera up—including figuring out exactly where it is based on the buildings it can see, which is bananas—but mostly, that camera is looking at the inside of my pocket. Whether it’s glasses or a pin or something else[1], there’s valuable data to be gained from seeing the world around us. It’s a sense that our devices are missing most of the time.

The AI Pin’s interface is built around a smart assistant driven by a large language model. Humane is highly unlikely to corner the market on this technology. Instead, it’s using the same stuff that will soon permeate all our devices—at least, as soon as it’s ready.

Humane’s vision for the future of human interfaces doesn’t seem wrong to me. Sooner or later, voice assistants driven by large language models are going to be good and reliable, and the game is going to change. I don’t know if Google Assistant and Alexa and Siri are going to molt into beautiful butterflies next year or the year after or in 2030, but it sure seems like it’s going to happen.

What excites me about this is that what computers are good at, fundamentally, is drudgery. Computer spreadsheets were the first killer app because they eliminated the need to write numbers down on paper, sharpen pencils and erase pencil marks, and do all that math in order to figure out whatever you wanted to learn. My favorite moments using computers are often figuring out ways I can use automation to take a task that requires me to click and type stuff for half an hour and reduce it to a single keyboard shortcut.

Now imagine the prospect of an intelligent assistant that knows every single fact in your personal array of data. It’s read all your saved notes, email, time-tracking data, and contacts, and it knows what was said in all your meetings and more. Instead of having to invent search terms, query multiple data repositories, and rack your brain to find exactly the right piece of information, you can let the assistant do it, because it’s got millions of cycles to burn doing the drudgery for you.
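
To make the idea concrete, here’s a toy sketch in Python (nothing Humane or Apple actually ships, and the sample snippets are invented) of what one query across all your personal stores might look like at its most naive: pool text from notes, mail, and calendar, and score everything against a single question.

    # Toy "one query, every repository" search. A real assistant would use
    # a language model instead of keyword scoring, but the drudgery being
    # automated is the same: checking every source so you don't have to.
    from dataclasses import dataclass

    @dataclass
    class Snippet:
        source: str  # "notes", "mail", "calendar", ...
        text: str

    # Stand-ins for data pulled from separate apps.
    corpus = [
        Snippet("notes", "Dentist said to schedule a cleaning in June."),
        Snippet("mail", "Your flight to Portland departs Friday at 7:40 AM."),
        Snippet("calendar", "Podcast recording with Myke, Thursday at 10 AM."),
    ]

    def search(query: str) -> list[Snippet]:
        terms = query.lower().split()
        # Score each snippet by how many query terms it contains.
        scored = [(sum(t in s.text.lower() for t in terms), s) for s in corpus]
        return [s for score, s in sorted(scored, key=lambda p: -p[0]) if score > 0]

    for hit in search("flight to portland"):
        print(f"[{hit.source}] {hit.text}")

You ask once, and it checks everywhere; that’s the part worth automating, whatever the underlying technology turns out to be.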

Humane’s marketing videos do a pretty good job of showing how this next wave of AI-based assistants will change how we interact with our devices. I think most people will still enjoy tapping on a smartphone, but more complex interactions can be simplified. Kevin Roose of the New York Times wrote about using ChatGPT to create an agent that knew all the rules of his child’s daycare provider, effectively teaching an assistant to answer very specific questions about when Circle Time is and when the facility is closed for the holiday break. Leo Laporte built a programming coach for the Lisp language in less than half an hour.
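
Roose and Laporte built their assistants through ChatGPT’s own interface, but the underlying trick is simple enough to sketch in a few lines. What follows is my own rough approximation against OpenAI’s API, not Roose’s actual setup; the handbook file and model name are placeholders.

    # A rough sketch (my approximation, not Roose's actual setup) of the
    # daycare-rules agent: paste a rules document into the system prompt,
    # then ask ordinary questions against it.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Hypothetical file containing the daycare provider's handbook.
    handbook = open("daycare_handbook.txt").read()

    def ask(question: str) -> str:
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder; any chat model will do
            messages=[
                {"role": "system",
                 "content": "Answer using only this handbook:\n" + handbook},
                {"role": "user", "content": question},
            ],
        )
        return response.choices[0].message.content

    print(ask("When is Circle Time?"))
    print(ask("When is the facility closed for the holiday break?"))

That’s the whole teaching step: no training, just a document and a prompt, which is why Laporte could build his coach in under half an hour.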

Sure, it’s all early days. Chatbots still hallucinate—sometimes in their own marketing videos. If my time as a computer user and an observer of the tech world has taught me anything, it’s that new technologies explode into existence quickly—but then take way longer than you expected to get really good. We’re post-explosion now, and things are moving quickly, but it might take a while for this stuff to truly fulfill its potential. (In the meantime, those who are enthusiastic about tech get to play with the new, messy stuff! It’s why it’s fun to be an early adopter.)

Anyway, this brings us to Apple. According to Mark Gurman, iOS 18 will be full of AI features. We’ll see what that amounts to. Apple is a very careful company, and generative AI still feels a bit wild, but given what Adobe’s doing in its apps, it feels like it’s past time for Apple to get involved.

Putting machine-learning-based features in apps, as Apple has been doing for years, is just fine. But it’s obvious that Siri needs to be replaced with something better and smarter, something capable of leveraging Apple’s ecosystem to make itself a uniquely personal tool for users of Apple’s devices. My iPhone can read my mail and my notes, peer into my calendar, and see my contacts… it does all this already. The next step is for Apple and Siri to put it all together.


  [1] Would ultra-wide-angle lenses on the outside of AirPods be able to stitch together a 360-degree view of the world around you? Only Apple’s product lab knows for sure, I suppose.


