Six Colors

Apple, technology, and other stuff


By Joe Rosensteel

Wish List: Siri, Spotlight, and a unified search experience

A screenshot showing the Google search results for scanning and sending a document with Mail. There's a box with the AI summary from Gemini, and then the relevant Apple Support document right underneath.
Maybe this is why Apple executives want Gemini so badly?

There’s a lot of talk recently about Siri being behind the competition. Siri often can’t find what you’re looking for, or what you want to know, and there’s no telling when it might be able to. Many of the requests we make to Siri are basically searches, and when we are unhappy with Siri we turn to search on the web for answers, or in the case of local files or music, we just manually dig it up ourselves.

So here’s a thought for those who might suddenly find themselves in charge of Siri: Search is a foundational element of smart assistants, and the current state of Apple’s search technologies leaves much to be desired.

While today’s web search engines all place sparkly, unreliable AI-synthesized answers above everything else, they still generally deliver solid search results underneath. Refining Siri without bolstering its foundation is a recipe for disaster.

Using Siri for search

Apple’s recent announcement that it’s delaying several AI features began with a self-serving sentence about how much people love Siri. You guys know Siri. Among its touted new, revolutionary features was “type to Siri,” a feature that’s not really new (you’ve been able to do that via an accessibility setting for quite a while), but is not a bad idea at all. The problem is that I find myself typing to Siri like I would enter text in a search box. Word choice has a huge impact.

This is inferior to just opening a web browser and typing into a good ol’ fashioned search box. For one thing, if I want to ask Siri how to do something, I have to prepend “how” to the request or it might treat the request as something to act on. A web search engine, by contrast, won’t latch onto a keyword like “email” and start composing an email, ignoring the rest of my question, just because I didn’t prepend “how.”

Even when Siri parses your words correctly, it’s that focus on providing a single result, or performing a single action, that makes it less useful. Like I said at the top, a major factor in the usefulness of any search engine is that it returns multiple possible matches for what you entered. It’s a powerful tool because your words may not exactly match the title of an Apple Support page, but they can be close enough that the page still surfaces.

What if you don’t happen to know the names of all the features for a task you want to do? Let’s say you need to update or change your credit card info. If I ask, “How do I change my credit card info?” (see the left of the three iPhone images, below), Siri tells me I can do that in the Contacts app (center, below).

Please don’t store your credit card info in the Contacts app. If I ask, “How do I change my payment information?” it’ll tell me to remove a HomePod (that I don’t even own) from the Home app (right, below).

three iPhone screens with confusing Siri output

I have to know the exact words for the three places in Settings where credit card information is stored in order to form a question precise enough that Siri’s product knowledge will reveal the results for each individual feature, one at a time. If I knew enough to be that specific, then I wouldn’t need to ask.

Searching the web for the same generalized questions works like a charm, but I do have to provide the specific context of the platform I am inquiring about. That’s a key advantage of Siri—it knows the platform I’m on already. When I ask Siri on my iPhone, “How do I scan a document?” Siri is going to return a result relevant for iOS. Unfortunately, it’ll only be the instructions for “Scan Document” in the Notes app instead of all the places in iOS where you can invoke “Scan Document.”

That expectation of context can work against Siri when it doesn’t apply it correctly. A humorous example: If you’re on an Apple TV and say “How To Train Your Dragon” into your Siri remote, it will not show you the info for the movie like it would for many other titles; instead, it will give you some training advice for your dragon. This is the same result you get on all Apple platforms, because no context is being used in this instance. Saying “Show me How to Train Your Dragon” (if you’re typing it, you need to title-case it or Siri will still give you dragon training advice) will display a list of the movies with that name.

A web search engine doesn’t have this issue, even though it doesn’t have context. It can interpret movie titles before trying to be literal with all the words in a request.

What about Spotlight?

Apple has another brand, Spotlight, that serves as an umbrella for its various search technologies, but it’s mostly about finding stuff on your device.

It can’t do natural-language search, though; only Siri gets to do that. If you type a natural-language request into Spotlight, it’ll likely put a link to do a web search for your request at the top of the list of search results. It’s not going to parse it into movie, TV show, or song titles unless you happen to have those as files.

That’s a real shame, because it would fit right in with our expectations of searching on the web if we could do that kind of search in Spotlight. Sure, it can still bail to the web, or Siri, if you ask, “Who won the Super Bowl?” but not everything people want to request concerns general knowledge.

Spotlight does a lot of things better than Siri. It displays a ranked list of search results. It live-updates the search results as you continue to type and refine the thing you’re looking for. “Type to Siri” has to digest a complete request, process it, and perform an action or display a blurb.

These two technologies need to work together. Spotlight needs to handle more natural-language requests. Siri needs to display results when there are multiple relevant matches for a request. We shouldn’t expect Siri, as the magic-sparkle box, to correctly interpret all meaning with no further action required. (Google buries the single-response option under its “I’m Feeling Lucky” button. Siri assumes we’re all feeling lucky, all the time.)

Improvements I’d like to find

Providing natural-language search can happen in parallel with improving Siri; it doesn’t stymie or dismiss that team’s work, and it gives Siri both a pressure-release valve and a stronger foundation.

As a user, I’d like to be able to use natural-language search anywhere there’s a generic search box on an Apple platform, and have the results be predictable. The more context and scope the device can also infer, the better. And offering options to perform different kinds of searches (of the web, of the Spotlight index, you name it) wouldn’t hurt. That’s the flexibility of providing users with multiple, navigable results instead of a single magic outcome.

I shouldn’t have to turn to a third party like Google to ask about Apple’s platform, especially when Apple just shipped Siri’s product knowledge feature. Apple needs to improve Spotlight, integrate it better with Siri, and provide a more consistent search experience—with options!—across all its devices.

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]

If you appreciate articles like this one, support us by becoming a Six Colors subscriber. Subscribers get access to an exclusive podcast, members-only stories, and a special community.

