By Philip Michaels
February 3, 2026 12:00 PM PT
What Android phones can teach the iPhone

For the past decade and change, I’ve tested and reviewed a large number of Android phones, augmenting the iPhone expertise I’ve built up since seeing the very first iPhone live and in person at Macworld Expo 2007. And while that parade of phones has included some winning devices, my overall impression of the Android experience is something along the lines of “How do people live like this?”
That’s largely a reflection of the haphazard way Android phones receive their software updates. Some, like Google’s own Pixel devices, get new features right away, while others see updates once phone makers and wireless carriers are good and ready to release them. As someone used to downloading iOS updates the moment they’re available, that throws me. Android partisans tell me I’m being silly — sometimes politely, sometimes less so — and they may well have a point.
But even if some elements of the Android experience don’t land with me, I’d have to be a pretty narrow-minded person not to appreciate the features that do deliver. Android phones get a lot of things right — and some of those are missing in action when it comes to the iPhone.
Look, Apple didn’t sell more than $201 billion in iPhones during its 2024 fiscal year by listening to my advice, and I certainly don’t expect the company to start casting sideways glances toward Android phone makers to surreptitiously gather ideas on how to spruce up the next iPhones. But I do think there’s some merit in looking at areas where Android phones excel and how adopting something similar might give the iPhone a boost.
After all, we have some evidence that this already happens to some degree. Google added a rather distinctive horizontal camera bar to the Pixel 6 back in 2021. And while I don’t think the iPhone 17 Pro’s extended camera array is a direct copy, it certainly seems to draw some inspiration from what the Pixel has offered for years. Bringing the new look to the iPhone also allowed Apple to shift around internal components so that the current iPhone can benefit from a bigger battery and a vapor chamber, so there are benefits to adopting, adapting and improving.
So here’s what I’d flag from my time with Android phones: the features I’d most like to see come to the iPhone.
More photo and video capture tools
One of the best rivalries going right now involves Apple and Google battling it out to see whose phones can produce the best photos. It’s a pretty even match-up that seems to shift every time one of the companies rolls out a new flagship device, but I do think Google’s use of computational photography to produce high-quality images gives it an edge over Apple.
Some of this involves long-standing capabilities like the Super Res Zoom feature that’s been a part of Pixel phones for years. It cleans up digital zooms so that they don’t have the noise and fuzziness that can creep into a shot the closer you zoom in. I’ve also been impressed by features like Best Take, where the Pixel draws on a series of similar group shots to make sure everyone’s looking their best.
I’d especially like to see Apple try its hand at a version of Pixel camera features, such as Add Me or Camera Coach. Add Me, debuting with 2024’s Pixel 9 release, lets you insert yourself into group photos you take, using AR overlays to show you where to stand and then stitching together the photos by tapping into AI. My results with Add Me on Pixel phones tended to be hit-or-miss — often, the final output depends heavily on who you hand the camera phone off to — but I bet Apple could make the process a little bit more foolproof.
Similarly, the Pixel 10’s Camera Coach feature isn’t flawless. But I like the concept behind it, as Camera Coach uses the phone’s Gemini assistant to make real-time suggestions on how to take a better shot. Once Apple gets its act together with Siri — and more on that in a bit — this is something the iPhone could easily offer.
A Pixel Screenshots-style app for the iPhone
I take a lot of screenshots on my phone — sometimes to preserve information I want to remember, sometimes to chronicle how-to steps for an article. And when I use a Pixel phone, I like that there’s an app that lets me stay on top of those screenshots, rather than letting them live unsorted in my camera roll.
I’m speaking of the Pixel Screenshots app, introduced with the Pixel 9, which collects all the screenshots you’ve captured. But it also lets you do more than that, like set reminders if there’s a specific action you want to take that’s related to a screenshot (even if it’s something as basic as “remind me to enter this recipe I’ve found into my database of recipes when I have more time”). You can group screenshots into collections, too — handy for keeping those how-to screenshots in one place.
My favorite part of the Screenshots app is that those screenshots are now searchable — as in, the Pixel’s on-board smarts allow you to search for screenshots based on the information they contain. It saves me the trouble of having to remember when I captured a particular screenshot by just bringing up what I’m looking for, when I want it.
I feel like Apple is taking steps in this direction by expanding Visual Intelligence in iOS 26 to work with screenshots — adding calendar entries based on times and dates in a screenshot, translating words in a screenshot and running web searches on objects included in a screenshot. Cataloging the contents of those screenshots in a single app feels like the next natural step in Visual Intelligence’s evolution.
A smarter Siri that works across apps
I don’t have to tell you that Siri needs to get a lot smarter, a lot faster. But if you’ve had a chance to use the Gemini assistant on any recent Android flagship, you know how far behind Apple is when it comes to a truly intelligent assistant.
Hopefully, the iOS 26.4 update and its rumored updates to Siri will get Apple back in the game, though I don’t think it would be reasonable to expect Apple to catch up with Android with just one software update. If Apple is truly serious about making a go of Apple Intelligence and its suite of AI tools, it would be wise to work toward what Samsung currently offers with its cross-app actions through Galaxy AI.
Samsung introduced cross-app actions just about a year ago, and they really show off what on-device AI can offer, even to an AI skeptic like myself. With this feature, you can issue one command to your on-device assistant — “find me a nearby BBQ joint and text the address to my good pal Jason,” for example — and your assistant carries out that task across multiple apps. There’s no need for repeated commands or clarifications — just tell the assistant what you want done, and it goes and carries out the multi-step task.
Outside of some of the camera features mentioned above, this is probably the Android capability I miss the most when I’m back in my familiar iPhone terrain. And it’s something Apple needs to adapt on its own sooner rather than later.
[Philip Michaels has been writing about technology since 1999, most notably for Macworld and Tom’s Guide. He currently finds himself between jobs, so if you need someone who can string a few sentences together (or make your sentences read a lot better), drop him a line.]
If you appreciate articles like this one, support us by becoming a Six Colors subscriber. Subscribers get access to an exclusive podcast, members-only stories, and a special community.