Google’s doing a lot of things with AI and machine learning, many of them pretty impressive. But from an Apple perspective, I found this portion of Harry McCracken’s interview with Google senior VP John Giannandrea interesting:
TensorFlow Lite, a new offshoot of Google’s open-source TensorFlow software for creating machine-learning applications, runs directly on Android devices, enabling features in Android O that are smart enough, for instance, to notice that the text you’re trying to highlight is an address. It wouldn’t work nearly as smoothly if it had to talk to a server across an internet connection.
“You want to do machine learning on the device as much as you possibly can,” Giannandrea says. “It’s lower latency, it’s closer to the user, it’s distributed.”
What’s interesting is that, due to privacy concerns, this has been Apple’s approach all along: doing machine learning on the device, rather than in the cloud. Google’s now driving in the same direction, not because of a lack of prowess in cloud AI, but because it’s better for users if as much of that work as possible runs locally.