Interesting piece from Kay Yin on Medium, describing the list of things that can apparently be recognized by the new Advanced Computer Vision in Photos, which Apple demoed at WWDC last week:
The Photos app supports detecting 4,432 different scenes and objects, and these scenes or objects can be searched for in all languages.
Additionally, you can search for various landmarks. For example, Photos can respond to a search query for “Maho” (a beach in Saint Martin), despite not being programmed or trained to recognize specific landmarks. Behind the scenes, the Photos app first generates a generic categorization for the scene, “beach”, then searches through a built-in dictionary for all landmarks that have the word “beach” in their definitions. So, cleverly, even though the Photos app knows nothing about “Maho” in particular, it can still return the right results. The same applies to nature scenes, water scenes, and urban scenes.
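The landmark trick described above is easy to sketch: the classifier only emits a generic scene label, and the landmark query is matched against a dictionary of landmark names whose definitions mention that label. Here's a minimal illustration in Python; the dictionary entries, function names, and data are all hypothetical, not Apple's actual implementation:

```python
# Illustrative only: a tiny stand-in for the built-in landmark dictionary.
# Each entry maps a landmark name to a short definition.
SCENE_DICTIONARY = {
    "Maho": "beach in Saint Martin",
    "Bondi": "beach in Sydney, Australia",
    "Louvre": "art museum in Paris",
}

def landmarks_for_scene(scene_label):
    """Return landmark names whose definition mentions the scene label."""
    return [name for name, definition in SCENE_DICTIONARY.items()
            if scene_label in definition.lower()]

def search(query, photo_scene_labels):
    """Match a landmark query against a photo's generic scene labels."""
    return [scene for scene in photo_scene_labels
            if query in landmarks_for_scene(scene)]

# A photo classified only as "beach" still matches a search for "Maho":
print(search("Maho", ["beach", "urban"]))  # → ['beach']
```

The point of the design is that the vision model never has to learn thousands of proper nouns; a plain dictionary lookup bridges the gap between generic scene labels and specific landmark names.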
There’s a truly huge list of all the things that Photos will be able to search for, from the everyday (“Corgi” and “Teapot”) to the somewhat less common (“Sousaphones” and “Bivouacking”). Also included are the seven facial expressions Photos can apparently recognize (Greedy, Disgust, Neutral, Scream, Smiling, Surprise, Suspicious) and the 33 categories of Memories.
Be right back: going to search for “suspicious corgis.”