
Snap a photo of some food, and you can find out where to buy it nearby. The first change Google's making is to its "multisearch" feature - searches that combine images and text for more nuanced results - so it can now find food near you. In Google's example, you can open up the Google app, snap a photo or upload a screenshot of a specific dish, type in "near me," and then see every restaurant or store that sells the dish nearby. The idea is that searching this way should be faster than trying to find the name of the dish or the business that makes it (imagine having to look through a bunch of menus) while also, hopefully, being more accurate.

Google previously applied multisearch to fashion, where identifying something seems equally complex. In an arena like clothing, though, you might want a variation on the thing you're photographing rather than an exact replica (a photograph of a blue button-down plus the text "green," for example).
That doesn't mean traditional text-based searches aren't getting improved as well. Google says typing in a search for a specific dish will also pull up nearby restaurants that serve it. If you want truffle mac and cheese, you can find truffle mac and cheese, one way or another.

Searching for locations in the world around you

Live View overlays results on the world around you in Google Maps.

If you used to be an avid Yelp user, Google Maps' Live View might feel familiar: Yelp introduced Monocle in 2009 as a rudimentary software feature that layered Yelp results onto the world around you. (Yes, people were obsessing over AR in the early 2000s, too.) Google Maps Live View is basically that concept but leveled up.
If you're in a new city (the feature is rolling out in London, Los Angeles, New York, Paris, San Francisco, and Tokyo), you can pull out your phone, open up Google Maps, hop into Live View, and get common Maps searches (nearby restaurants, ATMs, gas stations, etc.) overlaid onto the world around you. Once you find what you're looking for and tap on it, Maps can immediately jump into AR directions to get you the rest of the way there.

There's a good chance that adding features like these to search or Google Maps could be more meaningful to the average person than anything Google Assistant can do. But they do require you to think in a different way. Speaking personally, my first instinct is to never pull out my phone's camera when I'm looking for information. That's partly because I remember when a Google search only returned text, and video wasn't the default way to learn something. Some of it, however, has to do with what using Google is like on platforms the company doesn't control.
