Google is giving its search engine and maps major updates. Here are 3 key takeaways about what’s coming

May 11, 2022, 6:15 PM UTC
What to know about Alphabet's major changes to Google search, maps, and A.I.-powered assistant

Google’s search engine as well as Google Maps and voice-activated Google Assistant are getting major upgrades.

Google announced the updates Wednesday during its annual Google I/O developer conference, which streamed its keynote online to most attendees this year, with only a limited in-person audience, because of the COVID-19 pandemic.

The tweaks are intended to keep up with changing user habits and make searching easier and more intuitive, Google senior vice president Prabhakar Raghavan told Fortune. Children, young adults, and others coming online for the first time don’t necessarily know to type two keywords into a search box, he said. “We cannot be serving the same queries and the same needs we did 20 years ago.”

Here are three takeaways from Google’s announcements:

Search will be more visual and local

People will be able to enter more complicated queries into Google’s core search using their smartphone cameras sometime later this year. With the new feature, people will be able to take photos of goods like clothing, food, and home appliances, or capture screenshots of them online, and retrieve a list of nearby restaurants and stores that sell those items.

Google had previously debuted a similar function through its Google Lens product, but Raghavan characterized that as experimental. The Lens technology, he said, is now good enough to be incorporated into Google’s core search.

“Now we feel good enough about it so that every search bar, whether your iPhone or Android, has of course a keyword bar, but also a camera and a microphone,” Raghavan said.

Google also said it is developing a new feature, “scene exploration,” that will let people more quickly retrieve information about objects they view through their smartphone cameras; it does not yet have a release date. Users will be able to scan an entire store shelf, for instance, and pull up details about every item on it rather than having to focus on one item at a time.

Now you can go inside restaurants using Google Maps 

Google Maps is getting a feature called “immersive view” that will let people explore stores, iconic buildings, and attractions in cities like San Francisco and Los Angeles in 3D. Unlike Google’s traditional 3D map view, immersive view is far more detailed, combining aerial imagery captured by drone, satellite imagery, and on-the-ground visuals. In a demonstration ahead of the conference, executives showed how the feature will let users virtually step inside certain businesses, such as cafes, to inspect the interiors before deciding whether to visit in real life.

People will also be able to see what popular world attractions like London’s Big Ben look like at different times of the day and in different weather conditions. 

The immersive view feature will debut later this year for a few cities including Los Angeles, San Francisco, London, New York City, and Tokyo. 

Talking to an A.I. assistant should become more natural 

Like competitors such as Amazon, Google is trying to make interacting with its voice-activated Google Assistant feel more like conversing with an actual human and less like talking to a chatbot that frequently misunderstands what people tell it.

The company’s “Look and Talk” feature will let customers of Google’s Nest Hub Max smart-home display activate Google Assistant by simply looking at the device’s screen and asking a question, rather than first saying “Hey Google.” The feature works only for people who have consented to have their voices and faces analyzed by the device, “so some random person cannot walk into your house and turn on the lights or whatever,” Raghavan said.

Nest Hub Max users will also be able to command Google Assistant to turn off Internet-connected home lights or set timers without having to say “Hey Google” to initiate the action. The goal is to reduce the number of times people need to say, “Hey Google,” which can be annoying.

Eventually, Google plans to update its Google Assistant so that the software doesn’t stumble when people talk naturally to it, such as when they take long pauses or utter “ums.”

