Seattle, USA – Jan 22, 2022: The new Google campus in South Lake Union early in the evening.
Although it has not integrated a chatbot into its search engine, Google also uses AI to steer your searches in the right direction.
It’s been hard to ignore in recent weeks: generative models like ChatGPT are everywhere. The chatbot, and AI in general, could change how we look at the internet and how we use it. Microsoft has integrated ChatGPT into its Bing search engine and Edge browser. Google is not being left behind, and is implementing AI in its search functions in a completely different way.
Visual search
One of the ways Google uses AI is to facilitate visual search. Today it is already possible to perform a visual search with Google Lens: you upload an image or take a photo to start a search. That functionality will soon be expanded and integrated into Android smartphones.
In the near future it will be possible to start a search directly from the screen of your smartphone. That may sound trivial, as if we already do it every day, but it changes the way we search. You will no longer need to open the Google Search or Lens app to start a search. Instead, you’ll be able to hold down your device’s lock button to summon Google Assistant, which will then give you the option to ‘search’ your screen. This way you can search for what you need directly from your favorite website, messaging app or video app.
In addition, Lens is getting an upgrade with ‘combined search’, or multisearch. Do you see a coffee table that you like, but would prefer in a different shape? You can upload a photo of a round coffee table in Google Lens and enter the search term ‘rectangular’. Your search results will then consist of rectangular coffee tables similar to the one you uploaded.
AI in Google Maps
Google Maps also uses AI in different ways. For example, Google is introducing ‘Immersive View’, a way to visit a place digitally. To build the feature, AI combines Street View and aerial imagery into a digital model of the world. To do that, Google uses neural radiance fields, an advanced AI technique that can create 3D models from photos. Immersive View allows you to walk around a place through your phone’s screen.
Are you on location, but can’t find your way? AR and AI can help there too. With Google Live View you simply hold up your phone, and signposts appear on the screen to help you navigate. The feature is currently only available in a handful of major cities such as London, Los Angeles, New York and Tokyo, and it will come to Barcelona, Dublin and Madrid in the coming months. Google is working on a wider rollout of Live View, but it may take a while before Belgium gets it too. For now, the feature is mainly useful when you’re traveling in a city where Live View is available.
Equally useful when traveling is a similar feature for indoor spaces, or rather for airports, since the function is mainly aimed at those. Can’t find your way in an airport or train station? Then you can activate Live View to make directional arrows appear on your phone screen.
Electric cars
Finally, AI makes it a lot more pleasant to drive around in an electric car. If Google Maps is integrated in your car, the map app automatically suggests charging breaks. These are not suggested haphazardly: Maps takes traffic conditions, your charge level and your rough energy consumption into account, and calculates the ideal times and places to take a charging break.
Other Google projects
Google also has plenty of other projects in which AI plays a role. For example, it recently announced MusicLM, an AI model that can make music. AI also plays a role in projects such as Blob Opera, where an AI model interprets the voices of opera singers and lets you play around with them yourself.
The 3D models featured in Google Arts & Culture, which are mainly made for educational purposes, are also created using AI.