Google's new multisearch "near me" feature can be very useful

Multisearch near me and visual exploration can level up your Lens search game

When Google was founded in 1998, its first product was Search, and to this day it remains widely used, serving billions of people and processing countless queries daily.

It is one of the services that Google works to improve every day.

At I/O 2022, Google announced changes to its most recent search innovation, multisearch, that should improve your search and online shopping experience.

About a month ago, Google introduced us to the concept of multisearch.

Building on Lens's image recognition capabilities, multisearch lets you search with an image and add context to that subject with text, helping you steer your search in the right direction.

Although it hasn't rolled out worldwide yet, it is already getting major improvements announced at Google's developer conference.

Multisearch was pitched as the virtual equivalent of pointing at something and asking a friend whether they had seen it somewhere.

But a friend would point you to the nearest store that carries it, rather than just showing you a link to buy it online.

Although online shopping is more popular than ever, there are still things you'd rather check out in person, so Lens aims to do what that friend would.

Using multisearch near me, you can now pull up local results from Google's vast index of millions of businesses.

Just search with a photo or screenshot of the item you want, add "near me" as the accompanying text, and Google will try to find it at the businesses closest to you.

This is great if you're searching for a dress or gown you want to try on before buying, or if you've spotted a meal or dessert that looks delicious but you don't know what it's called.

Now you can use Lens to see whether places near you sell what you need, then walk or drive over to see for yourself.

Currently, Lens lets you identify a single object in the frame and see whether Google can find it somewhere on the web.

While this is already very useful, especially with multisearch and multisearch near me, if you want to scan several things at once you have to do it one by one, which gets tedious quickly.

That's why Google is adding a whole new angle to visual search in multisearch.

Visual exploration lets you pan your camera across a wider view and get information about multiple objects in the frame at once.

For example, if you're buying a nut-free candy bar for a friend and don't know exactly which one to get, Google could help you scan the shelf and pick the right one.

Multisearch near me will be available in English later this year and is expected to arrive in multiple languages by 2023.

As for visual exploration, Google has not provided a release timeline, but it is expected to arrive in the future.
