
Google is testing a feature called Multisearch in the Google app

Google has added some new functionality to its self-titled app that may allow users to find things they’re searching for a little faster.

The Google app now allows users to search using both images and text within Google Lens. This means you could snap a photo and add text in order to refine your search a bit more.

The feature is called Multisearch and it’s available as a beta test within the Google app on Android and iOS. Unfortunately, the beta test is only taking place in the US so we may have to wait a while to see it in other parts of the world.

What makes this feature special is that users can glean more than just an identification of the image or a list of visually similar results. Google says that for now the best results from Multisearch come from shopping queries, but it can also be used to find things such as how best to care for a plant, even if you don’t know what the plant is. Users can also ask questions about what they see.

On that note, if you aren’t using Google Lens, you really should. Searches with Google Lens can provide additional context for images you snap, as well as translations and even shopping options. If you plan on travelling now that restrictions have lifted, the Google app is a key one to install.

“All this is made possible by our latest advancements in artificial intelligence, which is making it easier to understand the world around you in more natural and intuitive ways. We’re also exploring ways in which this feature might be enhanced by MUM, our latest AI model in Search, to improve results for all the questions you could imagine asking,” wrote Belinda Zeng, product manager for Search at Google, in a blog post.

As mentioned, Multisearch is currently being beta tested, but we’re going to keep an eye out for this feature to become more widely available.
