While text-based search results remain the most widespread way to find information online, the ubiquity of smartphone cameras has ushered in a new wave of image-based search queries. Google addresses many of these queries with Google Lens, the company’s image recognition technology, which is used more than 10 billion times a month.
While consumers can already use Lens right from the Google Search bar, a new update will let users search for anything that appears on their screen. The new “search your screen” capability in Google Lens surfaces more information about objects and places shown on screen. It works across all apps and delivers results without requiring users to leave the app they are in. To invoke it, users long-press the home button (or power button) to summon Google Assistant and then tap the “search screen” option.
The second update to Google Search is the “multisearch” feature, which lets users combine two search types in a single query: they can search with a picture and add a few words of text about it to get more accurate, contextualized results. In the U.S., multisearch has also been enhanced so that users can search with an image and check whether the item is available at nearby local businesses.
Google confirmed that its new AI-focused search enhancements are rolling out worldwide and can be used in all countries where Google Lens is available.