How Google Photos Makes It Easier to Find Your Favorite Photos
Google Photos has introduced a new feature that makes it easier to search for specific photos and videos. On Thursday, the Mountain View tech giant announced the rollout of Descriptive Queries, an improved version of the app’s native search functionality. It lets users type a query in plain language and find more specific results, and it is rolling out to all users on iOS and Android. Notably, the company is also giving US users early access to Ask Photos, a yet-to-be-released AI-powered feature that uses Gemini to find images.
Google Photos updated with support for descriptive searches
In a blog post, the tech giant announced the Descriptive Queries feature. It is an improvement on Google Photos’ native search functionality, accessible from the bottom of the app. Previously, users could search for specific images by adding relevant keywords, but there was no way to further refine the search results.
With the improvement, users can type natural language queries as long as a full sentence to find specific images and videos. For example, searching for “Alice and I laughing” or “Kayaking on a lake surrounded by mountains” will return relevant results. Google said users will also be able to use vague language in their searches.
Notably, this isn’t a Gemini-driven feature, and it’s rolling out to all users on iOS and Android. For now, however, Descriptive Queries works only if the user’s smartphone language is set to English; Google says the search experience will expand to more languages in the coming weeks.
Additionally, Google Photos lets users sort results by date or relevance, making it even easier to find the right image.
Google is also rolling out early access to Ask Photos, a recently introduced AI-powered feature for finding images and videos within the app. Powered by Gemini, it lets users search by typing conversational prompts. Gemini uses computer vision to process a user’s Google Photos library and surface relevant images.
Ask Photos is an experimental Google Labs feature, currently available in early access to select users in the US. The company said it will roll out to more users later this year.