Google is testing its new “Ask Photos” feature, which lets users interact with their picture libraries in conversational ways. First previewed in May, the feature is now available to select Google Labs users in the U.S. With it, you can ask questions like, “Where did we camp last time in Yosemite?” or “What did we eat at the hotel in Stanley?” Powered by Google’s Gemini AI models, the Photos app answers based on the content of your photos and surfaces the relevant images.
Google is also integrating an AI assistant into Photos that can, for example, summarize a recent vacation or pick the best family photos for a shared album. Users can join the waitlist on Google’s website to get access.
When using Ask Photos, you can also switch back to the traditional “classic search.” This classic mode has been upgraded to accept natural language queries, such as “Alice and me laughing” or “Kayaking on a lake surrounded by mountains,” and lets you sort results by date or relevance. The update is rolling out in English on both Android and iOS, with support for additional languages set to arrive in the coming weeks.
Alongside these changes, Google Photos has replaced the Library tab with a new Collections tab, designed to streamline the process of finding photos and videos. Although I haven’t explored the new tab extensively yet, I’m eager to use the natural language search. It should let me locate specific images more efficiently, without scrolling through countless photos or filtering them by location.