
Google's AI Mode Brings Conversational Visual Shopping to U.S. Users

Shop with your eyes and words. Google's new AI Mode lets you find products by image and refine results with follow-up questions.



Google has rolled out an update to its AI Mode this week, allowing U.S. users to shop conversationally using visual search. This feature is initially available to some Google AI Pro and Ultra subscribers in the U.S., with no release date announced for other regions.

The new AI Mode supports visual search, combining images and natural language in conversations. Users can start a search with text or an image and refine results with follow-up questions. This feature is powered by Gemini 2.5's advanced multimodal and language understanding.

On mobile devices, users can search within a specific image and ask conversational follow-ups about what they see. Google uses a technique called 'visual search fan-out' to better understand images and user queries. Each image in the search results links to its source.
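
Google has not published implementation details for visual search fan-out, but the idea can be sketched as issuing several narrower queries derived from one image, one for the overall scene and one per detected object, then merging the results. The functions `describe_scene`, `detect_objects`, and `search_products` below are hypothetical placeholders, not real Google APIs.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical building blocks: stand-ins for a multimodal model and a
# product search backend. None of these are real Google APIs.
def describe_scene(image_bytes: bytes) -> str:
    """Return a text description of the whole image (e.g. 'maximalist bedroom')."""
    raise NotImplementedError

def detect_objects(image_bytes: bytes) -> list[str]:
    """Return labels for individual objects in the image (e.g. ['velvet headboard'])."""
    raise NotImplementedError

def search_products(query: str) -> list[dict]:
    """Return product listings for a single text query."""
    raise NotImplementedError

def visual_search_fan_out(image_bytes: bytes, user_text: str = "") -> list[dict]:
    """Fan one image out into several narrower queries and merge the results."""
    # One query for the scene as a whole, plus one per detected object,
    # each optionally combined with the user's own wording.
    queries = [describe_scene(image_bytes)] + detect_objects(image_bytes)
    if user_text:
        queries = [f"{q} {user_text}" for q in queries]

    # Run the sub-queries in parallel and de-duplicate by product id.
    with ThreadPoolExecutor() as pool:
        result_lists = pool.map(search_products, queries)

    merged: dict[str, dict] = {}
    for results in result_lists:
        for product in results:
            merged.setdefault(product["id"], product)
    return list(merged.values())
```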

Google's Shopping Graph, spanning over 50 billion product listings, is updated every hour with details like reviews and stock status. This allows users to shop conversationally without using conventional filters. Google provides an example of searching for 'maximalist bedroom inspiration' and refining it with 'more options with dark tones and bold prints'.
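
To make the conversational refinement concrete, here is a minimal sketch of turning a follow-up phrase into structured filters over product listings. The attribute names (`tone`, `pattern`) and the tiny keyword mapping are assumptions for illustration, not how the Shopping Graph or Gemini 2.5 actually works.

```python
# Hypothetical product records; real Shopping Graph listings carry far more
# detail (reviews, stock status, hourly updates).
LISTINGS = [
    {"name": "Teal velvet duvet", "tone": "dark", "pattern": "bold"},
    {"name": "White linen duvet", "tone": "light", "pattern": "plain"},
    {"name": "Emerald damask throw", "tone": "dark", "pattern": "bold"},
]

# Toy mapping from follow-up wording to structured filters -- an assumption
# standing in for the language understanding the article attributes to Gemini 2.5.
KEYWORD_FILTERS = {
    "dark tones": ("tone", "dark"),
    "bold prints": ("pattern", "bold"),
}

def refine(listings: list[dict], follow_up: str) -> list[dict]:
    """Keep only listings matching every filter implied by the follow-up text."""
    active = [f for phrase, f in KEYWORD_FILTERS.items() if phrase in follow_up.lower()]
    return [item for item in listings if all(item.get(k) == v for k, v in active)]

# "more options with dark tones and bold prints" keeps the duvet and the throw.
print(refine(LISTINGS, "more options with dark tones and bold prints"))
```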

AI Mode's visual search feature is rolling out this week in English in the U.S., initially to some Google AI Pro and Ultra subscribers. It allows users to shop conversationally using images and text, with results linked to their sources. No release date has been announced for regions outside the U.S.
