- By Prateek Levi
- Wed, 21 May 2025 12:23 PM (IST)
- Source: JND
Google I/O 2025 AI Search: At last year's I/O, Google introduced AI Overviews, and it has since significantly shifted how people use Google Search. It is the AI-generated answer that appears at the top of results when you search for longer, more complex, and more multimodal queries. AI in Search lets people ask questions directly in the search engine and get a useful response, attached with links to the sources so you can check them out on your own. This has made AI Overviews one of the "most successful launches in Search in the past decade", according to the tech giant.
The market is currently filled with AI chatbots and assistants, which has prompted Google to redesign its search around artificial intelligence (AI). This was anticipated: how people search is changing, and rather than typing simple keywords, users increasingly ask queries that require complex reasoning. Another reason for Google to take this step is the growing popularity of rivals such as ChatGPT and Perplexity. To answer them, Google is introducing AI Mode in Search, which is meant to offer an end-to-end AI search experience.
Pichai said, "With more advanced reasoning, you can ask AI Mode longer and more complex queries. In fact, early testers have been asking queries that are two to three times the length of traditional searches, and you can go further with follow-up questions. All of this is available as a new tab right in Search."
This new AI Mode has been launched for all users in the US at the Google I/O 2025 developers conference on May 20.
What Is AI Mode All About?
Google began testing this feature in Labs earlier this year, and it has now been rolled out to all users in the US. It uses the company's most advanced reasoning and multimodal capabilities and has the ability "to go deeper with follow-up questions and links to the web", according to the tech giant.
Under the hood, AI Mode utilises something called a "query fan-out technique". It breaks a user's question down into subtopics and issues a multitude of queries simultaneously on the user's behalf. This lets Search dive deeper into the web, producing a much more thorough result than a traditional search.
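Google has not published the implementation, but the fan-out idea itself can be sketched in a few lines. The sketch below is purely illustrative: the `decompose`, `search`, and `synthesize` functions stand in for Google-internal systems, and all names here are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical sketch of a query fan-out. The real decomposition, retrieval,
# and synthesis steps are Google-internal; these callables are stand-ins.
def fan_out(question, decompose, search, synthesize):
    """Break a question into subtopics, search them simultaneously,
    and synthesize one answer from the combined results."""
    subqueries = decompose(question)       # e.g. a model splits the question
    with ThreadPoolExecutor() as pool:     # issue the searches in parallel
        results = list(pool.map(search, subqueries))
    return synthesize(question, results)   # merge into a single response

# Toy stand-ins, just to show the control flow:
split = lambda q: [f"{q} ({aspect})" for aspect in ("price", "reviews", "availability")]
lookup = lambda sq: f"results for '{sq}'"
merge = lambda q, rs: " | ".join(rs)

print(fan_out("lightweight hiking boots", split, lookup, merge))
```

The point of the pattern is the middle step: because the sub-queries are independent, they can run concurrently, so a deeper search need not be a slower one.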
This feature builds on Google's AI-powered search experience called AI Overviews, which debuted last year. AI Overviews offers AI-generated summaries and deep links at the top of search results. Until now, it has relied on a custom version of Gemini 2.0, but it's now being upgraded to Gemini 2.5, introduced in March 2025.
Google is also widening the reach of AI Overviews, which now serves over 1.5 billion monthly users. It’s available in more than 200 countries and supports 40 languages.
"AI Mode is where we’ll first bring Gemini’s frontier capabilities, and it’s also a glimpse of what’s to come. As we get feedback, we'll graduate many features and capabilities from AI Mode right into the core search experience. Starting this week, we're bringing a custom version of Gemini 2.5, our most intelligent model, into Search for both AI Mode and AI Overviews in the U.S." Said Elizabeth Reid, VP Head of Search.
Apart from this, Google is also introducing a lot more to its search. Google is introducing several new AI-powered features in Search, aimed at making everyday tasks more efficient — all while keeping users in control. These updates build on existing tools like AI Mode, Google Lens, and the company’s Gemini models.
Deep Search for Thorough Research
For users looking for in-depth answers, Deep Search adds advanced research capabilities to AI Mode. It builds on the existing query fan-out method but extends it significantly. The system can run hundreds of searches, synthesise information from varied sources, and produce a well-cited, expert-level report within minutes — potentially replacing hours of manual research. This feature is still in development, and final results may vary.
Search Live: real-time help through your camera
Visual search also sees an upgrade. Google Lens already supports over 1.5 billion monthly users who search with their camera, and now, Search is adding real-time functionality from Project Astra. With the new "Live" feature in AI Mode or Lens, users can point their camera and ask questions on the spot. The tool can respond in real time — explaining visual elements, offering suggestions, and linking to helpful resources including websites, videos, and forums. As with other features still under development, output may vary.
Agentic tools for task automation
Google is also integrating agent-like capabilities into AI Mode via Project Mariner. For example, users looking to buy event tickets can simply ask for specific options — like “Find 2 affordable tickets for this Saturday’s Reds game in the lower level.” The system will scan available listings in real time, handle form inputs, and return relevant choices. The actual purchase takes place on the user’s chosen site. These tools will first support ticketing, restaurant reservations, and local appointments, in collaboration with platforms like Ticketmaster, StubHub, Resy, and Vagaro.
AI-assisted shopping with personalised try-ons
A redesigned shopping experience in AI Mode combines the Gemini model with Google’s Shopping Graph. It allows users to browse, evaluate, and narrow down products with more context. Virtual try-ons are also supported — users can upload a photo to see how clothes might look on them. When ready to buy, a checkout assistant can make the purchase through Google Pay once the price aligns with user preferences. The user stays in charge throughout the process.
Personalised suggestions based on search history
AI Mode will soon support personalised search results. If users opt in, it can draw from prior searches and integrate data from other Google apps like Gmail. For instance, searching for “things to do in Nashville this weekend with friends; we're big foodies who like music” may bring up restaurant options based on past bookings and nearby events linked to flight or hotel confirmations. These integrations are optional and clearly marked, with full control over what’s connected.
Data visualisations on demand
Finally, AI Mode will help visualise data through custom charts and graphs. If a user wants to compare something like home-field advantage between two baseball teams, Search will interpret the request and generate a relevant interactive graph. This feature will initially support topics in sports and finance.
Together, these updates are designed to support a wide range of user needs — from quick answers to more complex, contextual help — while offering transparency and choice throughout.