The Unknown Effect of Google A.I. Overviews on Search Traffic

Pew Research Center made headlines this week when it released a report on the effects of Google’s A.I. Overviews on user behaviour. It provided apparent evidence that searchers do not explore much beyond the summary when one is presented. This caused understandable alarm among journalists, who focused on two stats in particular: the share of searches that resulted in a click on a result fell from 15% to just 8% when an A.I. Overview was shown, and just 1% of searches with an Overview resulted in a click on a citation within that summary.

Beatrice Nolan, of Fortune, said this was evidence A.I. was “eating search”. Thomas Claburn, of the Register, said A.I. Overviews were “killing the web”, and Emanuel Maiberg, of 404 Media, said Google’s push to boost A.I. “will end the flow of all that traffic almost completely and destroy the business of countless blogs and news sites in the process”. Beyond those stats, Ryan Whitwam, of Ars Technica, noted Pew also found “Google users are more likely to end their browsing session after seeing an A.I. Overview” than if they do not. It is, indeed, worrisome.

Pew’s is not the only research finding a negative impact on search traffic to publishers thanks to Google’s A.I. search efforts. Ryan Law and Xibeijia Guan, of Ahrefs, published an analysis earlier this year of anonymized and aggregated Google Search Console data, finding a 34.5% drop in click-through rate when A.I. Overviews were present. That is smaller than the decline found by Pew (the drop from 15% to 8% works out to a roughly 47% relative fall), but it is still massive.

Ahrefs gives two main explanations for this decline in click-through traffic. First, and most obviously, these Overviews present themselves as answering a query without the need to visit any other page. Second, they push results further down the page; on a phone, an Overview may occupy the whole height of the display, as shown in Google’s many examples. Either one of these factors could be suppressing clicks through to the underlying pages.

So we have two different reports showing, rather predictably, that Google’s A.I. Overviews kneecap click rates on search listings. But these findings are complicated by the various other boxes Google might show on a results page, none of which are what Google calls an “A.I.” feature. There is a slew of Rich Result types: event information, business listings, videos, and plenty more. There are Rich Answers for when you ask a general knowledge question. There are Featured Snippets that extract and highlight information from a specific page. These “zero-click” features all look and behave similarly to A.I. Overviews. They all try to answer a user’s question immediately. They all push organic results further down the page. So what is different about results with an A.I. twist?

Part of the problem is methodology. That déjà vu you are experiencing is because I wrote about this earlier this week, but I wanted to reiterate and expand upon it. The ways Pew and Ahrefs collected their click-through data differ considerably. Pew, via the Ipsos KnowledgePanel, collected browsing data from 900 U.S. adults; researchers then used a selection of keywords to identify search result pages with A.I. Overviews. Ahrefs, on the other hand, relied on Google Search Console data provided automatically by users who connected their sites to the company’s search optimization software. Ahrefs compared data collected in March 2024, before the A.I. rollout, against data from March 2025, after Google made A.I. Overviews more prevalent in search results.
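
To make the two headline figures easier to compare, here is a rough sketch, in Python, of the arithmetic both studies ultimately boil down to: a click-through rate is clicks divided by impressions (or, in Pew’s framing, clicked searches divided by total searches), and the quoted drops are relative changes between two buckets. The numbers and names below are invented for illustration and are not drawn from either data set.

```python
# Illustrative sketch only: the figures below are invented, not taken
# from Pew or Ahrefs; they just show how the headline numbers are derived.

def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate: clicks divided by impressions (or searches)."""
    return clicks / impressions

def relative_drop(before: float, after: float) -> float:
    """Relative decline between two rates, e.g. 0.15 -> 0.08 is about 47%."""
    return (before - after) / before

# Pew-style framing: share of searches that produced a click on a result.
without_overview = 0.15   # 15% of searches led to a click on a result
with_overview = 0.08      # 8% when an A.I. Overview was shown
print(f"Pew-style relative drop: {relative_drop(without_overview, with_overview):.0%}")

# Ahrefs-style framing: CTR from Search Console clicks and impressions,
# compared across two periods (hypothetical counts).
march_2024 = ctr(clicks=5_200, impressions=100_000)
march_2025 = ctr(clicks=3_400, impressions=100_000)
print(f"March 2024 CTR: {march_2024:.1%}, March 2025 CTR: {march_2025:.1%}")
print(f"Relative drop: {relative_drop(march_2024, march_2025):.0%}")
```

The point is only that Pew’s roughly 47% figure and Ahrefs’ 34.5% figure are both relative declines, measured over very different populations and time frames.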

Neither report makes an effort to distinguish between searches with A.I. Overviews present and those with the older search features mentioned above, and that would affect average click-through rates. Since Featured Snippets rolled out, for example, they have been considered the new first position in results and, unlike A.I. Overviews in the findings of Pew and Ahrefs, they can drive a lot of traffic. Search optimization studies are also pretty inconsistent about how common they are, finding Featured Snippets on anywhere from 11% of searches, according to Stat, to as many as 80%, according to Ahrefs.

But the difference is even harder to research than it seems, because A.I. Overviews do not necessarily replace Featured Snippets, nor are the two independent of each other. There are queries for which Overviews are displayed that previously had no such additional features, and there are queries where Featured Snippets are being replaced. Sometimes, the results page will show both an A.I. Overview and a Featured Snippet. There does not seem to be a lot of good data disentangling what effect each of these features has in this era. A study from Amsive earlier this year found the combined display of Overviews and Snippets reduced click-through rates by 37%, but Amsive did not publish a full data set to permit further exploration.

But publishers do seem to be feeling the effects of A.I. on traffic from Google’s search engine. The Wall Street Journal, relying on data from Similarweb, reported a precipitous drop in search traffic to mainstream news sources like Business Insider and the Washington Post from 2022 to 2025. Similarweb said the New York Times’ share of traffic coming from search fell from 44% to 36.5% in that time. Interestingly, Similarweb’s data did not show a similar effect for the Journal itself; its share of traffic derived from search rose by five percentage points over the same period.

The quality of Similarweb’s data is, I think, questionable. It would be better if we had access to a large-scale first-party source. Luckily, the United States government operates its own analytics software and provides open access to the data. Though it is not used on all U.S. federal government websites, its data set is both general-purpose (albeit U.S.-focused) and huge: 1.55 billion sessions in the last thirty days. As of writing, 44.1% of traffic in the current calendar year is from organic Google searches, down from 46.4% in the previous calendar year. That is not the steep decline found by Similarweb, but it is a decline nevertheless, and enough to drop organic Google search traffic behind direct traffic. I also imagine Google’s A.I. Overviews impact different types of websites differently; the research from Ahrefs and Amsive seems to back this up.
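
Because the program provides open access to its data, the year-over-year share comparison above is easy to reproduce. Here is a minimal sketch of that calculation, assuming a hypothetical CSV export of sessions broken out by year, traffic source, and medium; the file name and column names are placeholders, not the program’s actual schema.

```python
# Minimal sketch of the year-over-year share calculation described above.
# "sessions_by_source.csv" and its columns (year, source, medium, sessions)
# are hypothetical placeholders, not the analytics program's real schema.
import csv
from collections import defaultdict

total_sessions = defaultdict(int)   # all sessions, per year
google_organic = defaultdict(int)   # organic Google search sessions, per year

with open("sessions_by_source.csv", newline="") as f:
    for row in csv.DictReader(f):
        year = row["year"]
        sessions = int(row["sessions"])
        total_sessions[year] += sessions
        if row["source"] == "google" and row["medium"] == "organic":
            google_organic[year] += sessions

for year in sorted(total_sessions):
    share = google_organic[year] / total_sessions[year]
    print(f"{year}: {share:.1%} of sessions from organic Google search")
```

The shift from 46.4% to 44.1% cited above is exactly this kind of ratio, computed for two calendar years.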

Google has, naturally, disputed the results of Pew’s research. In an extended comment to Search Engine Journal, the company said Pew “use[d] a flawed methodology and skewed queryset that is not representative of Search traffic”, adding “[we] have not observed significant drops in aggregate web traffic”. What Google sees as flaws in Pew’s methodology is not disclosed, nor does the company provide any numbers to support its side of the story. Sundar Pichai, Google’s CEO, has even claimed A.I. Overviews are better for referral traffic than links outside Overviews — but, again, has never provided evidence.

Intuitively, it makes sense to me that A.I. Overviews are going to have a negative impact on click-through rates, because that is kind of the whole point. The amount of information being provided to users on the results page increases while the source of that information is minimized. It also seems like the popular data sources for A.I. Overviews are of mixed quality; according to a Semrush study, Quora is the most popular citation, while Reddit is the second-most popular.

I find all of these studies frustrating, and it is not necessarily the fault of the firms conducting them. Try as the search optimization industry might, we still do not have terrifically reliable ways of measuring the impact each new Google feature has on organic search traffic. The party in the best possible position to demystify this, Google, tends to be extremely secretive on the grounds that it does not want people gaming its systems. Also, given the vast disconnect between the limited amount Google is saying and the findings of researchers, I am not sure how much I trust its word.

It is possible we cannot know exactly how much of an effect A.I. Overviews will have on search traffic, let alone that of “answer engines” like Perplexity. The best thing any publisher can do at this point is to assume the mutual benefits are going away, and not just in search. Between Google’s legal problems and its fundamental reshaping of how people discover things in search, one has to wonder how it will evolve its advertising business. Publishers have already been prioritizing direct relationships with readers. What about advertisers, too? Even with the unknown future of A.I. technologies, it seems like it would be advantageous to stop relying so heavily on Google.