Google Comments on Its Sloppy Summaries

Liz Reid, head of Google Search, on the predictably bizarre results of rolling out its “A.I. Overviews” feature:

One area we identified was our ability to interpret nonsensical queries and satirical content. Let’s take a look at an example: “How many rocks should I eat?” Prior to these screenshots going viral, practically no one asked Google that question. You can see that yourself on Google Trends.

There isn’t much web content that seriously contemplates that question, either. This is what is often called a “data void” or “information gap,” where there’s a limited amount of high quality content about a topic. However, in this case, there is satirical content on this topic … that also happened to be republished on a geological software provider’s website. So when someone put that question into Search, an AI Overview appeared that faithfully linked to one of the only websites that tackled the question.

This reasoning sounds almost circular in the context of what A.I. answers are supposed to do. Google loves demonstrating how users can enter a query like “suggest a 7 day meal plan for a college student living in a dorm focusing on budget friendly and microwavable meals” and see a grouped set of responses synthesized from a variety of sources. That is surely a relatively uncommon query. I was going to prove that in the same way as Reid did, but when I enter it in Google Trends, I get a 400 error. Even a shortened version is searched so rarely it has no data.

The organic, non-A.I. search results for the long query are plentiful, but they do not exactly fulfill its specific criteria. Most of the links I saw are not microwave-only, or are simple lists not grouped into particular meal types. Nothing I could find specifically answers the question posed. To fulfill the query in the demo video, Google’s search engine has to look through everything it knows, find meals that can be cooked in a microwave, and organize them into a daily plan of different meal types.

But Google is also blaming the failure of its A.I. features on the novelty of the rocks query and the satirical information directly answering it. In other words, it wants to say the cool thing about its A.I. stuff is that it can handle unpopular or new queries by sifting through the web and merging together a bunch of stuff it finds. The bad thing about its A.I. stuff, it turns out, is basically the same.

Benj Edwards, Ars Technica:

Here we see the fundamental flaw of the system: “AI Overviews are built to only show information that is backed up by top web results.” The design is based on the false assumption that Google’s page-ranking algorithm favors accurate results and not SEO-gamed garbage. Google Search has been broken for some time, and now the company is relying on those gamed and spam-filled results to feed its new AI model.

Reid says Google has made a bunch of changes to address the issues raised, but none of them addresses the fundamental shift these A.I. answers represent. Google used to be a directory — admittedly one ranked by mysterious criteria — allowing users to decide which results best fit their needs. It has slowly repositioned itself as something that answers their queries with authority. Its A.I. answers are a fuller realization of features like Featured Snippets and the Answer Box. That is: instead of seeing options that may match their query, searchers are now given singular answers. Google has transformed from a referrer into an omniscient responder.