Kirsten Grind, Sam Schechner, Robert McMillan, and John West, Wall Street Journal:
Twenty years ago, Google’s founders began building a goliath on the premise that its search algorithms could do a better job combing the web for useful information than humans. Google executives have said repeatedly — in private meetings with outside groups and in congressional testimony — that the algorithms are objective and essentially autonomous, unsullied by human biases or business considerations.
The company states in a Google blog, “We do not use human curation to collect or arrange the results on a page.” It says it can’t divulge details about how the algorithms work because the company is involved in a long-running and high-stakes battle with those who want to profit by gaming the system.
But that message often clashes with what happens behind the scenes. Over time, Google has increasingly re-engineered and interfered with search results to a far greater degree than the company and its executives have acknowledged, a Wall Street Journal investigation has found.
Instead of blockbuster findings of manual intervention favoring specific viewpoints or political parties, though, it seems that the Journal completely botched this report.
The truth is, I spoke to a number of these Wall Street Journal reporters back in both March and April about this topic, and it was clear then that they had little knowledge about how search worked. Even a basic understanding of the difference between organic listings (the free search results) and paid listings (the ads in the search results) eluded them. They seemed to have one goal: to come up with a sensational story about how Google is abusing its power and responsibility for its own gain.
Google is certainly not perfect, but almost everything in the Wall Street Journal report is incorrect. I’ll go through many of the points below.
I think Schwartz’s piece says almost everything that needs to be explained about how badly the Journal got this one wrong, but I’ll add two additional observations:
The Journal has a neat feature where you can pick from the search queries they tested and compare Google’s results against those of Duck Duck Go and Bing. I like and use Duck Duck Go regularly, but it’s clear that Google’s results are often stronger for more vague search queries.
For example, a Google search for “Elizabeth Warren” always resulted in links to Warren’s campaign website, her Wikipedia page, and her U.S. Senate page — note that the Journal does not preserve rankings but, instead, lists pages based on how often they appeared in results. The same query in Duck Duck Go returned different results: while her campaign website and U.S. Senate page also appeared 100% of the time, so, too, did her shop, plus a page on a website called “Married Wiki” with the title “Elizabeth Warren wiki, affair, married, Lesbian with age”. That page of questionable reliability appeared in only 58% of the Journal’s tests with Bing, and never with Google.
More shocking and egregious is the way Duck Duck Go and Bing handle the query “How do I kill myself”: both almost entirely list results that answer the question directly. That’s logical from a purely technical perspective, but it’s callous and uncaring compared to Google’s choice to show suicide prevention resources, including placing the National Suicide Prevention Lifeline’s phone number (1-800-273-8255) in an information box above all other links.
The Journal cannot seem to decide how to frame this story. They frequently hint at corporate malfeasance, partisanship, and underhandedness, but they never quite stick the landing, falling back instead on the ways in which Google simplified its public statements to officials who lack any real understanding of tech companies.
Part of this fear-mongering is a result of speculation about Google’s ranking methodology, as the company deliberately keeps it a secret. Part of it is a lack of understanding. And part of it is that there are people out there who are simply too drunk with ideological rage to see that the Gateway Pundit is not a reliable news source; it is completely legitimate for fiction to rank poorly in search queries related to current events.