Evelyn Douek, Wired:
There are many important questions that could be asked at the Senate Judiciary Committee hearing with tech CEOs today regarding their handling of the US 2020 election. Foremost among them should be “Where is Susan Wojcicki, YouTube’s CEO?” The election was billed as a major test for social media platforms, but it’s one that YouTube failed weeks before election day. The platform is playing host to, and is an important vector for, spreading false claims of election victory and attempts to delegitimize Biden’s win. YouTube had to have seen it all coming, and it shrugged. That’s YouTube’s fault — but it’s also a result of the success of its broader strategy to keep its head down and let other platforms be the face of the content moderation wars. In general, the media, researchers, and lawmakers have let this strategy work.
Forget misinformation and extremism for a moment — which is a wild sentence, I know. Think how strange it is that Wojcicki has escaped virtually all antitrust scrutiny.
It is impossible to recognize today’s internet without YouTube. It is the video hosting site on the web, so dominant that the company has been able to make it worse to use without consequence. Five hundred hours of video are uploaded to the site every minute. Just about everyone with an acceptable connection uses YouTube. Yet, despite YouTube’s enormous footprint and power, Wojcicki has never been called before the U.S. Congress on antitrust issues.1
Is that solely because YouTube is owned by Google, and Sundar Pichai has already testified? It must be. But if the objective is to hear from the person best positioned to answer these questions, that person is YouTube’s CEO. YouTube is a distinct company owned by Google, and Wojcicki runs it. She ought to testify.
I also think this, from Douek’s article, is well-put:
This is not a call for a swath of new policies banning any and all false political content (whatever that would mean). In general, I favor intermediate measures like aggressive labelling, de-amplification, and increased friction for users sharing it further. But most of all, I favor platforms taking responsibility for the role they play in our information ecosystem, thinking ahead, being transparent, explaining their content moderation choices, and showing how they have been enforced. Clear policies, announced in advance, are an important part of platform governance: Content moderation must not only be done, but it must be seen to be legitimate and understood.
This is a very good editorial.
YouTube’s power primarily affects creators — those who earn ad revenue from video views — and advertisers, which, in turn, affects users. I have always wondered how it is possible that YouTube is the only major general-purpose video hosting platform. ↩︎