Lesley Stahl of CBS’ 60 Minutes interviewed Susan Wojcicki about the state of YouTube:
And what about medical quackery on the site? Like turmeric can reverse cancer; bleach cures autism; vaccines cause autism.
Once you watch one of these, YouTube’s algorithms might recommend you watch similar content. But no matter how harmful or untruthful, YouTube can’t be held liable for any content, due to a legal protection called Section 230.
Lesley Stahl: The law under 230 does not hold you responsible for user-generated content. But in that you recommend things, sometimes 1,000 times, sometimes 5,000 times, shouldn’t you be held responsible for that material, because you recommend it?
Susan Wojcicki: Well, our systems wouldn’t work without recommending. And so if—
Lesley Stahl: I’m not saying don’t recommend. I’m just saying be responsible for when you recommend so many times.
Susan Wojcicki: If we were held liable for every single piece of content that we recommended, we would have to review it. That would mean there’d be a much smaller set of information that people would be finding. Much, much smaller.
I entirely buy the near-impossibility of moderating a platform where hundreds of hours of video are uploaded every minute. It seems plausible that uploads could be held for initial machine review, with a human-assisted second stage — particularly for new accounts — but that’s kind of nitpicking at YouTube’s scale. Holding YouTube legally accountable for the videos users upload would not be preferable.
However, I do not buy for one second that YouTube should not be held morally accountable for the videos it recommends. The process and intention of recommendations are entirely in YouTube’s hands, and they can adjust them as they choose. Watching a video from a reputable newspaper should not suggest a video from a hate group in its “Up Next” feature. Conspiracy theories should not be the first search result, for example; they should be far harder to find. YouTube clearly agrees, and has been making changes as a result. But it isn’t enough. It’s misleading to paint uploads and recommendations with the same brush, and it is worrying that a lack of legal obligations is used to justify moral inaction.