Third Circuit’s Section 230 TikTok Ruling

Maryclaire Dale, Associated Press:

A U.S. appeals court revived on Tuesday a lawsuit filed by the mother of a 10-year-old Pennsylvania girl who died attempting a viral challenge she allegedly saw on TikTok that dared people to choke themselves until they lost consciousness.

While federal law generally protects online publishers from liability for content posted by others, the court said TikTok could potentially be found liable for promoting the content or using an algorithm to steer it to children.

Notably, the “Blackout Challenge” or “Choking Game” is one of the few internet challenges for teenagers that is neither a media-boosted fiction nor relatively harmless. It has been circulating for decades, and was connected with 82 deaths in the United States alone between 1995 and 2007. Which, yes, predates TikTok and even social media as we know it today. Melissa Chan reported in a 2018 Time article that its origins go back to at least the 1930s.

Mike Masnick, of Techdirt, not only points out the extensive Section 230 precedent ignored by the Third Circuit in its decision, but also highlights the legal limits of publisher responsibility:

We have some caselaw on this kind of thing even outside of the internet context. In Winter v. G.P. Putnam’s Sons, it was found that the publisher of an encyclopedia of mushrooms was not liable to “mushroom enthusiasts who became severely ill from picking and eating mushrooms after relying on information” in the book. The information turned out to be wrong, but the court held that the publisher could not be held liable for those harms because it had no duty to carefully investigate each entry.

Matt Stoller, on the other hand, celebrates the Third Circuit’s ruling as an end to “big tech’s free ride on Section 230”:

Because TikTok’s “algorithm curates and recommends a tailored compilation of videos for a user’s FYP based on a variety of factors, including the user’s age and other demographics, online interactions, and other metadata,” it becomes TikTok’s own speech. And now TikTok has to answer for it in court. Basically, the court ruled that when a company is choosing what to show kids and elderly parents, and seeks to keep them addicted to sell more ads, they can’t pretend it’s everyone else’s fault when the inevitable horrible thing happens.

And that’s a huge rollback of Section 230.

On a legal level, Masnick and Stoller agree that the Third Circuit’s ruling creates a massive change in U.S. internet policy and, because so much of the global internet operates under U.S. law, in internet policy worldwide. But they vehemently disagree on whether this is a good thing. Masnick says it is not, and I am inclined to agree. Not only is there legal precedent on his side, there are plenty of very good reasons why Section 230 is important to preserve more or less as it has existed for decades.

However, it seems unethical for TikTok to bear no culpability for how users’ dangerous posts are recommended, especially to children. Perhaps legal recourse is the wrong avenue in this case and others like it, yet it feels unjust for this case eventually, after appeals and probable escalation to the Supreme Court, to be summarily dismissed on the grounds that corporations have little responsibility for, or care about, their automated recommendations. There is a real difference between teenagers spreading this challenge one-on-one for decades and teenagers broadcasting it; or, at least, there ought to be a difference.