Meta Must Face Youth Addiction Lawsuit by Massachusetts, Court Rules ⇥ reuters.com
Nate Raymond, Reuters
Meta Platforms must face a lawsuit by Massachusetts’ attorney general alleging the company designed its Instagram social media platform to addict children, the state’s top court ruled on Friday.
[…]
Writing for the unanimous court, Justice Dalila Argaez Wendlandt said the lawsuit brought by Massachusetts Attorney General Andrea Joy Campbell does not seek to hold Meta liable for content created by its users — which Section 230 of the Communications Decency Act of 1996 generally shields companies from — but targets the company’s conduct.
The Electronic Privacy Information Center:
In this case, as in many recent ones, the Court found that Section 230 does not prohibit claims alleging the companies designed their platforms harmfully and lied about their activities. Meta pushed its typical Section 230 test, claiming the law preempts any claim premised on Meta’s publishing activity. But the Court corrected Meta: Section 230 only applies to claims seeking to hold Meta liable for the harms springing directly from user-generated content they post. Meta’s design decisions, by contrast, are its own responsibility.
Mike Masnick, Techdirt:
This ignores a long list of precedents — and the explicit statements of Section 230’s authors — establishing that the law was designed to protect platforms from being sued over any editorial decision-making, including how content is presented. To put this in perspective, it’s like saying that someone could sue, say, the evening news based on where they placed a story (top of the show or bottom?) and that the impact of how it was presented is somehow unrelated to the content itself. That makes no sense. But it’s the way this court has interpreted 230.
Eric Goldman, Technology & Marketing Law Blog:

Even if this opinion doesn’t outright eliminate Section 230 in Massachusetts, it’s a sign of how 230 workarounds keep proliferating, contributing to the swiss cheese-ification of Section 230. When the bubbles in the swiss cheese become too large, the cheese wedge lacks structural integrity and falls apart. That is where 230 is heading, if it’s not already there.
Goldman is a lawyer, and he is worried about cases like these; recent child safety rulings in California and New Mexico also caused him great concern.
To me, a non-lawyer, much of the actual text of the ruling (PDF) explaining why this lawsuit was not immediately turfed on Section 230 grounds seems pretty reasonable. For example, the opinion notes “[u]nder the default settings, Meta enables approximately forty types of notifications” for the Instagram app, which the government alleges “is designed to overwhelm young users and compel them repeatedly to reopen Instagram”. We can argue whether this is a meaningful thing for a government to police, or just another example of Meta resorting to tacky growth-hacking techniques instead of trusting that its product is sufficiently compelling on its own. (Most days when I open Instagram in my browser, it puts a red badge over the notifications tab and suggests I have one new follower. I do not; I never have. It lies to me every time, presumably because it knows most people, including me, will usually click on that, thereby increasing a number on a dashboard somewhere.)
The government also takes issue with autoplay, infinite scrolling, live videos, and disappearing stories as potential vectors for harm. Whether that is true is immaterial to whether someone should be able to make the argument in court. I, a non-lawyer, do not see why Section 230 should insulate companies from their product design choices simply because they occur on the internet. There is a tantalizing reference to Meta “deliberately manufacturing a delay between” a user refreshing their feed and new posts being displayed to, supposedly, heighten anticipation. Whether this is as described is something that can be scrutinized in court — but only if the government is allowed to make that case.
It makes complete sense to me for a company like Meta to face no legal liability for the substance of a user’s post, like if an Instagram user baselessly accuses someone of a crime in a video. It is the person making that claim who should face legal consequences. But extending this legal immunity to all facets of a platform containing user-generated material seems — as a non-lawyer with only a little bit of background knowledge — too far. I trust experts, but I am not following their logic that this would effectively repeal Section 230 and undo all the ways in which it has given rise to the modern web.
One thing is certain: given that many internet companies are headquartered in the United States, it is wild that a single ruling by a court in Massachusetts — a tiny state with a population of about seven million — could conceivably change the way the web works for just about everyone around the world.