Evelyn Douek, the Atlantic:

Recent Senate hearings — convened under the banner of “Protecting Kids Online” — focused on a whistleblower’s revelations regarding what Facebook itself knows about how its products harm teen users’ mental health. That’s an important question to ask. But if there’s going to be a reckoning around social media’s role in society, and in particular its effects on teens, shouldn’t lawmakers also talk about, um, the platforms teens actually use? The Wall Street Journal’s “Facebook Files” reports, after all, also showed that Facebook itself is petrified of young people abandoning its platforms. To these users, Facebook just isn’t cool.

So TikTok is not a passing fad or a tiny start-up in the social-media space. It’s a cultural powerhouse, creating superstars out of unknown artists overnight. It’s a career plan for young influencers and a portable shopping mall full of products and brands. It’s where many young people get their news and discuss politics. And sometimes they get rowdy: In June 2020, TikTok teens allegedly pranked then-President Donald Trump’s reelection campaign by overbooking tickets to a rally in Tulsa, Oklahoma, and then never showing up.

TikTok is an unmitigated sensation, and the best evidence for those who insist that Facebook’s acquisitions of Instagram and WhatsApp have not meaningfully diminished competition in the social media space.

Its privacy and moderation policies are also worrying. Though those policies are similar to ones for platforms created in the U.S. and elsewhere, TikTok’s moderators have also censored videos, and there is more (emphasis mine):

[…] The platform’s content moderation is opaque, but there are plenty of reasons to be concerned: It has suppressed posts of users deemed ugly, poor, or disabled; removed videos on topics that are politically sensitive in China; and automatically added beauty filters to users’ videos. The “devious licks” challenge, which prompted kids to remove soap dispensers in schools, might sound comical, but school administrators aren’t laughing. Connecticut’s attorney general wants to know what’s going on with the “slap a teacher” dare, although TikTok says that’s not its fault.

That last claim appears to be invented, or at least exaggerated, by media outlets that cannot get enough of the latest teen trend.

One thing that is certainly concerning is TikTok’s ability to steer users ever deeper into niche video categories. Like much else here, this is not unique to TikTok; YouTube is notorious for a recommendation system that used to push users down some pretty dark paths.

An investigation by the Wall Street Journal this summer found that TikTok primarily uses the time spent watching each video to signal what users are most interested in. That weighting is clever in its simplicity. Interacting with something on any platform by liking it, re-sharing it, or commenting on it requires a deliberate effort, and it is often public. Those actions tell a recommendation algorithm what we are comfortable letting other people see we are interested in. But the amount of time we spend looking at something is a far more revealing measure of what actually captivates us.

Which is kind of creepy when you think about it.
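
To make that weighting concrete, here is a toy sketch in Python. TikTok’s actual signals and weights are not public, so everything here (the Engagement fields, the interest_score function, the numbers) is my own invention for illustration; the only grounded part is the Journal’s finding that watch time is the primary signal, outweighing deliberate, public actions like liking and sharing.

```python
from dataclasses import dataclass


@dataclass
class Engagement:
    """One user's interaction with one video. All fields are hypothetical."""
    watch_seconds: float  # passive signal: how long the user actually watched
    video_seconds: float  # length of the video
    liked: bool           # deliberate, public signals
    shared: bool
    commented: bool


def interest_score(e: Engagement) -> float:
    """Toy relevance score in which dwell time dominates explicit engagement.

    The weights are invented; the Journal's finding was only that watch time
    is the primary signal, not what the actual weights are.
    """
    completion = min(e.watch_seconds / e.video_seconds, 1.0)  # 0..1, rewatches capped
    explicit = 0.1 * e.liked + 0.1 * e.shared + 0.1 * e.commented
    return 0.7 * completion + explicit


# A video watched to the end but never liked outranks one that was liked
# and then immediately skipped: the passive signal wins.
watched = Engagement(watch_seconds=30, video_seconds=30, liked=False, shared=False, commented=False)
skipped = Engagement(watch_seconds=2, video_seconds=30, liked=True, shared=False, commented=False)
assert interest_score(watched) > interest_score(skipped)
```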

The fact that our base instincts are revealed by how long we rubberneck at the scene of a car accident will, unsurprisingly, create pathways to mesmerizing but ethically dubious videos. A Journal investigation last month found that demo user accounts registered as 13-to-15-year-olds were quickly directed to videos about drinking, drug use, and eating disorders, as well as videos from users who indicated their posts were for an adult audience only.

I get why this is alarming, but I have to wonder how different it is from past decades’ moral panics. Remember the vitriol directed at Marilyn Manson in the late 1990s for his music? Parents ought to have saved up that anger for now, when it really matters. Rap and hip-hop have been blamed for all kinds of youth wrongdoing, as have MTV, television, and the internet more broadly. Is there something different about hearing and seeing this stuff in video form instead of in song lyrics or on message boards?

I think this recent Media Matters study by Olivia Little and Abbie Richards is a better illustration of the social failures of TikTok’s recommendation engine:

After we interacted with anti-trans content, TikTok’s recommendation algorithm populated our FYP [For You Page] feed with more transphobic and homophobic videos, as well as other far-right, hateful, and violent content.

Exclusive interaction with anti-trans content spurred TikTok to recommend misogynistic content, racist and white supremacist content, anti-vaccine videos, antisemitic content, ableist narratives, conspiracy theories, hate symbols, and videos including general calls to violence.

That looks like a pathway to radicalization to me, especially for users in balkanized and politically fragile regions, or places with high levels of anxiety. That seems to describe much of the world right now.