Twitter’s Abuse Problem Comes Down to a Failure of Leadership and a Reliance on Algorithms ⇥ techcrunch.com
Natasha Lomas, TechCrunch:
Twitter has clearly not fixed the problem of abuse on its platform — and very clearly also continues to fail to fix the problem of abuse on its platform.
Leaning on algorithms to do this vital work appears to be a large part of this failure.
But not listening to the users who are being abused is an even greater — and more telling — lapse of leadership.
There’s an enormous disconnect between what tech companies feel compelled to restrict and what users feel is worth restricting. The New York Times illustrates this today with an interactive feature about what Facebook considers hate speech worthy of removal. The second phrase — “Poor black people should still sit at the back of the bus.” — would likely not be considered hate speech on its own by Facebook’s standards:
While Facebook’s training document lists any call for segregation as an unacceptable attack, subsets of protected groups do not receive the same protection, according to the document. While race is a protected category, social class is not, so attacks targeting “poor black people” would not seem to qualify as hate speech under those rules, Ms. Citron said. That is because including social class in the attack negates the protection granted based on race.
As of right now, 93% of over 60,000 Times readers think that statement constitutes hate speech, and I think most reasonable people would agree: the historical connotations of forcing black people to sit at the back of a bus far outweigh the income status of the subject. Surely there’s enough context within that single phrase to establish that it’s driven by race, right?
But this is the thing: tech companies are generally run by people who are not subjected to abuse or targeted hate speech on their platforms. It would be prudent of them to take seriously the concerns raised by affected users. This is also another reason why executive teams need to comprise more diverse perspectives: as far more eloquent writers have pointed out, not doing so creates a huge blind spot.
Tech companies need to mature to a point where they recognize the responsibility they have to the billions of people on this planet, because that’s the scale they operate at now.