Christopher Mims, Wall Street Journal:
Services are also attempting to reduce the content-moderation load by reducing the incentives or opportunity for bad behavior. Pinterest, for example, has from its earliest days minimized the size and significance of comments, says Ms. Chou, the former Pinterest engineer, in part by putting them in a smaller typeface and making them harder to find. This made comments less appealing to trolls and spammers, she adds.
The dating app Bumble only allows women to reach out to men. Flipping the script of a typical dating app has arguably made Bumble more welcoming for women, says Mr. Davis, of Spectrum Labs. Bumble has other features designed to pre-emptively reduce or eliminate harassment, says Chief Product Officer Miles Norris, including a “super block” feature that builds a comprehensive digital dossier on banned users. This means that if, for example, banned users attempt to create a new account with a fresh email address, they can be detected and blocked based on other identifying features.
No matter how effective platforms become at removing unwanted and inappropriate media, I will always prefer services and products designed to discourage the need for heavy moderation in the first place. It is unsurprising to me that the platforms taking this approach, highlighted here by Mims, are used by women more than, say, Twitter, Reddit, or YouTube. I have long harboured a pet theory that this is a positive feedback loop created, in part, by considering the negative ramifications of specific features. These platforms are certainly not perfect, but their more thoughtful feature design makes them less prone to misuse, which in turn makes them more appealing to women and others who are more likely to face abuse online. By contrast, platforms that deploy features without that kind of foresight quickly become overwhelmed with misuse, driving away some of those who tend to be on the receiving end of it.
Anyway, that is just a little speculation.