Apple Tells Parler to Moderate Better or Lose Its App by Tomorrow ⇥ buzzfeednews.com
Ryan Mac and John Paczkowski, BuzzFeed News:
In an email sent this morning and obtained by BuzzFeed News, Apple wrote to Parler’s executives that there had been complaints that the service had been used to plan and coordinate the storming of the US Capitol by President Donald Trump’s supporters on Wednesday. The insurrection left five people dead, including a police officer.
“We have received numerous complaints regarding objectionable content in your Parler service, accusations that the Parler app was used to plan, coordinate, and facilitate the illegal activities in Washington D.C. on January 6, 2021 that led (among other things) to loss of life, numerous injuries, and the destruction of property,” Apple wrote to Parler. “The app also appears to continue to be used to plan and facilitate yet further illegal and dangerous activities.”
Apple gave Parler a day from when it sent its letter to submit a new version of the app alongside a moderation policy. Google did not wait; it pulled the app from the Play Store this afternoon.
From Apple’s letter, as quoted in the article:
Your CEO was quoted recently saying “But I don’t feel responsible for any of this and neither should the platform, considering we’re a neutral town square that just adheres to the law.” We want to be clear that Parler is in fact responsible for all the user generated content present on your service and for ensuring that this content meets App Store requirements for the safety and protection of our users. We won’t distribute apps that present dangerous and harmful content.
For what it is worth, it will still be possible to post to Parler from its website even if these apps are removed. It is not as though Parler will cease to exist on the iPhone after tomorrow when, inevitably, the ostensibly unmoderated platform fails to produce a tighter moderation strategy.
This clearly relates to questions about whether it is fair that users’ native software choices on the iPhone are limited by Apple’s control over the platform and its only software distribution mechanism. It seems reasonable to me that Apple would choose not to provide a platform for apps that have little to no moderation in place. Both Apple and Google disallowed clients for Gab — Twitter but for explicit Nazis — in their respective stores. Apple rejected the app at submission time, while Google permitted it and then pulled it:
Google explained the removal in an e-mail to Ars. “In order to be on the Play Store, social networking apps need to demonstrate a sufficient level of moderation, including for content that encourages violence and advocates hate against groups of people,” the statement read. “This is a long-standing rule and clearly stated in our developer policies. Developers always have the opportunity to appeal a suspension and may have their apps reinstated if they’ve addressed the policy violations and are compliant with our Developer Program Policies.”
Gab now runs on Mastodon, a decentralized standard that allows different communities to moderate posts as they choose. There are many Mastodon clients in the App Store, likely because there is no singular Mastodon product so much as many communities sharing posts through a common format.