Natasha Lomas, TechCrunch:
Thing is, no one is asking Facebook for perfection, Mark. We’re looking for signs that you and your company have a moral compass. Because the opposite appears to be true. (Or as one UK parliamentarian put it to your CTO last year: “I remain to be convinced that your company has integrity”.)
Facebook has scaled to such an unprecedented, global size exactly because it has no editorial values. And you say again now you want to be all things to all men. Put another way that means there’s a moral vacuum sucking away at your platform’s core; a supermassive ethical black hole that scales ad dollars by the billions because you won’t tie the kind of process knots necessary to treat humans like people, not pairs of eyeballs.
You don’t design against negative consequences or to pro-actively avoid terrible impacts — you let stuff happen and then send in the ‘trust & safety’ team once the damage has been done.
You might call designing against negative consequences a ‘growth bottleneck’; others would say it’s having a conscience.
I don’t think it makes sense to make Facebook legally culpable for everything its users post on its platform. There’s a good reason Section 230 was created, and I think FOSTA is unproductive and ultimately harmful. But while I don’t think fines and legal penalties should rain down on the company for jackass users’ contributions, I do think Facebook should be held morally responsible. It is clear to me that Facebook was so blinded by its own massive growth that it either couldn’t or didn’t want to handle community health problems, and instead focused on putting out PR fires. Zuckerberg’s Journal op-ed is another exercise in public relations — you can tell it was largely written by lawyers and communications personnel rather than by his own hand — and not anything truly meaningful.