Facebook’s Design at Scale Means Every Decision Has Extraordinary Consequence ⇥ warzel.substack.com
I’ve come to believe that arguments weighing Facebook’s good and bad outcomes are probably a dead end. What seems rather indisputable is that, as currently designed (to optimize for scale, engagement, and profit), there is no way to tweak the platform that doesn’t ultimately make people miserable or destabilize large areas of culture and society. The platform is simply too big. Leave it alone and it turns into a dangerous cesspool; play around with the knobs and you risk inadvertently censoring people, or heaping world-historic amounts of attention onto people or movements you never anticipated, creating still more unintended consequences. If I have any shred of sympathy for the company, it’s that there don’t seem to be any great options.
I think there are plenty of overwrought claims about Facebook that aren’t really about Facebook at all and are mostly about scoring political points. It can feel performative when people say things like “Facebook is not compatible with democracy.” But I do believe that Facebook, at its current scale and in its current design, is not really compatible with humanity.
Working through the Wall Street Journal’s Facebook Files series this week has been an educational experience. The articles are chock-full of evidence from inside the company, effectively proving what has long been assumed externally: the company has the research and data showing the dangers of its platform, yet attempts to mitigate them are either shot down for profit reasons or, when implemented, cause unintended consequences that are just as bad. The world coalesced around Facebook’s properties as a primary communications channel, and we are worse for it, but we struggle to turn away.