Pixel Envy

Written by Nick Heer.

Notes From an Emergency

Maciej Cegłowski, in an infinitely quotable transcript of a talk he gave at the re:publica conference in Berlin:

The danger facing us is not Orwell, but Huxley. The combo of data collection and machine learning is too good at catering to human nature, seducing us and appealing to our worst instincts. We have to put controls on it. The algorithms are amoral; to make them behave morally will require active intervention.

The second thing we need is accountability. I don’t mean that I want Mark Zuckerberg’s head on a pike, though I certainly wouldn’t throw it out of my hotel room if I found it there. I mean some mechanism for people whose lives are being brought online to have a say in that process, and an honest debate about its tradeoffs.

Cegłowski points out, quite rightly, that the data-addicted tech industry is unlikely to effectively self-regulate to accommodate these two needs. These companies are too deeply invested in tracking and data collection, and their lack of ethics has worked too well for them financially.

Cegłowski, again:

But real problems are messy. Tech culture prefers to solve harder, more abstract problems that haven’t been sullied by contact with reality. So they worry about how to give Mars an earth-like climate, rather than how to give Earth an earth-like climate. They debate how to make a morally benevolent God-like AI, rather than figuring out how to put ethical guard rails around the more pedestrian AI they are introducing into every area of people’s lives.

The tech industry enjoys tearing down flawed institutions, but refuses to put work into mending them. Their runaway apparatus of surveillance and manipulation earns them a fortune while damaging everything it touches. And all they can think about is the cool toys they’ll get to spend the profits on.

The message that’s not getting through to Silicon Valley is one that your mother taught you when you were two: you don’t get to play with the new toys until you clean up the mess you made.

I don’t see any advantage to having a regulated web. I do see advantages to having regulated web companies.

All of us need to start asking hard questions of ourselves — both as users, and as participants in this industry. I don’t think users are well-informed enough to be able to make decisions about how their data gets used. Even if they read through the privacy policies of every website they ever visited, I doubt they’d have enough information to be able to decide whether their data is being used safely, nor do I think they would have any idea about how to control that. I also don’t think many tech companies are forthcoming about how, exactly, users’ data is interpreted, shared, and protected.

Update: If you — understandably — prefer to watch Cegłowski speak, a video of this talk has been uploaded to YouTube. Thanks to Felix for sending me the link.