Pixel Envy

Written by Nick Heer.

Tools Don’t Solve the Web’s Problems

Peter-Paul Koch:

The web definitely has a speed problem due to over-design and the junkyard of tools people feel they have to include on every single web page. However, I don’t agree that the web has an inherent slowness. The articles for the new Facebook feature will be sent over exactly the same connection as web pages. However, the web versions of the articles have an extra layer of cruft attached to them, and that’s what makes the web slow to load. The speed problem is not inherent to the web; it’s a consequence of what passes for modern web development. Remove the cruft and we can compete again.

Oh, yes, please.

This happens on the client side through the inclusion of JavaScript frameworks, external plugins, analytics scripts,1 giant images, and so forth; each of these can require its own DNS lookup, a download, and parsing or rendering time. The cruft exists on the server side, too, in the form of related-content widgets and similarly extraneous database lookups. It gets worse: this cruft crept in alongside the rise of the responsive web, which means all of this crap gets served over your metered cellular connection.
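To make that concrete, here's a sketch of what the head of a cruft-laden page often looks like. The hostnames and filenames are made up for illustration, but the pattern is familiar: every distinct third-party host costs a DNS lookup and a new connection, every file costs a download, and synchronous scripts delay rendering.

```html
<!-- Each third-party host below costs a DNS lookup and a connection; -->
<!-- each file costs a download; each synchronous script blocks rendering. -->
<!-- (Hostnames and filenames are hypothetical.) -->
<script src="https://cdn.framework-example.com/framework.min.js"></script>
<script src="https://plugins.widget-example.net/carousel.plugin.js"></script>
<script src="https://js.tracker-example.com/analytics.js"></script>
<img src="https://images.cdn-example.org/hero-4000px.jpg" alt="Hero image">
```

Multiply this by a handful of trackers and ad networks and the page's weight has little to do with its actual content.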

Somewhere in my Pinboard,2 I have a series of links to Stack Overflow threads where someone asks a question solvable with basic CSS, yet the top-ranked answer involves a jQuery plugin or two and a custom script. It’s atrocious.
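A hover-revealed dropdown is a classic example of this pattern. The markup and class names below are hypothetical, not taken from any particular thread, but the shape of the answer is recognizable: the jQuery version needs a library download plus a script, while the CSS version needs no extra requests at all.

```html
<!-- The typical jQuery answer, requiring the library plus a script: -->
<!--   $('.menu-item').hover(
         function () { $(this).find('.dropdown').show(); },
         function () { $(this).find('.dropdown').hide(); }
       ); -->
<!-- The same behaviour in a few lines of plain CSS: -->
<style>
  .menu-item .dropdown { display: none; }
  .menu-item:hover .dropdown { display: block; }
</style>
<ul>
  <li class="menu-item">
    Products
    <div class="dropdown">Dropdown links go here</div>
  </li>
</ul>
```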

But this cruft keeps creeping in because typical web connections are, broadly speaking, getting faster, so it has somehow become okay in some minds to send ever-increasing amounts of data. Actual speed, rather than a reliance upon client-side caching, no longer seems to be a priority. And that’s why the web is slow: not because Facebook is doing anything that special, but because few people put in the effort to make it fast.


  1. If you have Ghostery installed, you know just how many tracker scripts are on so many websites. ↩︎

  2. I need to start using tags again. ↩︎