The Energy Footprint of A.I. ⇥ technologyreview.com
Casey Crownhart, MIT Technology Review:
Today, new analysis by MIT Technology Review provides an unprecedented and comprehensive look at how much energy the AI industry uses — down to a single query — to trace where its carbon footprint stands now, and where it’s headed, as AI barrels towards billions of daily users.
We spoke to two dozen experts measuring AI’s energy demands, evaluated different AI models and prompts, pored over hundreds of pages of projections and reports, and questioned top AI model makers about their plans. Ultimately, we found that the common understanding of AI’s energy consumption is full of holes.
This robust story comes on the heels of a series of other discussions about how much energy A.I. products and services use. Last month, for example, Andy Masley published a piece comparing the energy cost of using ChatGPT with that of other everyday activities. The Economist ran a similar comparison, and articles along these lines have appeared before. As far as I can tell, they all come down to the same general conclusion: training A.I. models is energy-intensive, using A.I. products is not, lots of things we do online and offline have a greater impact on the environment, and A.I.'s energy use today is the lowest it will ever be.
There are lots of good reasons to critique artificial intelligence. I am not sure its environmental impact is a particularly strong one; I think the true energy footprint of tech companies, of which A.I. is one part, is more relevant. Even more pressing, however, is our need to electrify our world as much as we can, and that will require a better and cleaner grid.