“Each of these Nvidia servers, they are power-hungry beasts.”
All That Power
AI chatbots like OpenAI’s ChatGPT and Google’s Bard consume an astronomical amount of electricity and water — or, more precisely, the massive data centers that power them do.
And according to the latest estimates, those energy demands are rapidly ballooning to epic proportions.
In a recent analysis published in the journal Joule, data scientist Alex de Vries at Vrije Universiteit Amsterdam in the Netherlands found that by 2027, these server farms could use anywhere from 85 to 134 terawatt-hours of electricity per year.
That’s roughly on par with the annual electricity use of Argentina, the Netherlands, or Sweden, as the New York Times points out, or about 0.5 percent of global electricity demand. Sound familiar? The much-lampooned crypto industry spiked past similar power consumption thresholds in recent years.
It’s a massive carbon footprint that experts say should force us to reconsider the huge investments being made in the AI space — not to mention the eye-wateringly resource-intensive way that tech giants like OpenAI and Google operate.
We Hunger
Pinning down an exact figure is difficult, since AI companies like OpenAI are secretive about their energy usage. De Vries settled on estimating their consumption by examining projected sales of Nvidia A100 servers, which make up an estimated 95 percent of the AI industry’s underlying infrastructure.
“Each of these Nvidia servers, they are power-hungry beasts,” de Vries told the NYT.
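For a rough sense of how a projection like that comes together, here’s a back-of-envelope sketch in Python. It assumes figures reported alongside de Vries’ analysis rather than his exact model: roughly 1.5 million Nvidia AI server units shipped per year by 2027, each drawing on the order of 6.5 kilowatts (comparable to a DGX A100 system at full tilt) and running around the clock.

```python
# Back-of-envelope estimate of annual AI server electricity use.
# Assumed inputs (figures reported around de Vries' analysis, not his exact model):
servers_per_year = 1_500_000   # projected annual Nvidia AI server shipments by 2027
power_per_server_kw = 6.5      # rough max draw of a DGX A100-class system, in kW
hours_per_year = 24 * 365      # assume the servers run continuously

# Energy = power x time, converted from kilowatt-hours to terawatt-hours.
annual_kwh = servers_per_year * power_per_server_kw * hours_per_year
annual_twh = annual_kwh / 1e9  # 1 TWh = 1 billion kWh

print(f"{annual_twh:.1f} TWh per year")  # ~85 TWh, the low end of the 85-134 TWh range
```

Under those assumptions the math lands near the low end of de Vries’ range; heavier utilization and supporting hardware push the estimate toward the upper bound.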
It’s a worrying trend that’s leading some experts to argue that we should take a step back and reconsider the industry’s breakneck pace.
“Maybe we need to ideally slow a bit down to start applying solutions that we have,” Roberto Verdecchia, an assistant professor at the University of Florence, told the newspaper. “Let’s not make a new model to improve only its accuracy and speed. But also, let’s take a big breath and look at how much are we burning in terms of environmental resources.”
Many companies operating in California in particular may face scrutiny sooner than you’d think. Over the weekend, California governor Gavin Newsom signed two major climate disclosure laws, forcing companies like OpenAI and Google, among roughly 10,000 other firms, to disclose their carbon emissions starting in 2026.
Even with increased scrutiny from regulators, the space is still largely governing itself, and AI companies will likely continue to burn through copious amounts of energy to keep their models going.
There is, however, a financial incentive to lower these costs through technological advances, given the current burn rate. And considering the massive environmental footprint, any breakthroughs can’t come soon enough.
More on AI chatbots: AI Chatbots Are Only Useful If You Think They Are, Scientists Find