An AI-generated image of a glowing server farm illustrates how the new AI computers will require far more electricity to operate than their predecessors. (Source: Adobe Stock)

Researchers Warn that AI Electricity Requirements Could Increase Dramatically

One of the major issues with AI, and one that is largely being overlooked, is how to power all of these new bots and platforms. AI is booming in markets all around the world.

It seems as if every day we learn of a new way AI is being used in almost every profession. From predicting the weather to answering the phone when you order pizza, AI is reaching even the smallest companies. And all of it runs on electricity, which is becoming a big problem.

The analysis discussed in this article was conducted by Alex de Vries, a data scientist at De Nederlandsche Bank (the Dutch central bank) and a Ph.D. candidate at Vrije Universiteit Amsterdam, where he studies the energy costs of emerging technologies.

Scientific American conducted an interesting interview with the data scientist, and we have shared parts of that interview below. First, de Vries discusses the sheer amount of power required.

More Power Required

Every online interaction accesses information stored in remote servers that gobble up electricity. Data centers use 1 to 1.5% of global electricity, according to the International Energy Agency. AI could consume far more.

“Researchers have been raising general alarms about AI’s hefty energy requirements over the past few months. But a peer-reviewed analysis published this week in Joule is one of the first to quantify the demand that is quickly materializing. A continuation of the current trends in AI capacity and adoption is set to lead to NVIDIA shipping 1.5 million AI server units per year by 2027. These 1.5 million servers, running at full capacity, would consume at least 85.4 terawatt-hours of electricity annually—more than what many small countries use in a year, according to the new assessment.”
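The 85.4 terawatt-hour figure can be sanity-checked with some back-of-the-envelope arithmetic. The per-server power draw below is an assumption on our part (roughly a DGX A100-class system running at full load); the 1.5-million-server count comes from the analysis quoted above.

```python
# Rough check of the 85.4 TWh/year estimate.
# Assumption: ~6.5 kW full-load draw per AI server (approximate for a
# DGX A100-class system; not a figure from the article itself).
servers = 1_500_000
kw_per_server = 6.5
hours_per_year = 24 * 365        # 8,760 hours

annual_kwh = servers * kw_per_server * hours_per_year
annual_twh = annual_kwh / 1e9    # 1 TWh = 1e9 kWh
print(f"{annual_twh:.1f} TWh/year")  # → 85.4 TWh/year
```

Running at full capacity around the clock, those assumptions land almost exactly on the published number, which suggests a similar server class underlies the analysis.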

Keeping AI Cool

The energy needed for everyday computing is already immense, and that doesn't yet include the cooling required to run larger AI platforms. It is here that de Vries finds too little data to support firm estimates. A big unknown is where AI servers will end up, and that matters a great deal: if they land at an efficient operator such as Google, the additional cooling energy may add only around 10 percent.

“But global data centers, on average, will add 50 percent to the energy cost just to keep the machines cool. There are data centers that perform even worse than that.”
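The two cooling overheads mentioned (about 10 percent at an efficient operator, about 50 percent for the global average, often expressed as a power usage effectiveness, or PUE, of roughly 1.1 versus 1.5) change the total picture considerably. A minimal sketch, applied to the servers-only estimate above:

```python
# Total consumption = IT load × (1 + cooling overhead).
# Overheads are the figures from the interview; the 85.4 TWh IT load
# is the servers-only estimate from the analysis.
it_load_twh = 85.4

for label, overhead in [("efficient (Google-like)", 0.10),
                        ("global average", 0.50)]:
    total = it_load_twh * (1 + overhead)
    print(f"{label}: {total:.1f} TWh/year")
# → efficient (Google-like): 93.9 TWh/year
# → global average: 128.1 TWh/year
```

In other words, where the servers are sited could swing total demand by tens of terawatt-hours per year before a single extra server ships.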

In the end, de Vries says the availability of electricity will be the guiding force in AI's growth: you simply can't run AI without the power to feed it. According to de Vries:

“What do I think is the most likely path going forward? I think the answer is that there’s going to be a growth in AI-related electricity consumption. At least initially, it’s going to be somewhat slow. But there’s the possibility that it accelerates in a couple of years as server production increases. Knowing this gives us some time to think about what we’re doing.”

This issue doesn't even take into account how cryptocurrencies are already gobbling up electricity at alarming rates. The big question remains: where will we find the power for AI's explosive growth?

read more at scientificamerican.com