AI is already consuming more energy than all EVs!

Introduction

Someone recently suggested that AI, even just the training of large language models (LLMs), is consuming more energy than all electric cars combined. This claim surprised me, so I ran a few numbers to check it.

The maths

Let's contrast the EV situation with data centers at a macro level, knowing that the US has seen one of the largest EV deployments so far and that the heaviest use of data centers for AI has been driven by US companies. All in all, what is happening in the US seems to be a fair proxy for what may end up happening in the rest of the world, a typical pattern with new technologies...

1/ EV energy footprint in the US alone

In the US alone, roughly 3.5 million battery-electric vehicles were on the road in 2023 (a little over 1% of the active car fleet, with EVs representing about 7% of new car sales and a ~20% anticipated CAGR).

The energy needed per EV for average use is estimated at around 3,000-4,000 kWh/year. Multiplying by the number of cars, the total comes to roughly 10-14 terawatt-hours (TWh), i.e. 10-14 billion kWh, per year.

Given that an average nuclear plant generates ~8 TWh per year, you can picture that one or two nuclear plants are needed to sustain the US EV fleet alone.
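
A minimal back-of-envelope sketch in Python, using the assumed figures above (fleet size, per-EV consumption, and per-plant output are rough estimates, not measured data):

# Rough US EV fleet energy, 2023 (all values assumed from the text above)
ev_fleet = 3.5e6                               # battery-electric vehicles on US roads
kwh_per_ev_year = 3500                         # midpoint of the 3,000-4,000 kWh/year estimate
fleet_twh = ev_fleet * kwh_per_ev_year / 1e9   # kWh -> TWh
nuclear_plant_twh = 8                          # assumed annual output of an average plant
print(f"EV fleet: ~{fleet_twh:.0f} TWh/year, "
      f"or ~{fleet_twh / nuclear_plant_twh:.1f} average nuclear plants")
# -> EV fleet: ~12 TWh/year, or ~1.5 average nuclear plants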

To give a few reference points, the US operates ~50 nuclear power plants, contributing about 20% of the country's electricity, while the annual electricity needs of a city like New York are around 50 TWh.

Assuming a 20% annual growth rate in EV penetration, by 2030 the US EV fleet would consume roughly the equivalent of a city like New York, i.e. ~50 TWh out of the ~2,000 TWh of US electricity implied above (50 plants x 8 TWh / 20%), or 2-3% of the electricity consumed in the US.
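
A quick compounding sketch under the same assumptions (the 20% CAGR from the 2023 fleet and 3,500 kWh per EV are carried over from above):

# Project the 2023 fleet to 2030 at the assumed 20% CAGR
fleet_2023 = 3.5e6
fleet_2030 = fleet_2023 * 1.20 ** 7            # seven years of 20% growth
ev_twh_2030 = fleet_2030 * 3500 / 1e9          # kWh -> TWh
us_electricity_twh = 50 * 8 / 0.20             # implied by the nuclear figures above
print(f"{fleet_2030 / 1e6:.1f}M EVs, ~{ev_twh_2030:.0f} TWh, "
      f"~{100 * ev_twh_2030 / us_electricity_twh:.1f}% of US electricity")
# -> 12.5M EVs, ~44 TWh, ~2.2% of US electricity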

2/ AI and its impact on data centers

Now, public information suggests that the training of OpenAI's GPT-3 model (the model behind the original ChatGPT) alone required about 1,295 MWh, or roughly 1.3 GWh, of energy.

Knowing that a high-end GPU draws around 400 watts (compared to roughly 15 W for a cell phone while charging), and that it takes tens of thousands of such processors running for a few months to produce a new LLM version, this number isn't surprising, and it keeps growing as models get larger (GPT-4 reportedly has on the order of 2 trillion parameters to tune, roughly a 10X increase over GPT-3).
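
To see why the numbers climb so quickly, here is a hedged conversion from GPU count and run time to energy; the GPU count and duration are illustrative assumptions, not disclosed figures for any specific model:

# Illustrative training-run energy: GPUs x power x time (all values assumed)
gpus = 10_000                                    # order of magnitude quoted for large LLM runs
watts_per_gpu = 400
days = 90                                        # "a few months"
run_gwh = gpus * watts_per_gpu * days * 24 / 1e9 # watt-hours -> GWh
print(f"~{run_gwh:.0f} GWh for one such run")    # -> ~9 GWh, several times the GPT-3 figure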

Using simple assumptions, say the next generation of GPT-x uses 0.01 TWh (10 GWh) of GPU energy for its final training run (roughly 10x the GPT-3 figure above), then add the cost of cooling all the electronics involved (some say up to 30% on top), plus all the "trial" runs and R&D "exploration" for new models before the multi-month "training" session is launched (an effort that can span 2-3 years), and one can easily see that 0.2-0.5 TWh per new generation of LLMs is indeed in the realm of possibility.
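
A minimal sketch of that estimate; the cooling overhead comes from the text, while the trial-run multiplier is an assumption chosen so the total lands in the quoted 0.2-0.5 TWh range:

# Per-generation energy estimate (multipliers are assumptions, not measured data)
final_run_twh = 0.01                             # assumed GPU energy of the final training run
cooling_overhead = 1.30                          # up to ~30% extra for cooling
trial_runs_factor = 20                           # assumed R&D / exploration multiplier over 2-3 years
per_generation_twh = final_run_twh * cooling_overhead * trial_runs_factor
print(f"~{per_generation_twh:.2f} TWh per LLM generation")   # -> ~0.26 TWh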

Multiply that by the number of large LLM companies (a dozen today), consider that they train multiple large LLMs concurrently, and remember that once a model is trained, you still need energy to run inference instances of it (ChatGPT receives around 1.8 billion visits per month, each involving multiple queries).
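
On the inference side, an order-of-magnitude sketch; queries per visit and energy per query are assumptions (a few watt-hours per query is an often-quoted rough estimate, not a measured figure):

# Rough annual inference load for one popular chatbot (assumed figures)
visits_per_month = 1.8e9
queries_per_visit = 5                            # assumption
wh_per_query = 3                                 # often-quoted rough estimate
inference_twh = visits_per_month * queries_per_visit * wh_per_query * 12 / 1e12
print(f"~{inference_twh:.1f} TWh/year just for inference")   # -> ~0.3 TWh/year

With these assumptions, inference for a single popular service already lands in the same ballpark as training a new model generation.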

So yes, it is already very likely that AI consumes more energy than the burgeoning electrification of the car industry.

Some reports even suggest that the aggregate worldwide demand to run data centers, bitcoin, and AI is in the 400 TWh range.

Conclusion

Given the above context, it's no wonder that energy optimization in data centers is the subject of so much research. Add to that the shortage of AI-optimized processors, and you understand why so much investment is heading toward a new generation of more energy-efficient chipsets and better AI algorithms.

As for EVs, it's also encouraging to see that battery technologies keep evolving, with energy density, and therefore range, steadily increasing. The laws of physics also allow a car to recover energy as it decelerates (regenerative braking), which is a huge advantage for these machines.
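
As a quick illustration of the regenerative-braking point (vehicle mass, speed, and recovery efficiency below are assumed, illustrative values):

# Energy recoverable when a car brakes from highway speed (assumed values)
mass_kg = 2000
speed_ms = 100 / 3.6                             # 100 km/h in m/s
regen_efficiency = 0.6                           # assumed round-trip recovery efficiency
kinetic_j = 0.5 * mass_kg * speed_ms ** 2        # kinetic energy at speed
recovered_kwh = kinetic_j * regen_efficiency / 3.6e6   # J -> kWh
print(f"~{recovered_kwh:.2f} kWh recovered per stop from 100 km/h")   # -> ~0.13 kWh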

A summary of all my articles can be found here.

#energy #EVs #AI #datacenters


As always, feel free to contact me @ [email protected] if you have comments or questions about this article. Please note that my articles reflect personal opinions on various topics and are always based on publicly available data.

More at www.lohier.com and also my book. You can subscribe to this free bi-weekly newsletter here and access former editions here.

