The Real Energy Requirements of Artificial Intelligence

There is a huge problem with generative AI that is not being discussed nearly enough. It isn't data security or robots taking our jobs; it is energy consumption. Put bluntly, AI is an energy hog.

For instance, GPT-3, not even the most current version of OpenAI's popular generative AI model, uses 10 times more energy to run a query than a typical Google search does. It makes me want to rethink how I use AI, and for what.

Rene Haas, CEO of Arm Holdings, a British semiconductor and software design company, recently came out and made a statement about data center energy consumption that most people would find shocking.

He said, “by the end of the decade, AI data centers could consume as much as 20% to 25% of U.S. power requirements. Today that’s probably 4% or less.”

How does that align with your sustainability program?

To learn more about why AI requires so much energy, read or listen to The Surging Problem of AI Energy Consumption on Art of Procurement.


Putting AI Energy Consumption Into Context

As I mentioned, it takes 10 times as much energy to run a search in GPT-3 as it takes to run a similar search on Google. How does that compare to other energy-conscious sensibilities?

I'm a NASCAR fan, so let's go there.

People are always critical of the fuel consumption associated with stock car racing. The cars get somewhere around 5 miles per gallon at race speed and 14-18 miles per gallon under caution. A street-legal version of the same model Ford used in NASCAR gets about 20 mpg. That's only a 4x difference. To reach the 10x gap reported for generative AI, the difference would have to be 5 mpg versus 50 mpg.
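The arithmetic behind that comparison can be checked in a few lines. The mpg figures are the approximate ones cited above; the 10x ratio is the reported GPT-3-vs-Google figure:

```python
# Quick sanity check on the ratios in the NASCAR comparison above.
race_mpg = 5      # stock car at race speed (approximate)
street_mpg = 20   # street-legal version of the same model (approximate)

nascar_ratio = street_mpg / race_mpg
print(f"NASCAR vs. street car: {nascar_ratio:.0f}x")  # 4x

ai_ratio = 10     # reported energy gap: GPT-3 query vs. Google search
equivalent_mpg = race_mpg * ai_ratio
print(f"mpg needed to match AI's gap: {equivalent_mpg:.0f} mpg")  # 50 mpg
```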

Alex Hernandez-Garcia, an AI expert who works as a researcher at Mila, an AI institute in Montreal, Canada, has studied the carbon footprint of different AI models. In 2023, he wrote a paper estimating the climate impact of training the large language model (LLM) GPT-3 at roughly the equivalent of 450 commercial airplane flights between London and New York City.

(Side note: I’m no AI expert, but as a procurement professional I’m pretty good with units of measure. There are a lot of comparisons used to help contextualize how much energy AI consumes - but they are almost all measured in commercial airline flights. That unit of measure is not a great sign when you’re also concerned about sustainability.)

Pressure from China has everyone on pins and needles about efficient AI development. Rahm Emanuel, former Obama chief of staff and current U.S. Ambassador to Japan, has spoken about how important it is for the U.S. and Japan to work together to drive innovation in AI and quantum computing in order to counter the threat from China.

Of course, he’s also famous for saying “You never let a serious crisis go to waste.”


Solving the Problem at Scale

If my recent conversations with providers, analysts, and decision makers are any indication, turning away from AI probably isn't realistic, not even to reduce energy demand. Fortunately, many ideas have been offered up as ways to solve the problem.


Save energy elsewhere - with AI’s help

Some have suggested that AI could solve its own problem, finding ways to offset its power requirements by increasing efficiency elsewhere, starting with real estate.

Real estate is a huge source of wasted energy. According to the International Energy Agency, residential and commercial buildings account for 30 percent of global energy use, and on average one third of that goes to waste.

U.S. commercial occupancy rates are at about half of what they were pre-COVID, based on employee swipe-card data collected by Kastle Systems. Have the power-consumption settings in those buildings been updated accordingly? Maybe not.

And don't worry, "I'm from the government and I'm here to help," to quote Ronald Reagan.

Regulations have established surcharges for buildings that release more than the allotted amount of carbon through their operation. One example of this is New York City’s Local Law 97, which took effect in January 2024.

BUT…

To be on track for the IEA's net-zero strategy, the energy consumed by real estate must be reduced by 25 percent by 2030. Unfortunately, that consumption is currently growing by 1 percent annually. So... this is unlikely to be the answer we seek.
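The size of that gap is easy to project. A rough sketch, assuming a 2024 baseline and steady 1 percent annual growth (the baseline index and year count are illustrative assumptions, not IEA figures):

```python
# Project building energy use at ~1% annual growth against the IEA's
# net-zero target of a 25% reduction by 2030.
baseline = 100.0   # index of today's consumption (illustrative)
growth = 0.01      # ~1% annual growth cited above
years = 6          # e.g. 2024 -> 2030

projected = baseline * (1 + growth) ** years
target = baseline * (1 - 0.25)

print(f"projected 2030 index: {projected:.1f}")  # ~106.2
print(f"net-zero target index: {target:.1f}")    # 75.0
print(f"gap: {projected - target:.1f} index points")
```

In other words, on the current trajectory real estate ends 2030 roughly 30 index points above where the net-zero path says it needs to be.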

Fortunately, there are plenty of other suggestions as well.

Moderated consumption: Are you switching out technologies and processes that work perfectly well for ones that involve AI, just to be 'cool'?

Climate alignment: Perhaps thinking about where server farms are located (from a climate perspective) could reduce the amount of cooling (and therefore energy) required by data centers.

Data quality: Higher quality data reduces the energy consumption associated with AI training and model building (but we all know how elusive higher quality data is…)

Regulated disclosure: Alex de Vries, a data scientist at the central bank of the Netherlands and a Ph.D. candidate at Vrije Universiteit Amsterdam, has suggested, “Maybe regulators should start requiring energy use disclosures from AI developers.”


In my opinion, some of the best solutions optimize usage in order to match power consumption with the objective of the effort. Generic AI models use more energy, while targeted models, such as those designed for a specific use case, consume less.

There is an AI model called CLOVER that figures out what a user is trying to do and selects a 'right-sized' model for the task. The tradeoff is that the accuracy of the results drops by 2-4 percent, but greenhouse gas emissions are reduced by 75 percent.
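The routing idea itself is simple: send each request to the cheapest model that can handle it. A toy sketch follows; this is not CLOVER's actual implementation, and the task labels, model registry, and energy figures are made up for illustration:

```python
# Toy 'right-sized' model routing: pick the cheapest model whose
# capability set covers the requested task. The registry below is
# hypothetical; energy values are relative units, not measurements.
MODELS = {
    "small":  {"energy": 1,  "tasks": {"classify", "extract"}},
    "medium": {"energy": 5,  "tasks": {"classify", "extract", "summarize"}},
    "large":  {"energy": 25, "tasks": {"classify", "extract", "summarize", "reason"}},
}

def route(task: str) -> str:
    """Return the name of the lowest-energy model that handles the task."""
    candidates = [(spec["energy"], name) for name, spec in MODELS.items()
                  if task in spec["tasks"]]
    if not candidates:
        return "large"  # unknown task: fall back to the most capable model
    return min(candidates)[1]

print(route("classify"))   # small
print(route("summarize"))  # medium
print(route("reason"))     # large
```

The accuracy/energy tradeoff the article describes lives in how aggressively the router judges a small model "adequate" for a given task.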

Reusing language and image models for training can reduce energy consumption, as can timing the rollout of a model for wide usage. Inference sessions use less energy over time, so the point at which an AI model is released for general use can affect its total lifetime energy consumption. If the model is more refined before it is released to a large number of users, it will be more efficient overall.

Perhaps the wildest idea of all came out of the same halls as ChatGPT itself. Sam Altman, CEO of OpenAI, believes nuclear fusion is the answer. That is the reaction at the center of the sun. According to a CNN article on the subject, nuclear fusion is decades away, and efficiency gains are more likely to be absorbed by increased demand than to reduce overall energy consumption.


I will say this… I actually feel guilty about experimenting with ChatGPT when it first came out.

  • Q: What would Art of Procurement podcast shownotes sound like if the cast from Monty Python wrote them?
  • A: Wicked funny. But worth it? I'm not so sure.

We can't see electricity, and I think that is part of the problem. You don't watch a tank empty, so you never feel the need to refill it. But there are plenty of competing uses for electricity; consider, for instance, the mandated conversion of passenger cars and tractor-trailers to electric power. Where will we be then? I'm pretty sure nobody knows.

Just remember that it takes nearly 10 times as much energy to run an Internet search via ChatGPT as via Google. Consumers wield a lot of power… and change can start with us.

Explore other editions of the Art of Supply newsletter here.
