AI's energy footprint: Myth vs. Reality
Rahul Bhushan
Managing Director, Europe at ARK Invest (Co-Founder of Rize ETF) | Thematic and Sustainable Investing Strategist | Author | Angel Investor | Father of two
As AI becomes more integrated into modern society, concerns about its energy consumption are sparking widespread debate. Greenpeace have warned that AI’s growing energy demands could hinder efforts to combat climate change.[1] Similarly, Goldman Sachs Research has raised concerns about the environmental impact of AI’s energy use.[2] However, a closer examination reveals a more nuanced relationship between AI, energy efficiency and decarbonisation. In this essay, I explore the complexities of AI’s energy consumption, debunk common myths and highlight advancements in energy efficiency.
AI’s Insatiable Appetite
AI is often criticised for its significant energy consumption, particularly by the data centres that support its operations. At the end of last year, a research paper in the MIT Sloan Management Review reported that a single ChatGPT query produces 100 times the carbon emissions of a typical Google search.[3] According to the International Energy Agency, global energy demand from data centres, cryptocurrency and AI, collectively, should double by 2026.[4] One forecast from the University of Pennsylvania’s School of Engineering and Applied Science sees computing technology consuming 8% to 21% of global energy by 2030, with data centres accounting for a third of that usage.[5]
The U.S. Department of Homeland Security has formed an AI Corps to advise on AI’s use within the federal government[6], while the Department of Energy is examining data centre energy needs.[7] Climate activists have proposed more drastic measures like carbon taxes to curb AI’s energy usage, emphasising the need to conserve energy to lower greenhouse gas emissions.[8]
Real World AI Applications
Despite alarming projections, current data suggest that AI’s energy consumption might be less severe. Guy Berger from the Burning Glass Institute points out that AI applications are not yet widespread.[9] Finale Doshi-Velez, a professor at Harvard University, notes in a recent report that while AI has made significant strides in specific tasks, its mainstream adoption remains limited due to the lack of ethical and regulatory oversight.[10] Even the World Economic Forum highlights findings from Stanford University’s AI Index, which indicate that although there is substantial investment in AI, practical applications are still lagging.[11]
For example, only 2.5% of U.S. businesses use AI for marketing automation and 1.9% for virtual agents.[12] Additionally, large language models are currently only employed by a mere 1% of businesses.[13]
Moreover, estimates of AI’s energy consumption are frequently based on projections rather than concrete data, which can lead to significant uncertainty. For instance, Goldman Sachs Research forecasts a 500% increase in AI-related power demand in the UK over the next decade,[14] while U.S. data centres are projected to account for 8% of the nation’s total electricity consumption by 2030, up from 3% in 2022.[15] Reports from Boston Consulting Group and Rystad Energy predict that data centres’ energy use will rise to 307 TWh by 2030.[16]
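To see what those percentage shares imply in absolute terms, a rough back-of-envelope calculation helps. Note that the ~4,000 TWh figure for annual U.S. electricity consumption is my own approximation for illustration, not a number from the sources cited above:

```python
# Back-of-envelope: implied U.S. data-centre electricity demand from the
# shares cited above (3% in 2022, projected 8% by 2030).
# ASSUMPTION: total U.S. electricity consumption of roughly 4,000 TWh/year,
# held flat for simplicity.
US_TOTAL_TWH = 4_000

share_2022 = 0.03  # data centres' share of U.S. consumption in 2022
share_2030 = 0.08  # projected share by 2030

demand_2022 = US_TOTAL_TWH * share_2022  # implied ~120 TWh
demand_2030 = US_TOTAL_TWH * share_2030  # implied ~320 TWh

print(f"Implied 2022 demand: {demand_2022:.0f} TWh")
print(f"Implied 2030 demand: {demand_2030:.0f} TWh")
print(f"Growth factor: {demand_2030 / demand_2022:.1f}x")
```

Even on these simplified assumptions, the implied 2030 figure lands in the same 300–400 TWh range as the independent forecasts cited in this essay, which is a useful sanity check on the projections.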
Is History Any Guide?
To put this into perspective, we can draw parallels to past technological predictions that did not play out as expected. During the early 2000s, there were widespread concerns about the internet’s energy consumption. Analysts predicted that the internet’s power usage would soar exponentially, leading to unsustainable increases in energy demand.[17] For example, a study in the late 1990s estimated that the internet could consume up to 8% of U.S. electricity by 2005.[18] However, these predictions vastly overestimated actual energy consumption, as technological advancements in energy efficiency mitigated the projected demand.[19], [20]
These historical examples illustrate how projections can often be speculative and subject to significant revision. Hence, while current estimates about AI’s energy consumption highlight potential trends, they should be viewed with caution and an understanding of their inherent uncertainties.
The Reality of Efficiency Gains
But, the story doesn’t end there. Significant efficiency gains are already being made. Energy intensity in data centres (energy use per computation), for example, has decreased by 20% annually since 2010.[21] Nvidia, the world’s leading GPU designer, has developed its new Blackwell chip, which uses 25 times less energy than its predecessor.[22] Additionally, major advancements have been made in data centre design and management. For instance, large data centres are now achieving significantly better Power Usage Effectiveness (PUE) metrics.[23]
To put this in context, the average PUE of data centres was around 2.0 in 2007, meaning that for every 2 units of energy consumed by the facility, only 1 unit was used for computing and the rest for cooling and overhead.[24] Today, industry leaders like Google and Amazon Web Services report PUEs of 1.2 or lower at some sites (a PUE of 1.2 means that for every 1.2 units of energy consumed, 1 unit is used for computing and only 0.2 units are used for cooling and other overhead).[25]
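The PUE arithmetic above can be made concrete with a short sketch. The helper functions below are illustrative, not any standard library API; the only inputs are the 2.0 and 1.2 PUE figures cited in this essay:

```python
def pue(total_facility_energy: float, it_energy: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by the
    energy used for computing (IT load). 1.0 is the theoretical ideal,
    meaning every unit of energy goes to computation."""
    if it_energy <= 0:
        raise ValueError("IT energy must be positive")
    return total_facility_energy / it_energy

def overhead_share(pue_value: float) -> float:
    """Fraction of total facility energy spent on cooling and other
    overhead, implied by a given PUE: (PUE - 1) / PUE."""
    return (pue_value - 1) / pue_value

# 2007-era facility: 2 units in, 1 unit to computing -> PUE of 2.0,
# i.e. half of all energy is overhead.
print(pue(2.0, 1.0))                   # 2.0
print(overhead_share(2.0))             # 0.5

# Modern hyperscale facility: PUE of 1.2 -> only ~17% overhead.
print(round(overhead_share(1.2), 3))   # 0.167
```

In other words, moving from a PUE of 2.0 to 1.2 cuts the overhead share of a facility’s energy from 50% to roughly 17%, independent of how much computing the facility does.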
Based on our research, we expect several strategies, from more efficient chip designs to smarter cooling and data centre management, to further enhance data centre energy efficiency in the years ahead.
Comparative Energy Demands
Interestingly, EVs may soon match or exceed AI’s energy demand. The Princeton REPEAT model estimates U.S. electricity demand for EVs at 391 TWh by 2030, comparable to the 320-390 TWh projected for data centres.[26] Despite this, EVs are widely promoted (as opposed to demonised) due to their role in reducing transportation emissions, highlighting a disparity in how new technologies are perceived (and understood).
Meeting Future Energy Needs
To address AI’s energy demands, therefore, technology companies must explore diverse energy sources. Microsoft has partnered with Constellation Energy to supply nuclear power to its data centres,[27] while Google has teamed up with Fervo Energy for geothermal power.[28] Amazon Web Services recently purchased nuclear-powered data centre campus Cumulus from Talen Energy,[29] and Meta already supports 100% of its data centres with renewable energy (achieved through significant investments in wind and solar, making Meta one of the largest corporate buyers of renewable energy globally, with data centres designed for high efficiency, water restoration, and LEED Gold certification).[30] Hydropower is also being considered to meet the increasing electricity needs of data centres.
Innovation over Apprehension
Rather than fearing AI’s energy impact, the focus should be on proactive solutions. By improving energy efficiency and diversifying clean energy sources, we can balance AI’s benefits with its energy demands. The discourse must shift from apprehension to innovation, ensuring that AI contributes positively to a high-energy, sustainable future.