AI and Electricity: A System Dynamics Approach - Explained (5/10) - "Abundance Without Boundaries" Scenario
Rémi Paccou
Sustainability Researcher | Energy System Analysis, Climate Change Mitigation, Sustainable AI/Digital, Data Centers & ICT | PhD Student at CIRED, Chair Prospective Modeling for Sustainable Development
Digital Feast: The AI Buffet in a World of Plenty
Welcome to the fifth installment of our series on "AI and Electricity Scenarios: A System Dynamics Approach".
Today, we'll explore the results of the "Abundance Without Boundaries" scenario, a perspective developed by the School of Thought of the Techno-Efficiency Optimists, who push for unlimited AI deployment across all sectors in the belief that technological advances will solve any resource constraint. This optimism elevates AI efficiency to a totem: it lowers computational costs but fuels uncontrolled rebounds in AI demand.
Modelling The Rebound Effect Mechanism
Our System Dynamics model embodies the Jevons Paradox through a rebound effect, which captures how efficiencies gained in AI can lead to lower operational costs, prompting increased usage and, consequently, greater total energy consumption. It presents a future of unbridled AI growth that pushes against planetary boundaries.
The inspiration from Jevons lies in his observation that as technological advancements make energy use more efficient, the overall consumption of energy can actually rise, rather than fall. This paradox highlights the complexity of addressing sustainability in an era of rapid technological progress.
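To make this mechanism concrete, here is a minimal sketch of a rebound loop in the spirit of our System Dynamics model (not the model itself): efficiency gains lower the cost per unit of compute, demand responds to cheaper compute, and total electricity use can still rise. The efficiency gain and price elasticity values below are illustrative assumptions.

```python
# Minimal rebound-effect (Jevons Paradox) sketch, not the full System Dynamics model.
# All parameter values are illustrative assumptions.

efficiency = 1.0          # useful compute delivered per kWh (index)
demand = 1.0              # AI compute demanded (index)
efficiency_gain = 0.50    # assumed 50% annual efficiency improvement
price_elasticity = 1.8    # assumed demand response to falling cost per unit of compute

for year in range(2025, 2031):
    energy = demand / efficiency                  # total electricity used
    print(f"{year}: energy index = {energy:.2f}")
    cost_drop = 1 - 1 / (1 + efficiency_gain)     # relative fall in cost per unit of compute
    demand *= 1 + price_elasticity * cost_drop    # cheaper compute -> more demand
    efficiency *= 1 + efficiency_gain             # efficiency keeps improving
```

With a high enough assumed elasticity, total energy use grows even though each unit of compute becomes cheaper in energy terms; with a low elasticity, the same loop yields genuine savings. That sensitivity is exactly what separates the scenarios in this series.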
AI Without Boundaries and the Second Law of Thermodynamics
This emerging school of thought regarding the AI-Technosphere's impact on economic and environmental systems presents a stark contrast to Nicholas Georgescu-Roegen's view on thermodynamic limitations, as outlined in "The Entropy Law and the Economic Process."
While the Techno-Efficiency Optimists envision a future of boundless growth and technological solutions, Georgescu-Roegen's work emphasizes the fundamental constraints imposed by thermodynamics on economic processes, highlighting the inherent limits to growth and technological development.
His thinking reveals that even highly efficient AI systems are subject to entropic constraints, potentially accelerating resource depletion and entropy production globally. Georgescu-Roegen's insights highlight the limitations of technological solutions in overcoming fundamental physical constraints, necessitating a more comprehensive approach to AI integration that aligns with ecological sustainability.
General Analysis of Electricity Consumption Trends (2025-2030)
In this scenario, total AI energy consumption is projected to rise substantially from 100 TWh in 2025 to 880 TWh by 2030, ultimately reaching a staggering 1,370 TWh in 2035. This reflects a robust expansion in AI capabilities and deployment, driven primarily by Generative AI.
Gen AI training is expected to experience substantial growth, increasing from 47 TWh in 2025 to 407 TWh in 2030. This surge highlights the intensive computational demands of training advanced generative models, necessitating significant infrastructure and resource investments.
Energy consumption for generative AI inferencing is also set to grow, from 15 TWh in 2025 to 416 TWh by 2030. This indicates a broadening application of generative models across industries, as they move from research to practical deployment.
Traditional AI training remains stable at 20 TWh over the period, while inferencing grows from 18 TWh to 37 TWh. These figures suggest incremental improvements and optimizations rather than major shifts in traditional AI practices.
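For readers who want to check the arithmetic, the sketch below simply restates the TWh figures quoted above and derives the implied compound annual growth rates; no new data is introduced.

```python
# Scenario electricity figures quoted above (TWh, 2025 -> 2030); CAGR derived from them.
scenario_2025_2030 = {
    "Gen AI training":          (47, 407),
    "Gen AI inference":         (15, 416),
    "Traditional AI training":  (20, 20),
    "Traditional AI inference": (18, 37),
    "Total AI":                 (100, 880),
}

years = 5  # 2025 -> 2030
for category, (start, end) in scenario_2025_2030.items():
    cagr = (end / start) ** (1 / years) - 1
    print(f"{category:<25} {start:>5} -> {end:>5} TWh  (CAGR {cagr:.0%})")
```

The components add up to the quoted totals (100 TWh in 2025, 880 TWh in 2030), with generative inference overtaking generative training toward the end of the period.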
Unrestricted abundance could fundamentally alter existing systems, constrain decarbonization trajectories and generate waste
Taking a broader perspective, these electricity trajectories carry, directly or indirectly, potential consequences. The risks include excessive infrastructure investment leading to stranded assets; the emergence of a trilemma of power concentration, supply chain vulnerabilities, and competition for critical resources, materials, and minerals; and a potential increase in AI-related electronic waste.
Gen AI training compute for Large Language Models grows by 5-6 times annually, while language training dataset sizes expand by 3.5 times yearly, exceeding 50 trillion tokens. In the Abundance without Boundaries scenario, computational performance improves dramatically, with overall performance increasing by 1.5 times annually. This translates to a 50% annual improvement in TFLOPs performance. Simultaneously, energy efficiency sees remarkable gains, with GFLOPs/Watt efficiency improving by 50% annually.
In the Abundance without Boundaries scenario, these remarkable improvements reflect the theme of unrestricted technological advancement and exponential growth in AI capabilities. Algorithmic efficiency in language models improves fourfold yearly under ambitious scaling, pushing the boundaries of computational efficiency. This could include widespread use of mixed-precision training with FP8, BF16, and even lower bit-width formats, alongside the emergence of AI-specific number systems optimized for energy efficiency at scale. Mounting resource pressure might further spur rapid innovation in ultra-low-precision formats and specialized hardware.
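A quick back-of-the-envelope check, using only the growth rates quoted above (with the midpoint of the 5-6x compute range taken as an assumption), shows why hardware efficiency gains alone cannot cap energy demand in this scenario.

```python
# Back-of-the-envelope check on the growth assumptions quoted above (illustrative only).
compute_growth = 5.5       # Gen AI training compute: ~5-6x per year (midpoint assumed)
hw_efficiency_gain = 1.5   # GFLOPs/Watt: ~50% improvement per year

# If nothing else changed, energy demand would still grow by compute / efficiency.
residual_growth = compute_growth / hw_efficiency_gain
print(f"Residual annual growth in training energy: x{residual_growth:.2f}")
# -> roughly x3.7 per year: hardware efficiency absorbs only part of the compute growth;
#    the rest must come from algorithmic gains, lower-precision formats, or more electricity.
```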
Gen AI inferencing capabilities expand exponentially, penetrating a wide range of sectors, including aerospace, defense, biotech, healthcare, and finance. Traditional AI experiences widespread adoption, further fueling the demand for computational resources. At the data center level, the development of room-temperature superconductors and advances in bio-computing drive progress as well.
Exogenously, while breakthroughs in quantum computing by 2029 offer substantial potential to decarbonize energy systems, full development and large-scale deployment will take time. Similarly, recent announcements have sparked renewed interest in nuclear energy as a potential way to power future AI abundance. However, while nuclear power offers a carbon-free energy source, it faces significant challenges, including high initial investment costs (ranging from $3,000 to $6,200 per kilowatt) and vulnerability to rising interest rates during lengthy construction periods. Consequently, in the Abundance without Boundaries scenario, AI's rising electricity demand may still rely on fossil fuels in the interim. This reliance could result in continued environmental impacts, including increased land use, water consumption, resource depletion, and waste.
These impacts may persist until quantum technologies are widely adopted and capable of significantly reducing carbon emissions, or until nuclear power can operate at scale. The system dynamics mechanism is characterized by a reinforcing rebound loop of investment-driven growth and infrastructure development, leading to rebound cycles of increasingly powerful AI and expanded data center capacity. It pushes the boundaries of societal norms and environmental limits, raising concerns about power centralization, resource depletion, and governance.
Oversized AI infrastructure is prone to risks of unsustainable operational costs and inefficient resource utilization
The Abundance without Boundaries scenario observes that the rapid and unrestrained development of AI systems risks fueling a constant race toward bigger and more powerful infrastructure, often outpacing the capacity for sustainable maintenance. A study by OpenAI estimated that the compute used in the largest AI training runs had been doubling every 3.4 months since 2012, far outpacing Moore's Law. This relentless growth raises concerns about the construction of increasingly massive data centers, many of which risk becoming obsolete, as exemplified by Meta's recent decision to scrap an outdated data center design. This issue, embedded in our modeling as a trigger for a Jevons Paradox risk, is reflected in simulation results projecting a significant increase in energy consumption from 100 TWh in 2025 to 880 TWh by 2030, which could accelerate the cycle of increased demand and resource depletion.
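For perspective, the cited doubling times translate into annual growth factors as follows (simple arithmetic, no additional assumptions).

```python
# Annualised growth implied by the cited doubling times.
def annual_factor(doubling_months: float) -> float:
    return 2 ** (12 / doubling_months)

print(f"Largest training runs (doubling every 3.4 months): x{annual_factor(3.4):.1f} per year")
print(f"Moore's Law (doubling roughly every 24 months):    x{annual_factor(24):.2f} per year")
```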
This potential rebound is also visible in the recent wave of nuclear announcements: while nuclear energy could help meet the electricity needs of Abundance without Boundaries, current debates often overlook the fact that these projects require substantial initial investments, estimated at $3,000 to $6,200 per kilowatt for new plants in the United States. At a discount rate of 10%, the median cost of nuclear energy can exceed that of natural gas and coal plants. This is largely due to the capital-intensive nature of nuclear power, where capital costs account for at least 60% of the levelized cost of electricity (LCOE). Moreover, high interest rates can significantly inflate these costs, particularly over lengthy construction periods during which no revenue is generated, leading to compounded interest expenses that can jeopardize project viability.
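To illustrate why the discount rate weighs so heavily on capital-intensive nuclear, here is a simplified levelized-cost sketch. The overnight cost range is the one quoted above; the lifetime, capacity factor, and fixed O&M values are illustrative assumptions, and fuel and variable costs are deliberately left out.

```python
# Simplified LCOE sketch for a capital-intensive plant (illustrative assumptions only).
def lcoe(overnight_cost_per_kw, discount_rate, lifetime_years=60,
         capacity_factor=0.90, fixed_om_per_kw_year=120.0):
    """Very rough $/MWh estimate: annualised capital plus fixed O&M, no fuel detail."""
    # Capital recovery factor spreads the upfront cost over the plant lifetime.
    crf = discount_rate * (1 + discount_rate) ** lifetime_years / \
          ((1 + discount_rate) ** lifetime_years - 1)
    annual_cost_per_kw = overnight_cost_per_kw * crf + fixed_om_per_kw_year
    mwh_per_kw_year = 8760 * capacity_factor / 1000
    return annual_cost_per_kw / mwh_per_kw_year

for rate in (0.03, 0.07, 0.10):
    low, high = lcoe(3000, rate), lcoe(6200, rate)
    print(f"discount rate {rate:.0%}: ~${low:.0f}-${high:.0f}/MWh (capital + fixed O&M only)")
```

Because the capital recovery factor is roughly proportional to the discount rate over a 60-year lifetime, financing conditions move nuclear LCOE far more than they move fuel-dominated gas or coal plants, which is the vulnerability described above.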
In contrast, countries like China and India have managed to achieve more competitive nuclear economics through ongoing construction experience and lower labor costs. This scenario highlights the risk that the rapid buildup of oversized infrastructure leaves behind a legacy of assets built to today's Gen AI standards, assets that may be outdated by 2030-era architectures such as world models and could effectively trap Gen AI within them. Such infrastructure, lacking long-term design or refurbishment considerations, may become unsustainable and burden future generations.
Insufficient governance and concentrated power could exacerbate AI access inequality
This scenario, driven primarily by extreme demand growth and exogenous factors like concentrated power and access to resources, could lead to increased AI-access inequality and requires robust governance structures to mitigate these risks. Over time, the concentration of power could transform a small number of tech giants and nations from pioneers into entrenched gatekeepers of the AI landscape. In 2024, the combined market value of the top tech companies - Apple, Microsoft, NVIDIA, Alphabet, Amazon, and Meta - raised concerns as it exceeded the GDP of most countries, reaching a staggering $15.2 trillion and surpassing the GDP of any single country except the United States and China.
Such economic power, particularly in the semiconductor industry, where key players like TSMC (market cap of $1.07 trillion on October 17, 2024), Broadcom (market cap of $857.70 billion as of November 8, 2024), and Samsung (market cap of $276 billion as of November 2024) are consolidating their hold on advanced chip manufacturing, may lead to issues over time, including reduced competition, higher prices, and uneven innovation. A further risk to maintaining resource-symbiotic AI development is the supply of critical minerals. China's dominance, producing 68% of the world's rare earth minerals, 77% of all graphite, and 97% of global anode output, could intensify the trilemma of power concentration, supply chain pressure, and competition for these critical resources.
This trilemma could widen the gap in AI access and capabilities, as organizations or nations with greater resources gain disproportionate access to AI development. This disparity is already evident in the distribution of generative AI resources, with Meta deploying at least 16,000 A100 GPUs in its Research Super Cluster, Tesla claiming 7,360 A100 GPUs, and the EU's Leonardo supercomputer utilizing 13,824 A100 GPUs. This concentration of computational resources in a few large corporations and national facilities highlights the potential for widening gaps in AI capabilities. Companies or states with access to cutting-edge AI tools might experience productivity gains of 30-40% in specific knowledge work tasks, while those without such access struggle to compete, creating a new form of digital divide that exacerbates economic and social disparities on a global scale. Consequently, this scenario underscores the importance of robust governance structures to manage unchecked development that could further entrench these inequalities.
Importantly, in this environment of unbounded AI abundance, certain key tipping points could trigger shifts between scenarios. These triggers, which involve feedback loops that confront practical boundaries - societal, physical, material, environmental, and so on - can introduce constraints that ultimately lead to the Limits to Growth scenario or, more extremely, the Energy Crisis scenario.
Unrestrained AI development can worsen the e-AI-Waste Dilemma by prioritizing performance over practicality
In the AI Abundance Without Boundaries scenario, the rapid development of Generative AI, if not effectively managed, could significantly worsen the global e-waste crisis. By 2030, Generative AI could generate between 1.2 million and 5 million metric tons of e-waste annually, primarily from high-performance computing hardware in data centers, including servers, GPUs, CPUs, memory modules, and storage devices. This would represent a potential 1,000-fold increase over current AI-related e-waste levels, exacerbating a global e-waste problem that already exceeds 60 million metric tons annually.

The primary drivers of this potential surge are the short lifespans of hardware - typically two to five years - and the rapid turnover of technology as companies strive to keep up with advances in AI capabilities. Within this scenario, with compute requirements for training large language models (LLMs) increasing 5-6 times annually and language training datasets expanding 3.5 times yearly, there is a risk of a relentless stream of discarded Gen AI-specific electronics. These electronics often contain hazardous materials such as lead, mercury, and chromium, which pose serious environmental and health risks if not disposed of properly.

While circular economy strategies - such as extending hardware lifespans and recycling components - might reduce e-waste generation by up to 86%, challenges would likely persist. Although we did not quantify the potential e-waste output in our modeling as part of indirect effects, we consider that AI efficiency alone will not naturally curb AI waste; it is a specific topic to address, notably by designing AI equipment to be ready for refurbishment.
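One simple way to reason about those orders of magnitude is a mass-balance sketch: the annual e-waste flow is roughly the installed hardware mass divided by its lifespan, minus whatever circular measures divert. All values below are illustrative assumptions, not outputs of our model or of the studies cited above.

```python
# Mass-balance sketch of AI-related e-waste (all values are illustrative assumptions).
installed_mass_tonnes = 1_000_000   # assumed mass of AI hardware in service by 2030
lifespan_years = 2.5                # short replacement cycle, within the 2-5 year range cited
circular_reduction = 0.86           # upper-bound circular-economy reduction cited above

business_as_usual = installed_mass_tonnes / lifespan_years
with_circularity = business_as_usual * (1 - circular_reduction)
print(f"Annual e-waste, business as usual:      ~{business_as_usual:,.0f} t/year")
print(f"Annual e-waste, with circular measures: ~{with_circularity:,.0f} t/year")
```

The point of the sketch is the sensitivity it exposes: the flow scales linearly with installed mass and inversely with lifespan, which is why lifespan extension and refurbishment-ready design are the main levers.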
Thank you for reading the "Abundance Without Boundaries" scenario. In the next episode, we will present the results of the "Energy Crisis" scenario, from the School of Thought of the Alarmists. It envisions a future where the rapid growth of AI triggers an unforeseen energy crisis, creating significant disruptions across industries and economies. Alarmists in this scenario anticipate and react to "black swan" events in AI energy consumption, highlighting the risks of unchecked AI expansion and inadequate planning. This scenario draws inspiration from Thomas Robert Malthus's theory of population growth and resource scarcity, which proposed that unchecked growth would lead to catastrophic collapse. In our model, we incorporate this concept through the "crunch" mechanism, representing a tipping point where AI's energy demand suddenly outstrips available supply, causing widespread disruptions. Like Malthus's prediction of a population crash due to resource scarcity, our crunch mechanism models how unrestrained AI growth could trigger an energy crisis affecting entire economies and societies.
Looking forward to sharing this soon!
The "Abundance Without Boundaries" scenario raises important concerns about AI's rapid growth, energy consumption, and e-waste. It's crucial to balance innovation with sustainability and social equity.