The Environment and GenAI
David Atkinson
AI Legal Counsel | A.I. Ethics and Law | University Lecturer | Veteran
This is Part 3 of our mini-series on GenAI’s effect on the environment and energy consumption.
Perspective
To evaluate the argument that LLMs harm the environment, one must also weigh LLMs’ energy usage against that of other everyday activities. According to one paper, “A round-trip flight from New York City to San Francisco emits about 1 tonne of CO2e per passenger. So the ~500 tonnes of CO2e required to train GPT-3 equates to the emissions of approximately two or three full round-trip flights from New York City to San Francisco.”[1]
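The comparison is straightforward arithmetic. Here is a minimal back-of-the-envelope sketch, assuming (the source does not specify this) that a Boeing 767 seats roughly 180–250 passengers:

```python
# Back-of-the-envelope check of the flight comparison cited above.
# Assumption (not from the source): a Boeing 767 seats roughly 180-250 passengers.
TRAINING_EMISSIONS_T = 500   # ~tonnes CO2e to train GPT-3, per the cited paper
PER_PASSENGER_T = 1          # ~tonnes CO2e per passenger, NYC-SF round trip
SEATS_LOW, SEATS_HIGH = 180, 250

passenger_equivalents = TRAINING_EMISSIONS_T / PER_PASSENGER_T   # ~500 passengers
flights_high = passenger_equivalents / SEATS_LOW                 # ~2.8 full flights
flights_low = passenger_equivalents / SEATS_HIGH                 # ~2.0 full flights

print(f"~{passenger_equivalents:.0f} passenger-equivalents, or roughly "
      f"{flights_low:.1f}-{flights_high:.1f} full round-trip flights")
```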
Additionally, compared with many other large corporations, tech giants have been more aggressive about procuring renewable energy. Many tech companies (like Google) have signed corporate power purchase agreements (CPPAs) to acquire green energy from wind and solar farms. In 2017, Google reached 100% renewable energy with this tactic, and other companies are following suit: Microsoft has targeted 100% renewable energy by 2025, and this domino effect has pressured still more tech companies to make similar commitments. “In 2020 alone, Google, Amazon, Facebook, Apple, and Microsoft procured 7.2 GW of renewable capacity, which is almost 30% of all CPPAs, or around 3.5% of all global renewable capacity additions.”[2]
Furthermore, we can consider LLMs in relation to other technologies. LLMs use less energy than the cryptocurrency market, and they have the potential to “accelerate scientific discovery or alter the work of white-collar professionals worldwide.”[3] Bitcoin mining, so far, does not offer the same benefits to society. Everyday activities like using a water kettle, an electric oven, a TV, a Google search, email, messaging, and video streaming all consume significant amounts of energy as well. Perhaps instead of worrying about LLMs’ environmental impact, we should focus on the broader global goal of switching most or all energy-consuming activities to renewable energy.
Unfortunately, the largest models are often used for trivial tasks. For all their power, LLMs are frequently deployed for simple daily tasks, much as a kettle or electric oven handles mundane chores. Continuing the authors' earlier airplane analogy, they state: “instead of two or three full Boeing 767s flying round-trip from New York to San Francisco, current provision of consumer LLMs may be more like a Boeing 767 carrying one passenger at a time on that same journey.”[4] Most current consumer interactions with LLMs are unimportant to society or are simple queries that could be handled by a smaller, less environmentally impactful model or a search engine.
Another consideration is that if LLMs continue to scale and gain popularity, as many have predicted, they may very well become a significant consumer of energy. In 2018, the original GPT had roughly 100 million parameters; a few years later, Google's PaLM model had 540 billion.[5] Many LLMs are getting bigger every year, and there are more of them. As models increase in size and number, so do their training times and the number of GPUs required to serve inference, which means more energy is consumed.
Specialized models can be made more accessible to the AI community so that expert teams can fine-tune them as needed instead of building new models from scratch. Plus, dedicating LLMs to specific categories of tasks could allow engineers to preserve accuracy for the intended use while reducing the negative impact on the environment.
Such an approach may seem to “reverse” the accelerated progress made in AI; the patterns we see in this industry rest on the belief that adding more parameters is the key to better overall performance. However, this hypothesis is unproven. As pioneering AI researcher Rich Sutton states: “The bitter lesson is based on the historical observations that 1) AI researchers have often tried to build knowledge into their agents, 2) this always helps in the short term, and is personally satisfying to the researcher, but 3) in the long run it plateaus and even inhibits further progress, and 4) breakthrough progress eventually arrives by an opposing approach based on scaling computation by search and learning.”[6]
Tech May Be The Answer
There are some possible ways to reduce the environmental impact and carbon footprint of LLMs, such as improving algorithmic efficiency,[7] developing low-power analog AI chips,[8] pursuing breakthroughs in fusion power,[9] building smaller, more efficient models like Microsoft's Orca 2,[10] applying AI to climate modeling,[11] using AI to fight climate change more broadly,[12] and even, as Jeff Bezos has proposed, moving heavy, polluting industry off Earth.[13]
From Scientific American
WattTime, an organization that monitors electricity-related emissions, is one of the founding partners of Climate TRACE, a nonprofit dedicated to tracking global pollution sources. Climate TRACE uses computer vision and machine learning to detect emissions from monitored facilities. It then uses satellite imagery to visually pinpoint the emission-causing activities, such as steam plumes from factories. Engineers train algorithms on this data, enabling the programs to estimate emissions from visual input alone. The collected emissions data helps corporations reduce their carbon footprint, informs policymakers, and makes it easier to hold polluters accountable for their environmental impact.
There are real opportunities to use AI to reduce carbon emissions and support clean-energy technologies. AI can be crucial for integrating renewable energy sources such as wind and solar into the electric grid. However, much depends on the transparency of the companies involved. Businesses are already using AI to work more efficiently, but an essential task is raising awareness among AI users of how, and how much, they are using it.[15]
Tech May Not Be the Answer
The same libertarian, amoral stance that many in tech circles espouse when discussing AI regulation, misinformation, disinformation, nonconsensual deepfakes, and more also applies to energy and the environment.
Scientific American provides a few examples of how an amoral stance on how AI is used could just as easily accelerate harm to the environment as help it:
Take the fossil-fuel industry. In 2019 Microsoft announced a new partnership with ExxonMobil and stated that the company would use Microsoft’s cloud-computing platform Azure. The oil giant claimed that by using the technology—which relies on AI for certain tasks such as performance analysis—it could optimize mining operations and, by 2025, increase production by 50,000 oil-equivalent barrels per day. (An oil-equivalent barrel is a term used to compare different fuel sources—it’s a unit roughly equal to the energy produced by burning one barrel of crude oil.) In this case, Microsoft’s AI is directly used to add more fossil fuels, which will release greenhouse gases when burned, to the market.[16]
When asked for comment, Microsoft’s spokesperson embraced the amoral approach, stating that “the company sells its technology and cloud services to ‘all customers, inclusive of energy customers.’”
The Atlantic Exposé
More recently, The Atlantic published an article that also dug into Microsoft. The journalist reviewed several internal documents showing that Microsoft actively and aggressively sought lucrative business relationships with fossil fuel companies to help those companies maximize oil and gas extraction. The agreements were valued at up to $75 billion.
An example from the article notes that “In March 2021, for example, Microsoft expanded its partnership with Schlumberger, an oil-technology company, to develop and launch an AI-enhanced service on Microsoft’s Azure platform. Azure provides cloud computing to a variety of organizations, but this product was tailor-made for the oil and gas industries, to assist in the production of fossil fuels, among other uses.”
One Microsoft employee who attempted to stem Microsoft’s support for fossil fuel extraction used the example of “a single 2019 deal with ExxonMobil that could purportedly ‘expand production by as much as 50,000 oil-equivalent barrels a day by 2025,’ according to a Microsoft press release. Those extra barrels would produce an estimated 6.4 million metric tons of emissions, drastically outweighing a carbon-removal pledge that Microsoft made in 2020, she wrote.”
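As a rough consistency check on those figures (assuming the 6.4-million-ton estimate is annual, which the article does not state), the implied emissions factor can be back-calculated from the numbers quoted above:

```python
# Consistency check on the figures quoted above, assuming the 6.4 million
# metric tons is an annual estimate (the article does not state the period).
EXTRA_BARRELS_PER_DAY = 50_000
ESTIMATED_EMISSIONS_T = 6_400_000   # metric tons CO2e, per the employee's estimate

barrels_per_year = EXTRA_BARRELS_PER_DAY * 365
implied_factor = ESTIMATED_EMISSIONS_T / barrels_per_year   # tonnes per barrel

print(f"{barrels_per_year:,} extra barrels per year, implying "
      f"~{implied_factor:.2f} t CO2e per oil-equivalent barrel")
```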
The article highlights Microsoft’s unwillingness to adapt its “Responsible AI” principles to address the issue. “Around this time, Microsoft instead released a new set of principles governing the company’s engagements with oil and gas customers. It was co-authored by Darryl Willis, the corporate vice president of Microsoft’s energy division (and a former BP executive who served as BP’s de facto spokesperson during the Deepwater Horizon crisis).”
But the current financial incentives may be too much for Microsoft to ignore. “Lucas Joppa, Microsoft’s first chief environmental officer, who left the company in 2022, fears that the world will not be able to reverse the current trajectory of AI development even if the technology is shown to have a net-negative impact on sustainability. Companies are designing specialized chips and data centers just for advanced generative-AI models. Microsoft is reportedly planning a $100 billion supercomputer to support the next generations of OpenAI’s technologies; it could require as much energy annually as 4 million American homes. Abandoning all of this would be like the U.S. outlawing cars after designing its entire highway system around them.”
Source: https://www.theatlantic.com/technology/archive/2024/09/microsoft-ai-oil-contracts/679804/
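For scale, here is a rough conversion of that “as much energy annually as 4 million American homes” figure, assuming an average U.S. home uses on the order of 10,500 kWh per year (an EIA-style ballpark, not a number from the article):

```python
# Rough scale check: what "as much energy annually as 4 million American homes" implies.
# Assumption (not from the article): ~10,500 kWh per U.S. home per year (EIA ballpark).
HOMES = 4_000_000
KWH_PER_HOME_PER_YEAR = 10_500

annual_twh = HOMES * KWH_PER_HOME_PER_YEAR / 1e9   # kWh -> TWh per year
average_gw = annual_twh * 1e3 / 8_760              # TWh/yr -> average GW of demand

print(f"~{annual_twh:.0f} TWh per year, or an average draw of ~{average_gw:.1f} GW")
```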
Another example proffered pertains to AI-supported advertising:
When an eerily specific ad pops up on your Instagram or Facebook news feed, advertising algorithms are the wizard behind the curtain. This practice boosts overall consumptive behavior in society, Rolnick says. For instance, with fast-fashion advertising, targeted ads push a steady rotation of cheap, mass-produced clothes to consumers, who buy the outfits only to replace them as soon as a new trend arrives. That creates a higher demand for fast-fashion companies, and already the fashion industry is collectively estimated to produce up to eight percent of global emissions. Fast fashion produces yet more emissions from shipping and causes more discarded clothes to pile up in landfills. Meta, the parent company of Instagram and Facebook, did not respond to Scientific American’s request for comment.[17]
The theme here, as woven throughout this document, is that AI does not inherently equal a better world. There must be intentionality around its use: Who is it designed for? Who’s allowed to use it? For what purpose?
Data Centers
The environmental impact of the data centers that enable AI also falls unevenly on rural and urban communities. Data centers tend to be noisy when the cooling fans engage to decrease the temperature of the servers. While an individual server does not generate much noise, when a data center packs hundreds or thousands of servers into a compact space, the noise, combined with the HVAC system, can reach 96 dB(A). Sounds at 85 decibels or higher can damage hearing, especially over consistent, long-term exposure; one takeaway from data center operations is therefore an elevated risk of noise pollution.
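Because decibels are logarithmic, the gap between the 85 dB(A) damage threshold and the 96 dB(A) level cited above is larger than the raw numbers suggest; a quick sketch:

```python
# Decibels are logarithmic: every 10 dB is a 10x increase in sound intensity.
DAMAGE_THRESHOLD_DB = 85   # level at which sustained exposure can harm hearing
DATA_CENTER_DB = 96        # level cited for dense server halls plus HVAC noise

intensity_ratio = 10 ** ((DATA_CENTER_DB - DAMAGE_THRESHOLD_DB) / 10)
print(f"96 dB(A) carries ~{intensity_ratio:.1f}x the sound intensity of 85 dB(A)")
```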
In terms of water, rural towns bear the brunt of data centers’ environmental impact because those facilities are “out of sight, out of mind.” A large data center supporting generative AI can consume between 1 million and 5 million gallons of water a day.
Not only are data centers large infrastructure projects that require substantial construction material, but they also consume electricity on the scale of a small town, employ only a small number of local residents, and use millions of gallons of water.
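To put those water figures in context, here is a rough comparison, assuming (this is not from the source) roughly 80–100 gallons of residential water use per person per day:

```python
# Rough comparison of a data center's daily water draw to residential use.
# Assumption (not from the source): ~80-100 gallons per person per day at home.
GALLONS_PER_DAY_LOW, GALLONS_PER_DAY_HIGH = 1_000_000, 5_000_000
PER_PERSON_LOW, PER_PERSON_HIGH = 80, 100

people_low = GALLONS_PER_DAY_LOW / PER_PERSON_HIGH    # ~10,000 people
people_high = GALLONS_PER_DAY_HIGH / PER_PERSON_LOW   # ~62,500 people

print(f"Equivalent to the home water use of roughly "
      f"{people_low:,.0f} to {people_high:,.0f} people")
```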
There are some important implications to consider when balancing the utility of data centers against their impact on rural communities. For example, to what degree should local elected officials be held accountable for the damage they have permitted, relative to the tech companies' own responsibility?
Yet even with projections from Britain that data centers will consume nearly 6% of all UK electricity by the end of this decade, advances in efficiency may lessen the impact. Microsoft, for example, says data center workloads increased roughly ninefold between 2010 and 2020 while electricity use increased by only about 10%.
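Taking those figures at face value, the implied efficiency gain is simple to compute:

```python
# Implied efficiency gain from the figures cited above (2010-2020):
# workloads grew ~9x while electricity use grew only ~10%.
WORKLOAD_GROWTH = 9.0
ELECTRICITY_GROWTH = 1.10

work_per_unit_energy = WORKLOAD_GROWTH / ELECTRICITY_GROWTH
print(f"~{work_per_unit_energy:.1f}x more work per unit of electricity")
```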
One historical truth remains constant: rural communities and the American West are once again the site of a modern gold rush. Maybe this time we can engage in more responsible practices from the start.
[1] https://www.cutter.com/article/environmental-impact-large-language-models
[2] https://www.cutter.com/article/environmental-impact-large-language-models
[3] https://www.cutter.com/article/environmental-impact-large-language-models
[4] https://www.cutter.com/article/environmental-impact-large-language-models
[5] https://arstechnica.com/gadgets/2023/04/generative-ai-is-cool-but-lets-not-forget-its-human-and-environmental-costs/
[6] https://www.incompleteideas.net/IncIdeas/BitterLesson.html
[7] https://openai.com/research/ai-and-efficiency
[8] https://research.ibm.com/blog/analog-ai-chip-low-power
[9] https://www.reuters.com/business/energy/us-scientists-repeat-fusion-power-breakthrough-ft-2023-08-06/
[10] https://www.unite.ai/why-microsofts-orca-2-ai-model-marks-a-significant-stride-in-sustainable-ai/
[11] https://allenai.org/climate-modeling
[12] https://www.forbes.com/sites/markminevich/2022/07/08/how-to-fight-climate-change-using-ai/?sh=55911a92a838
[13] https://www.cbsnews.com/news/jeff-bezos-space-heavy-industry-polluting-industry/
[14] https://www.nature.com/articles/s43588-023-00459-6
[15] https://www.scientificamerican.com/article/ais-climate-impact-goes-beyond-its-emissions/
[16] https://www.scientificamerican.com/article/ais-climate-impact-goes-beyond-its-emissions/
[17] https://www.scientificamerican.com/article/ais-climate-impact-goes-beyond-its-emissions/
The following students from the University of Texas at Austin contributed to the editing and writing of the content of LEAI: Carter E. Moxley, Brian Villamar, Ananya Venkataramaiah, Parth Mehta, Lou Kahn, Vishal Rachpaudi, Chibudom Okereke, Isaac Lerma, Colton Clements, Catalina Mollai, Thaddeus Kvietok, Maria Carmona, Mikayla Francisco, Aaliyah Mcfarlin