The Environmental Impact of AI Quantified

In this issue, we delve into the critical impact that training and running AI models is having on the environment, and look at a tool developed by the team at Hugging Face to help people quantify the footprint of their AI tools.

You can subscribe to and support XR AI Spotlight right on Substack, with new issues appearing directly in your inbox, plus bonus issues only available to our supporters.


Subscribe to the newsletter and get a list of 200+ MR Apps

If you’d like to sponsor this newsletter and get your name in front of an engaged audience of professionals and creatives, just contact me on LinkedIn: Gabriele Romagnoli


Interview with Alex de Vries-Gao

How can we measure the environmental impact of AI?

Alex de Vries: It's not easy. At the bare minimum, measuring the environmental impact of AI requires data like energy consumption, which is already a challenge since AI companies aren't disclosing much. In my research, I looked at the supply chain of high-end AI servers. Even if you get a number, you can't confirm it or determine the environmental footprint without knowing where the physical servers are and what kind of power they use.


Image from a presentation by Sasha Luccioni (source below)

What makes AI so energy-intensive?

Alex de Vries: AI doesn't have to be energy-intensive inherently. It depends on the application you're building and how you design your model. However, in AI, bigger models are generally better because they perform more robustly and attract more users, creating an economic incentive to make larger models. This leads to higher energy consumption for both training and running these models, which is the fundamental problem from an environmental standpoint.


CO2 emissions from training various LLMs (source below)

Can you explain the difference between the energy used for training a model and running it?

Alex de Vries: Sure. Training a model is energy-intensive, but if no one uses it, the energy impact stops there. With ChatGPT, the situation changed because of its massive adoption. I estimated that training the GPT-3 model used around 1,000 to 1,500 megawatt-hours, but running ChatGPT costs OpenAI about 500 to 600 megawatt-hours a day. So, in just three days of operation, running the model consumes more energy than the entire training process.
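
A quick back-of-the-envelope check of these figures, taking the midpoints of the ranges Alex mentions (the variable names are just for illustration):

    training_mwh = 1250        # midpoint of the 1,000-1,500 MWh estimate for training GPT-3
    daily_inference_mwh = 550  # midpoint of the 500-600 MWh/day estimate for running ChatGPT

    days_to_match_training = training_mwh / daily_inference_mwh
    print(f"Inference matches the training energy after about {days_to_match_training:.1f} days")
    # roughly 2.3 days, which is why a few days of operation already exceed the entire training run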

Does the usage of ChatGPT include API calls from other tools?

Alex de Vries: My analysis focused on the servers that OpenAI runs in the background. It didn't include the energy consumption from devices users are on, like phones or desktops, which also add to the total energy use. The overall scale is likely larger if you consider these additional factors. For a complete picture, you'd need to include lifecycle costs, manufacturing energy, and energy used by all user devices.

How does the energy consumption of AI compare to other activities like Google searches?

Alex de Vries: A standard Google search uses about 0.3 watt-hours, while a ChatGPT interaction can use up to 6 watt-hours. To put that in perspective, 6 watt-hours is like keeping a low-lumen LED bulb on for an hour. If Google were to switch entirely to an AI-powered search engine like ChatGPT, the energy needed would be equivalent to the entire electricity consumption of Ireland, illustrating the massive scale of potential energy use.
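
To get a feel for how these per-query figures scale, here is a rough sketch; the daily search volume is an assumed round number used only for illustration, not a figure from the interview:

    wh_per_search = 0.3       # standard Google search, per Alex's figure
    wh_per_ai_query = 6.0     # ChatGPT-style interaction, per Alex's figure
    searches_per_day = 9e9    # assumed round figure for global Google searches per day

    print(f"An AI query uses roughly {wh_per_ai_query / wh_per_search:.0f}x a standard search")
    extra_twh_per_year = (wh_per_ai_query - wh_per_search) * searches_per_day * 365 / 1e12
    print(f"Replacing every search with an AI query would add ~{extra_twh_per_year:.0f} TWh per year")
    # on the order of 19 TWh per year extra, the kind of scale behind the Ireland comparison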

What about the environmental impact of other generative AI outputs, like images and videos?

Alex de Vries: Generative AI outputs like images and videos are also energy-intensive, though specific numbers are harder to come by. For example, generating a single AI image might be equivalent to charging your phone. With video, the scale increases dramatically since you're generating something like 24 frames per second. All of this adds up quickly and could have significant environmental impacts as these technologies become more widespread.

How does the future of AI energy consumption look?

Alex de Vries: The demand for AI servers is so high that just looking at the supply chain can give us an idea. In the next two to three years, we might see hundreds of thousands of these energy-hungry machines being produced. By 2027, AI servers alone could require as much energy as the entire Netherlands, adding another 50% to historical data center power demand. It's a significant concern that needs to be addressed.

What role do big tech companies play in AI's environmental impact?

Alex de Vries: Big tech companies like Google and OpenAI are aware of AI's energy requirements and its sustainability issues, but they're doing little to mitigate it. For instance, Alphabet's chairman mentioned AI in Google search would increase their energy consumption tenfold. While companies like NVIDIA claim hardware will become more efficient, this might just lead to more AI servers being used, exacerbating the problem. These companies need to prioritize responsible AI use.

What do you think big tech companies should do?

Alex de Vries: The first step would be to be more transparent about their energy consumption. Transparency is crucial for assessing and addressing AI's environmental impact. However, we're moving in the opposite direction. Companies are increasingly withholding data on their energy consumption due to competitive concerns. This lack of transparency makes it difficult for researchers and policymakers to understand and mitigate AI's environmental footprint. We need regulations to ensure these companies disclose relevant data.

What steps should we take to address AI's environmental impact?

Alex de Vries: We need better data to start. Policymakers must pay attention to this issue and possibly implement regulations to increase transparency. Once we have solid data, we can assess the true impact and take informed actions. It's also crucial for companies to apply AI responsibly, not forcing AI into every application but rather finding the best fit solutions for specific problems. Overhyping AI as a miracle solution isn't sustainable in the long run.

Check out the full interview right here.



Product Spotlight: CodeCarbon

CodeCarbon is a lightweight software package that seamlessly integrates into your Python codebase. It estimates the amount of carbon dioxide (CO2) produced by the cloud or personal computing resources used to execute the code.

It then shows developers how they can lessen emissions by optimizing their code or by hosting their cloud infrastructure in geographical regions that use renewable energy sources.
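
For a sense of how it slots into a project, here is a minimal sketch of the tracker pattern (train_model is a stand-in for whatever workload you want to measure; check the CodeCarbon documentation for the exact configuration options):

    from codecarbon import EmissionsTracker

    def train_model():
        # placeholder for the workload whose footprint you want to estimate
        sum(i * i for i in range(10_000_000))

    tracker = EmissionsTracker(project_name="my_experiment")
    tracker.start()
    try:
        train_model()
    finally:
        emissions_kg = tracker.stop()  # estimated emissions in kg of CO2-equivalent

    print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")

CodeCarbon also offers a decorator-style interface, so the same measurement can be attached to a single function without changing its body.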


If you want to dive deeper into the topic, I would also highly recommend this TED Talk from Dr. Sasha Luccioni. She is the AI and Climate Lead at Hugging Face and one of the minds behind CodeCarbon and other key initiatives to highlight and mitigate the environmental impact of AI.

That’s it for today, and don’t forget to subscribe to the newsletter if you find this interesting.

See you next week
