Develop Sustainable AI

This is what we hear every day...

  • AI models will continue to grow in parameter count, from billions to trillions, demanding ever-higher memory capacity and the ability to move data as fast as possible
  • It will take an enormous amount of power to train such large AI models and to cool the computing machines (GPUs, FPGAs, etc.)
  • Bring data close to compute, or take compute close to data: which is the better architecture for AI going forward?
  • Already-choked network infrastructure will pay a huge toll if moving data closer to compute turns out to be the better architecture
  • The demand for, and cost of, running AI clusters rises sharply when DCs run at higher utilization and elevated temperatures
  • Environmental regulations are already pushing DCs toward better PUE (Power Usage Effectiveness), which will require billions of dollars of investment in power reduction and cooling technologies
  • Taken together, these pressures drive up the AI enablement cost per user
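The PUE metric mentioned above is the standard efficiency figure regulators look at: total facility power divided by IT equipment power, where 1.0 is the theoretical ideal. A minimal sketch of the calculation; the megawatt figures below are illustrative assumptions, not numbers from this article:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by
    IT equipment power. 1.0 is ideal; everything above it is
    cooling, power conversion, and other facility overhead."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Illustrative figures: a 10 MW facility whose IT gear draws
# 6.5 MW spends the remaining 3.5 MW on cooling and overhead.
print(round(pue(10_000, 6_500), 2))  # 1.54
```

The gap between a facility's PUE and 1.0 is exactly the overhead that the cooling investments discussed above are meant to shrink.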

Chart sources:

  • Chart by Clarissa Garcia, from her blog: https://www.akcp.com/blog/the-real-amount-of-energy-a-data-center-use/
  • Chart by Rubah Usman, from her blog: https://www.mckinsey.com/industries/technology-media-and-telecommunications/our-insights/investing-in-the-rising-data-center-economy
  • https://www.digitalinformationworld.com/2023/07/the-energy-crunch-ai-data-centers-and.html

The million-dollar question: how sustainable is AI? Can we ever meet its insatiable demand for energy and cooling? Let's try to answer some of these questions.

  • No, AI models will not continue to grow exponentially forever. Fundamental limitations will kick in, and the industry will start slicing large models into AI agents: much like microservices, each agent will perform a specific function, and multiple agents will work together to deliver a solution. In other words, we will see smaller, more sustainable model deployments across the industry
  • Hybrid architectures will both bring data close to compute and take compute closer to data; call it compute decentralization. We will see more and more AI models moving away from the central cloud to on-premise or local infrastructure
  • AI will start shaping hardware choices beyond memory. Innovation in the memory space is coming, but storage will have to take some of the load, delivering the data volume and velocity needed to feed hungry AI machines
  • We cannot move data packets faster than the speed of light, so already-choked networks will focus on optical hardware more than ever before, offering multiples of today's bandwidth while consuming less power
  • Semiconductors will be designed for specialized computing. Not all data is the same, and not all processing is the same: AI needs a variety of data cleansing, preparation, and staging steps before models can process it for better predictions
  • Different forums are discussing alternative energy sources for DCs, such as hydrogen power and modular nuclear energy units
  • Both the direct (power and cooling) and indirect (climate change) costs of AI have already triggered, or will trigger, broader and deeper ROI calculations on AI implementations across industry segments
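The ROI point above ultimately rests on energy arithmetic. A hedged back-of-envelope sketch of training energy follows, using the widely cited ~6 FLOPs per parameter per token heuristic; the efficiency and PUE defaults, and the model and token sizes in the example, are assumptions for illustration only, not figures from this article:

```python
def training_energy_mwh(params: float, tokens: float,
                        flops_per_joule: float = 3e11,
                        pue: float = 1.5) -> float:
    """Rough training-energy estimate in megawatt-hours.

    Uses the common ~6 FLOPs per parameter per token heuristic for
    one training pass, divides by assumed accelerator efficiency
    (delivered FLOPs per joule, including utilization losses), then
    scales by facility PUE to account for cooling and overhead.
    """
    total_flops = 6.0 * params * tokens
    joules = total_flops / flops_per_joule * pue
    return joules / 3.6e9  # 1 MWh = 3.6e9 joules

# Illustrative only: a 70B-parameter model trained on 2T tokens
# lands on the order of ~1,000 MWh under these assumptions.
estimate = training_energy_mwh(70e9, 2e12)
```

Even a crude estimator like this makes the trade-offs in the bullets above concrete: halving model size, improving accelerator efficiency, or lowering PUE each cuts the energy bill proportionally.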

Thanks to Mayank Mohan, Ganesh Guruswamy, Joe Nash, and Adam Black.


More articles by Rohit Gupta

  • Are RISC processors getting traction for data center use cases?
  • Data Pipeline for AI
  • Data Platforms Fueling AI Clusters
  • Data Platforms Gaining Strength
  • Keeping GPUs Busy is a Tough Problem to Solve
  • Artificial Intelligence Needs Data Platforms
  • If Yahoo was for sale, who should buy it?
  • So what is machine learning?
