How to reduce impact of AI on Global Warming?


We all know global warming is real and threatening our present and future.


But have you ever wondered how AI is impacting global warming?

How does AI impact global warming?

To build any effective AI solution, we need

a) tons of data

b) features/feature vectors from the tons of data

c) GPU/TPU processing power to crunch the features

d) training and validation

e) operationalization with champion and challenger models

As an example, as published by Gartner, the number of features has been increasing exponentially over time. GPT-3, one of the most heavily used models, has 175 billion parameters!


As a result, AI solutions are becoming power-hungry.

For perspective, an adult human brain can process some 56 billion features while drawing about 12 watts of power; GPT-3, with its 175 billion parameters, consumes around 3,000,000 watts.

Despite most data centers and cloud providers turning to green energy, AI alone is projected to consume 3.5% of the world's electricity by 2030.

What can we software professionals do to turn the tide?

Blockchain solutions were once scoffed at by climatologists too, but that community has been steadily working on its energy problem. Last week, 'The Merge' happened successfully.

The Merge refers to the original Ethereum Mainnet merging with a separate proof-of-stake blockchain called the Beacon Chain, now existing as one chain. The Merge reduced Ethereum's energy consumption by ~99.95%.

In my opinion, this is phenomenally inspiring, and AI practitioners should take the cue.

Let me double-click into a few suggestions:

  • Abdicate accuracy for MVP

No AI solution claims 100% accuracy; if one did, it would likely be dismissed as overfitting. So why keep training to chase a non-existent Holy Grail?


The story is similar for validation, which 'typically' follows training as a waterfall phase. Can validation be interspersed with training to reach the right accuracy faster?
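As an illustration, here is a minimal sketch (my own, not from the article) of interspersing validation with training: the loop stops as soon as validation accuracy is 'good enough' or stops improving, rather than burning compute chasing the last fraction of a percent. The threshold and patience values are illustrative assumptions.

```python
# Early stopping with validation interleaved into the training loop.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

model = SGDClassifier(loss="log_loss", random_state=0)
classes = np.unique(y_train)

good_enough, patience, best_acc, stale = 0.90, 3, 0.0, 0
for epoch in range(50):
    model.partial_fit(X_train, y_train, classes=classes)  # one training pass
    acc = model.score(X_val, y_val)                       # validate immediately
    print(f"epoch {epoch}: val accuracy {acc:.3f}")
    if acc >= good_enough:
        break                        # "right" accuracy reached; stop burning energy
    stale = stale + 1 if acc <= best_acc else 0
    best_acc = max(best_acc, acc)
    if stale >= patience:
        break                        # no improvement; further training is wasted compute
```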

  • Incremental Learning

Classical batch machine learning needs all data to be accessible simultaneously, yet it seldom has the capacity to handle such volumes, leading to an accumulation of unprocessed data. It also misses out on integrating new information into already constructed models, because it regularly reconstructs models from scratch. This is not only very time-consuming but also leads to potentially outdated models.

We can change this to sequential data processing in a streaming manner. This not only lets us use information as soon as it becomes available, keeping models up to date at all times, but also reduces the cost of data storage and maintenance. It is almost as if training and validation are done in an agile way.

Incremental learning does have a few challenges: the model is constructed without complete retraining, and only a limited number of training examples can be maintained in memory at any time.
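For illustration, a minimal sketch of streaming updates using scikit-learn's partial_fit (one library that supports incremental learning; the data stream here is simulated):

```python
# Incremental (streaming) learning: each mini-batch is folded into the existing
# model as it arrives, with no full retraining and no stored history.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

model = SGDClassifier(loss="log_loss", random_state=0)
classes = np.array([0, 1])

def data_stream(n_batches=10, batch_size=200):
    """Stand-in for data arriving over time (e.g., from a queue or log)."""
    for seed in range(n_batches):
        yield make_classification(n_samples=batch_size, n_features=20, random_state=seed)

for X_batch, y_batch in data_stream():
    model.partial_fit(X_batch, y_batch, classes=classes)  # update, don't rebuild
    print(f"accuracy on latest batch: {model.score(X_batch, y_batch):.3f}")

# Only the model parameters persist; processed batches can be discarded,
# which is where the storage and energy savings come from.
```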

  • Active Learning

This approach is based on the hypothesis that the learning algorithm should have the liberty to choose the data it learns from. Rather than exposing the model to the whole ton of data for supervised/unsupervised learning, 'relevant' queries establish a focused region of interest within a subset of the data.
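A toy sketch of pool-based active learning with least-confident (uncertainty) sampling, assuming a small seed of labels and an oracle that can label queried points; the budget and pool sizes are illustrative:

```python
# Active learning: repeatedly query labels only where the model is least certain.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
labeled = list(range(20))                    # tiny seed set of labelled points
pool = [i for i in range(len(X)) if i not in labeled]

model = LogisticRegression(max_iter=1000)
for round_ in range(10):                     # query budget: 10 rounds x 20 points
    model.fit(X[labeled], y[labeled])
    proba = model.predict_proba(X[pool])
    uncertainty = 1.0 - proba.max(axis=1)    # least-confident sampling
    query = np.argsort(uncertainty)[-20:]    # ask the "oracle" to label these
    newly_labeled = [pool[i] for i in query]
    labeled.extend(newly_labeled)
    pool = [i for i in pool if i not in newly_labeled]
    print(f"round {round_}: {len(labeled)} labels, "
          f"pool accuracy {model.score(X[pool], y[pool]):.3f}")
```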

  • Transfer Learning

I have used this technique heavily in the past. With the ubiquity of pre-trained models (especially CNN models such as ResNet, available open source), we can plug a pre-existing, pre-trained model into our own data pipeline.
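For example, a minimal fine-tuning sketch with a pre-trained ResNet-18 from torchvision (the specific model, class count, and dummy batch are my illustrative assumptions): the backbone is frozen and only a small new head is trained, at a fraction of the energy of training from scratch.

```python
# Transfer learning: reuse a pre-trained backbone, train only a new head.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)  # pre-trained
for param in model.parameters():
    param.requires_grad = False             # freeze the expensive-to-train backbone

num_classes = 5                             # illustrative: our downstream task
model.fc = nn.Linear(model.fc.in_features, num_classes)  # new trainable head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch (replace with a real DataLoader):
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, num_classes, (8,))
loss = criterion(model(images), labels)
loss.backward()                             # gradients flow only into model.fc
optimizer.step()
```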

  • Federated Learning

Federated Learning takes crowdsourcing to the next level. It requires each 'client/developer' to train the model locally on their own device/server; the data used to train the model never leaves that device/server.

The models (weights, biases, etc.) are then sent to a central server, which averages the model parameters to create a new master model. The process is iterative, repeating until the desired accuracy level is achieved.

This is super useful when the data comes with privacy requirements.
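A toy sketch of the federated-averaging (FedAvg) scheme just described, with logistic-regression clients simulated in NumPy; the client data, learning rate, and round count are illustrative assumptions:

```python
# FedAvg: clients train locally on private data; the server only sees parameters.
import numpy as np

def local_train(weights, X, y, lr=0.1, epochs=5):
    """Client-side: a few steps of logistic-regression SGD on private data."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-X @ w))     # sigmoid
        w -= lr * X.T @ (preds - y) / len(y)     # gradient step
    return w                                     # only parameters leave the device

def federated_round(global_w, clients):
    """Server-side: average the clients' locally trained parameters."""
    local_ws = [local_train(global_w, X, y) for X, y in clients]
    return np.mean(local_ws, axis=0)

rng = np.random.default_rng(0)
true_w = rng.normal(size=10)
clients = []
for _ in range(5):                               # 5 clients, each with private data
    X = rng.normal(size=(100, 10))
    y = (X @ true_w > 0).astype(float)
    clients.append((X, y))

global_w = np.zeros(10)
for round_ in range(20):                         # iterate until accuracy is acceptable
    global_w = federated_round(global_w, clients)

X0, y0 = clients[0]
acc = (((X0 @ global_w) > 0) == y0.astype(bool)).mean()
print(f"master model accuracy on client 0: {acc:.3f}")
```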

  • Neural Architecture Search

Neural Architecture Search is a classic meta-learning concept. It unleashes the ability of AI to make other AI models better.

It essentially takes the process of a human manually tweaking a neural network and learning what works well, and automates this task to discover more complex architectures.
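As a toy illustration of that idea, here is a random-search baseline over a tiny architecture space; real NAS methods use far smarter search strategies (reinforcement learning, evolution, gradients), and the search space and budget below are my own assumptions:

```python
# Architecture search in miniature: sample candidates, score briefly, keep the best.
import random
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

search_space = {"depth": [1, 2, 3], "width": [16, 32, 64],
                "activation": ["relu", "tanh"]}
best_arch, best_acc = None, 0.0
random.seed(0)
for trial in range(10):                          # illustrative search budget
    arch = {k: random.choice(v) for k, v in search_space.items()}
    model = MLPClassifier(hidden_layer_sizes=(arch["width"],) * arch["depth"],
                          activation=arch["activation"],
                          max_iter=200, random_state=0)
    model.fit(X_tr, y_tr)                        # cheap proxy evaluation
    acc = model.score(X_val, y_val)
    if acc > best_acc:
        best_arch, best_acc = arch, acc
print(f"best architecture: {best_arch} (val accuracy {best_acc:.3f})")
```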

  • Small and Wide data

Gartner predicts that emerging “small and wide data” approaches will enable more robust analytics and AI, reducing organizations’ dependency on big data. Wide data allows analysts to examine and combine a variety of small and large, unstructured and structured data, while small data is focused on applying analytical techniques that look for useful information within small, individual sets of data.

Conclusion

I believe the whole topic is a part of Green AI, which lies under the umbrella of Responsible AI along with Ethical AI and Compliance.

Practitioners should go beyond 'conventional' (deep) machine learning to make AI more energy efficient, and energy usage should be a key AI metric to track.
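As one way to make energy a first-class metric, here is a minimal sketch using the open-source codecarbon package (my choice of tool, not something the article prescribes) to log the estimated emissions of a training run:

```python
# Treat energy/CO2 as a training metric alongside accuracy.
from codecarbon import EmissionsTracker
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)

tracker = EmissionsTracker()   # estimates CPU/GPU/RAM energy use during the run
tracker.start()
model = LogisticRegression(max_iter=1000).fit(X, y)
emissions_kg = tracker.stop()  # estimated kg CO2-equivalent for this training
print(f"accuracy: {model.score(X, y):.3f}, "
      f"estimated emissions: {emissions_kg} kg CO2eq")
```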

Pushkar Shirolkar

Regional Head - APAC (Asia & Australia)

2y

Great insights Rahul. Fully agree on responsible and ethical AI - the need of the hour - and everyone working on this topic should use these fundamentals for any future AI development.

Julien Bardou

*Top Voice LinkedIn* | Global Leader, Coach, Speaker | Top 40 under 40 | Co-Founder & CTO | Startup advisor & Business angel

2y

Thanks for sharing your thoughts Rahul, very interesting!

Swapnil Bhave

Product Development Head | Global Service Owner for Commercial Claims

2y

I think quantum computing might be the answer to this problem, though it will take time until it becomes commercially available. Research shows that the highest-scoring deep-learning models are also the most computationally hungry, due to their huge consumption of data. One algorithm's lifecycle was found to produce the equivalent of 284,000 kilograms of carbon dioxide, nearly five times the lifetime emissions of the average American car, including its manufacturing. I believe that since quantum computing takes a much shorter time (a million times shorter) to do the computation, it will also be much more efficient in terms of energy and carbon footprint.

PAVAN INDRAKUMAR

Senior Project Manager @ Allianz

2y

New info for me personally.. good insights.. thanks Rahul ji
