AI – LLMs, LQMs, and the Energy Challenge




In this article, I’ll share an updated view of the current status and future prospects of AI, following the incredible exchanges I had at #WEF2025. Here are my candid thoughts.


How Big Is AI?

Few innovations have been truly transformative for humanity. Fire, the wheel, and electricity come to mind — each one fundamentally changed the trajectory of human progress, driving exponential growth.

Interestingly, these innovations enhanced our physical capabilities and amplified human strength and efficiency.

More recently, two groundbreaking innovations have significantly expanded our cognitive abilities: Gutenberg’s printing press and the internet. Both of these innovations have vastly increased our memory and access to information — essentially creating a "super hard drive" for our brains.

AI, however, holds the potential to be the third transformative innovation, enhancing not just our memory or knowledge, but the very way our brains process and interpret information.

  • It’s akin to the arithmetic co-processor in my first PC (though only people of a certain age may remember that!).
  • AI enables the human brain to process patterns across vast amounts of data and dimensions — and the world, after all, runs on patterns.
  • While we can’t be certain, we’re on the verge of unlocking some profound understanding, and we’re already starting to see the early results.

What sets AI apart from earlier innovations, like fire or electricity, is its potential for self-improvement. AI systems can improve themselves by leveraging other AI programs, creating a compound effect that is unique among technologies and that could make AI the most impactful transformative innovation of all.

This is a possibility, but not yet a certainty.


LLMs: Great Promise, But Strong Limitations

LLMs (Large Language Models) are receiving widespread attention, with some even claiming that AGI (Artificial General Intelligence) is just around the corner. There’s no doubt about the value of LLMs for applications like document generation, video and image creation, customer service, and marketing. However, some of the loftier promises still seem out of reach.

Consider this:

  • Ask a 10-year-old to clear the table. Without prior experience, they'll likely figure it out. But are we anywhere near developing a robot that can perform such a task in a household?
  • Ask an 18-year-old to drive a car. After a few hours of practice, they can navigate traffic and drive in different conditions. But are we close to achieving Level 5 autonomous driving, where a car can drive itself safely in any situation?

Achieving such feats would require a major breakthrough in technology. Just as the invention of the transformer architecture enabled the LLMs we use every day, we need a leap that allows machines to understand the world, memorize actions, plan, and anticipate how the world will change after they act.


LQMs: A World of Potential

Large Quantitative Models (LQMs) are one of the most exciting emerging areas in AI. These models combine machine learning on large datasets with physics-based simulation and advanced computational techniques to make predictions and solve complex problems.

LQMs hold significant promise for various industries, including:

  • Material discovery: They could revolutionize fields like battery chemistry, enabling the development of more efficient energy storage solutions.
  • Biopharma: LQMs can enhance drug discovery, validate scientific hypotheses, and improve healthcare outcomes.

Interestingly, many corporate labs are already tapping the immense capabilities of physics-based simulation by drawing on the incredible experience and breadth of Wolfram Alpha’s algorithms.

It is no coincidence that #SandboxAQ, founded in 2022, has already raised $800 million on the triple promise of LQMs, quantum technologies, and a B2B platform business model. One to follow.


What About the Energy Needs?

It’s no secret that AI can be incredibly energy-intensive. However, there are promising developments that could address these challenges.

For example, #MIT recently spun off a company called Liquid.ai, which claims to have developed an AI model that is 1,000 times more energy-efficient. This AI is based on Liquid Neural Networks (LNNs) — a type of network that uses differential equations instead of traditional linear combinations.

  • LNN technology improves energy efficiency by processing data more effectively, concentrating on the most relevant information. For example, it might enable an AI to drive more effectively in challenging conditions, such as snow or ice.
  • However, there are still challenges: New training methods are needed, the technology doesn’t perform well with tabular data, and it can be slower than traditional models when handling large datasets.
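To make the "differential equations instead of linear combinations" point concrete, here is a minimal sketch contrasting a conventional recurrent update with one Euler step of a liquid (ODE-based) neuron, in the spirit of liquid time-constant networks. All shapes, parameter names, and the integration step are illustrative assumptions, not Liquid.ai's actual implementation.

```python
import numpy as np

def linear_step(h, x, W, U):
    """Conventional recurrent update: a linear combination plus nonlinearity."""
    return np.tanh(W @ h + U @ x)

def liquid_step(h, x, W, U, tau, A, dt=0.1):
    """One Euler step of a liquid neuron. The hidden state h follows
        dh/dt = -(1/tau + f(h, x)) * h + f(h, x) * A
    so the effective time constant adapts to the input -- the property the
    article credits for concentrating compute on the most relevant information.
    """
    f = np.tanh(W @ h + U @ x)          # input- and state-dependent gate
    dh = -(1.0 / tau + f) * h + f * A   # adaptive-time-constant dynamics
    return h + dt * dh                  # explicit Euler integration

# Toy usage: a hidden state of size 4 driven by a 3-dimensional input stream.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4)) * 0.1
U = rng.normal(size=(4, 3)) * 0.1
h = np.zeros(4)
tau, A = 1.0, np.ones(4)
for _ in range(50):
    h = liquid_step(h, rng.normal(size=3), W, U, tau, A)
print(h.shape)
```

The design difference is visible in the update rule: the linear cell always mixes its inputs the same way, while the liquid cell's decay and drive both depend on the current input, so the dynamics themselves adapt from step to step.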

Quantum computing also promises to be more energy-efficient for specific applications. If it can deliver on its potential, quantum computing could significantly reduce the power consumption associated with traditional computing for certain tasks.

As with any cutting-edge technology, there’s still much to understand, develop, and refine.

But what’s exciting is that the problems of energy consumption and usage are already being tackled, and we’ve only just begun to explore the possibilities of overcoming these challenges.

