AI Insights - Is the Acceleration of the Power of AI Models a Recent Phenomenon?

The first AI model appeared in 1950, when scientists developed a robotic mouse that could navigate a maze and remember its path. Its computational capacity, measured in FLOPS (floating-point operations, equivalent to an addition, subtraction, multiplication or division, per second), was 40 FLOPS. In 2023, training OpenAI's generative AI model GPT-4 required an estimated 2.1 x 10^25 floating-point operations. Looking at another measure of power, the number of parameters used by AI models, the progress of the last few years alone is even more meteoric: GPT-1, released in 2018, had 94 million parameters; GPT-4, released this year, has nearly a trillion. Thus, as highlighted in the chart below, the number of parameters processed by an AI model has increased by a factor of roughly 10,000 in only five years.

Number of parameters used by the main AI models since 2018, in billions (logarithmic scale)

Source: Nvidia, BofA, Edmond de Rothschild
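As a rough sanity check on the order of magnitude, the growth factor implied by the parameter counts quoted above can be computed directly (a minimal sketch using the article's figures, which are approximate, not official model specifications):

```python
# Parameter counts as quoted in the article (approximate figures)
gpt1_params = 94e6   # GPT-1, released 2018: ~94 million parameters
gpt4_params = 1e12   # GPT-4, released 2023: ~1 trillion parameters

# Overall growth factor over the five-year span
growth_factor = gpt4_params / gpt1_params
print(f"Growth over 5 years: ~{growth_factor:,.0f}x")  # ~10,638x

# Implied average multiplier per year
annual = growth_factor ** (1 / 5)
print(f"Implied annual multiplier: ~{annual:.1f}x")  # ~6.4x
```

The ~10,000x figure in the text is consistent with these numbers; it corresponds to parameter counts multiplying by roughly six every year on average.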

The economic attractiveness of AI, and the privileged position of technology conglomerates in this field, could continue to drive the development of AI for economic and competitive purposes, with the risk of a race to the bottom. In this respect, at the initiative of the NGO Future of Life Institute, and following the release of GPT-4, nearly 1,000 scientists and experts in the field (including Elon Musk, a co-founder of OpenAI) called for a "temporary halt" to the development of AI models, fearing in particular that a headlong competitive rush in the sector would have "harmful" consequences.

But is there a limit to this exponential acceleration in parameters and computational capabilities? There are several constraints on the scale of AI models. According to an academic report published in October 2022, "the stock of high-quality language data will be exhausted soon; likely by 2026", which will limit the expansion of models trained on existing data. Another limitation is the cost of developing new AI models, and the power of the underlying technology (software, semiconductors, graphics processors, etc.). Furthermore, although hardware, and chips in particular, closely follows and has at times even outpaced Moore's Law (which predicts that computing power doubles roughly every two years), AI models are expanding much faster: their compute requirements grow on average 8-fold every two years, and for the most advanced models up to 275-fold every two years. This exponential growth in the capabilities of AI models and their computational requirements, far outstripping the evolution of computing power as described by Moore's Law, thus presents a limit. For the time being, few technological innovations are expected that could revolutionise the field of AI in the way that graphics processors did in the early 2010s, when machine learning experienced a remarkable expansion. In addition, the development of more advanced and powerful semiconductors may also slow down, given the technological and physical difficulty of making chips ever smaller and more powerful.
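The divergence between these growth rates compounds quickly. A short sketch, using the per-two-year multipliers quoted above (figures as cited in the text, not independently verified), shows how far demand outruns hardware over a decade:

```python
# Compare hardware progress (Moore's Law) with AI compute demand growth,
# using the per-two-year multipliers quoted in the article.
years = 10
periods = years / 2  # each multiplier applies per two-year period

moore = 2 ** periods          # hardware: doubles every two years
avg_ai = 8 ** periods         # average AI model: 8x every two years
frontier_ai = 275 ** periods  # most advanced models: 275x every two years

print(f"Over {years} years:")
print(f"  hardware capability: ~{moore:,.0f}x")           # ~32x
print(f"  average AI compute demand: ~{avg_ai:,.0f}x")    # ~32,768x
print(f"  frontier AI compute demand: ~{frontier_ai:.2e}x")
```

Even the average model's demand grows three orders of magnitude faster than hardware over ten years, which is why scaling cannot rely on chip improvements alone.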

However, one major technology in development could give AI growth a new boost: quantum computing. The far greater computing capacity of quantum machines (the quantum processor developed by Google performed a benchmark task 158 million times faster than the fastest existing non-quantum supercomputer) would allow them to process massive amounts of data and solve complex problems much faster than conventional computers. Quantum computing offers several potential advantages: it could speed up the learning and training of AI models, thereby reducing the costs and resources required, and quantum computers could improve the performance of AI algorithms. However, quantum computing is still in its infancy, and the manufacture of functional, large-scale quantum computers is not yet on the cards. Despite these obstacles, quantum computing has considerable potential to revolutionise the development of AI, providing new impetus to this field of research and a way past existing computational limits.

Impressive

Alexandre Gaillard de Carouge

We Build Banks - We Automate Businesses with InvestGlass.com

1 yr

The price of training is collapsing... The Third Industrial Revolution... it's today, right?
