Artificial intelligence: fashion or megatrend?

Since Gottfried Wilhelm Leibniz's work on the binary system in 1703, ENIAC in 1945, and IBM's first public computers in 1954, the history of computer science has been punctuated by remarkable progress in hardware and software design. The invention of the transistor in 1947, then of the integrated circuit in 1958, made it possible to move from massive computers to microcomputers. Over the years, thanks to growing computing power and pioneering work such as the Turing test, software has continued to improve. From algorithms capable of solving simple problems in the 1950s, we moved by the 2000s to machine learning algorithms capable of processing large amounts of data: the Big Data era. Then, in the 2010s, neural networks enabled significant advances in prediction and in areas such as image recognition and machine translation.

[Image: History of computer science]

In November 2022, the launch of ChatGPT by OpenAI, amid wide media coverage, triggered a boom in artificial intelligence after the "AI winter" of the 1980s. This surge of interest marks a new era in generative artificial intelligence, which allows machines to learn, reason, decide, and create realistic new data such as images, text, and sound.

The history of computer science is undeniably a testament to human ingenuity and curiosity: a succession of developments, each of which has marked its era. These dazzling advances have been made possible over the years by improved algorithms, increased computing power, and the abundance of data available to train AI models.

At a time when numerous publications are predicting a future shaped by artificial intelligence, should we really believe in a revolution that will completely disrupt our economies and our societies?

As with any innovation or new technology, artificial intelligence is generating great enthusiasm. Health, finance, technology… the opportunities and areas of application are numerous.

Companies investing in AI are currently seeing their stock prices swing with each related announcement.

Alphabet lost about $100 billion in market value after its new AI chatbot, Bard, gave a wrong answer in February 2023.

OpenAI was valued at $86 billion in an employee share sale in late 2023.

But none of the applications enabled by artificial intelligence algorithms are possible without four key ingredients: data, computing power, telecommunications networks, and energy. Data is essential for computers to learn. AI applications are computationally intensive and require hardware capable of optimizing billions of parameters over almost endless data streams. High-speed networks carry large flows of data from one source to another. And large amounts of energy are needed to run the computers, networks, storage, and data-processing algorithms.
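
To give a rough sense of the scale at stake, here is a minimal, illustrative Python sketch (a back-of-the-envelope estimate of our own, not from any cited source) that counts the parameters of a hypothetical stack of dense layers and estimates the memory needed just to store the weights. Every size below is an assumption chosen only to evoke the scale of modern models.

```python
# Back-of-the-envelope sketch: why "billions of parameters" translates
# directly into demand for hardware. All sizes below are hypothetical.

def dense_param_count(in_features: int, out_features: int) -> int:
    """Weights plus biases of one fully connected layer."""
    return in_features * out_features + out_features

hidden_width = 4096                      # assumed width of each hidden layer
widths = [hidden_width] * 48 + [50_000]  # 48 hidden layers, then a large output layer

total, prev = 0, hidden_width            # assumed input width
for w in widths:
    total += dense_param_count(prev, w)
    prev = w

gb_fp16 = total * 2 / 1e9                # 2 bytes per parameter at half precision
print(f"parameters: {total:,}")
print(f"weights alone at fp16: {gb_fp16:.1f} GB")
```

Even this toy stack lands around a billion parameters, each of which must be read and updated many times during training; multiply that by the data streams mentioned above, and the hunger for compute, bandwidth, and energy becomes obvious.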

It is therefore not surprising that the big winners of AI on the stock market today are hardware suppliers (especially semiconductor makers) such as NVIDIA. A bit like during the gold rush, it is very likely that the future AI billionaires will not be the service providers (the gold miners) but the computer manufacturers (the pickaxe sellers).

[Image: NVIDIA stock over the past five years]

For other companies not directly involved in AI, the economic opportunities and risks remain difficult to assess. They could win by benefiting from a late-mover advantage, but they could also lose by missing a technological revolution.

Besides the large companies already involved, such as NVIDIA, ASML, TSMC, Samsung, Microsoft, Alphabet, Meta, and Synopsys, many others, including IT service providers, could benefit from AI. The use of artificial intelligence could significantly boost business productivity and durably shape the future of the economy and society. But this will depend on our ability to feed AI algorithms with plenty of quality data, to secure clean and cheap energy, and above all to build powerful supercomputers. On this last point, quantum computing could be an important catalyst.

The artificial intelligence market is far from mature. Despite the current hype and growth, many paths remain to be built. We therefore believe that artificial intelligence is much more than a passing fad; it is a trend set to last, no doubt with less fanfare, but in continuity with the history of computer science: a succession of technological improvements written over several centuries (the internet, big data, cybersecurity, AI) and an intelligent construction of innovation and progress (transistors, radar, microprocessors, computers, smartphones, supercomputers).

In our fast-moving world, it’s most likely that the next big thing won’t be AI, but what we’ll do with it!

And you, what do you think about AI?

References

https://www.blackrock.com/corporate/insights/blackrock-investment-institute/publications/outlook

https://www.blackrock.com/corporate/literature/whitepaper/bii-global-outlook-2024.pdf

https://kingland.fr/trivia/lintelligence-artificielle-en-mode-revolution-innovante-ou-simple-tendance-ephemere/

https://www.computersciencezone.org/evolution-of-computer-science-infographic/

https://www.reuters.com/technology/openai-talks-sell-shares-86-billion-valuation-bloomberg-news-2023-10-18/

https://www.bloomberg.com/news/articles/2024-02-17/openai-deal-lets-employees-sell-shares-at-86-billion-valuation

https://www.businesstoday.in/markets/global-markets/story/google-loses-over-100-billion-m-cap-after-chatbot-bard-gives-wrong-answer-in-ad-369572-2023-02-08

https://www.npr.org/2023/02/09/1155650909/google-chatbot--error-bard-shares

https://edition.cnn.com/2023/02/08/tech/google-ai-bard-demo-error/index.html

https://finance.yahoo.com/quote/NVDA/
