Does #deeplearning mirror evolution?

Sam Altman has published a fascinating article called The Intelligence Age,

which says (about the significance of AI):

In three words: deep learning worked. In 15 words: deep learning worked, got predictably better with scale, and we dedicated increasing resources to it.

I don't want to go into the emergence of AGI (or the timelines thereof), since I am not involved at that level of research.

However, I want to point out an important aspect of why AI / AGI is so significant.

In the early days of my newsletter I published this question (which is, in my view, the greatest mystery of AI as we know it today):

Can intelligence emerge on its own if we do nothing else but keep building larger models from simple components?

References and links below.

That itself was based on an MIT Technology Review article, "2021 was the year of monster AI models" by Will Douglas Heaven.

The title of this MIT article was (in my view) somewhat simplistic, because the article itself raised a profound question.

Here are some quotes, and you will see why (emphasis mine):

1) GPT-3 grabbed the world’s attention not only because of what it could do, but because of how it did it. The striking jump in performance, especially GPT-3’s ability to generalize across language tasks that it had not been specifically trained on, did not come from better algorithms (although it does rely heavily on a type of neural network invented by Google in 2017, called a transformer), but from sheer size.

2) “We thought we needed a new idea, but we got there just by scale,” said Jared Kaplan, a researcher at OpenAI and one of the designers of GPT-3, in a panel discussion in December at NeurIPS, a leading AI conference.

3) “We continue to see hyperscaling of AI models leading to better performance, with seemingly no end in sight,” a pair of Microsoft researchers wrote in October in a blog post.

4) The trend is not just in the US: researchers in both China and South Korea have reported the same result.

5) Yet despite the impressive results, researchers still do not understand exactly why increasing the number of parameters leads to better performance.

Hence, the question:

Can intelligence emerge on its own if we do nothing else but keep building larger models from simple components?
After all, that is exactly how intelligence and life evolved on Earth, i.e. by small, simple changes accumulating over long stretches of time.

If so, then we are only at the beginning of what #deeplearning can do. Irrespective of the AGI question, we are on the cusp of a Cambrian explosion in new capabilities and services for people.
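The evolutionary analogy above can be made concrete with a toy genetic algorithm. This is a minimal sketch (not from the original article, and all names here are illustrative): a population of random bit strings is improved toward the all-ones string using nothing but two simple components, bit-flip mutation and survival of the fitter half. No individual step is clever; the cumulative effect is.

```python
import random

def evolve_onemax(n_bits=32, pop_size=20, generations=200,
                  mutation_rate=0.02, seed=0):
    """Evolve random bit strings toward all-ones using only
    small bit-flip mutations and simple truncation selection."""
    rng = random.Random(seed)
    # Start from a population of random bit strings.
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    fitness = lambda ind: sum(ind)  # fitness = number of 1-bits

    for _ in range(generations):
        # Selection: keep the fitter half of the population.
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        # Reproduction with small, simple changes (bit-flip mutation).
        children = [[b ^ 1 if rng.random() < mutation_rate else b
                     for b in parent]
                    for parent in survivors]
        pop = survivors + children

    best = max(pop, key=fitness)
    return fitness(best), n_bits

best_fitness, n_bits = evolve_onemax()
print(f"best fitness: {best_fitness} / {n_bits}")
```

Because the survivors are carried over unchanged, the best fitness can never decrease; over many generations it climbs close to the maximum. The same structure, blind variation plus selection, scaled over vastly longer timescales, is the evolutionary process the analogy appeals to.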

My article: https://www.dhirubhai.net/pulse/artificial-intelligence-42-can-emerge-its-own-we-do-nothing-jaokar/ , which was based on the MIT Technology Review article: https://www.technologyreview.com/2021/12/21/1042835/2021-was-the-year-of-monster-ai-models/

Trupti Wagh

Lead/Staff Software Engineer | Spirited Entrepreneur | Innovator

2 months ago

Fascinating!

Amita Kapoor

Author| AI Expert/Consultant| Generative AI | Keynote Speaker| Educator| Founder @ NePeur | Developing custom AI solutions

2 months ago

In my opinion, the simple fact that genetic algorithms work proves that intelligence can emerge by itself.
