Waiting for GPT-5
Artwork by Andre K. George

When GPT-1 debuted in 2018, it barely caught my attention: its capabilities were too far removed from my practical needs.

When GPT-2 was introduced in 2019, I experimented with using it to write a sci-fi story, but the outcome left much to be desired. Take this excerpt about the main character, Adam, for instance:

“Adam is buried in the military cemetery on the planet Mars, alongside his family and friends. It was a fitting end for him, and he was ready for his new life. He is now an astronaut on a spaceship, and he never wants to return to Earth. Adam is one of the lucky ones, and he is happy with his new life. He has no more plans of returning to Earth, but he has never been happier than the day he died.”

The text's lack of internal consistency rendered it nonsensical, reinforcing my skepticism about GPT's practicality. However, the advent of ChatGPT in 2022 marked a turning point. The release of GPT-4 in 2023, with its leap in scale (from 175 billion parameters in GPT-3 to a reported 1.7 trillion in GPT-4), significantly enhanced the tool's capabilities. As a frequent user, I now rely on it for tasks like editing, summarization, translation, and even occasional Python coding.

This progress naturally leads to speculation about GPT-5. Expected in the near future, barring unforeseen circumstances, it will likely advance further in scale and architectural sophistication, delivering even more impressive, human-like interactions.

While I'm preparing myself for its arrival, I suspect it will still manage to surprise and overwhelm me. I expect remarkable improvements in human-like interaction, superior translation quality, enhanced mathematical and problem-solving skills, stronger coding, faster access to external data sources, and better summarization. I also look forward to an expanded token capacity for handling larger documents, seamless multi-modal capabilities for working with text, images, and video, and an overall experience that feels more like interacting with a human than with a machine.

We are indeed living in interesting times.

And then, there's the prospect of GPT-6.


Dmitry Tyomkin

Head of Systems Integration, Cat Digital Platform at Caterpillar Inc.

10 months ago

Is a new "Moore's law" coming on how fast LLMs can evolve?

Eli (Ilias) Farfouris

Strategic Sales AI/Analytics/IoT/Cloud/ServiceNow

10 months ago

Andrei Khurshudov, PhD: Good for research and competitive intelligence as well. Thanks for sharing.

John Nicastro

CTO @ Major League Soccer || Sports, Media, Software, Telecom | Growth Accelerator | Patented Innovator | Speaker | Board Advisor

10 months ago

...and GPT 7, 8, 9... It seems the trends indicate that LLM performance is a fairly predictable function of the number of parameters in the network and the amount of text we train on. Fair to assume we will see a lot more general capability across every knowledge domain by scaling. Great thought, thanks for the article.
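That scaling intuition matches published scaling-law results. As a rough illustration only (the power-law form and approximate constants below come from the parameter-scaling fit reported in Kaplan et al., 2020, and are not specific to GPT-4 or to any rumored GPT-5 configuration), here is a minimal Python sketch:

def loss_from_params(n_params, n_c=8.8e13, alpha=0.076):
    # Power-law fit of language-model test loss vs. parameter count,
    # using approximate constants from Kaplan et al. (2020); illustrative only.
    return (n_c / n_params) ** alpha

# Comparing a GPT-3-scale model with a (rumored) GPT-4-scale one:
print(loss_from_params(175e9))   # roughly 1.6 nats per token
print(loss_from_params(1.7e12))  # roughly 1.35 nats per token

The takeaway is only the shape of the curve: each order-of-magnitude increase in parameters buys a smaller but still fairly predictable drop in loss, which is what makes extrapolating to GPT-5, 6, and 7 so tempting.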
