How the New Breed of LLMs is Replacing OpenAI and the Likes

Before diving into the architecture of new LLMs, let’s first discuss the current funding model. Many startups get funding from large companies such as Microsoft, Nvidia, or Amazon. In return, they have to use those companies’ cloud solutions, services, and products, and the result is high costs for the customer. Startups that rely on vendor-neutral VC funding face a similar challenge: you cannot raise VC money by saying that you could do better and charge 1000x less, because VC firms expect to make billions of dollars, not mere millions. To maintain this ecosystem, players spend a lot of money on advertising and hype. In the end, if early investors can quickly make big money through acquisitions, it is a win; what happens when clients realize the ROI is negative is unimportant, as long as it does not happen too soon. But can investors even achieve this short-term goal?

The problem is compounded by the fact that researchers believe deep neural networks (DNNs) are a panacea, with issues simply fixed by using bigger data, applying multiple transforms to make DNNs work, or adding front-end patches such as prompt engineering to address foundational back-end problems. Sadly, almost no one works on ground-breaking innovations outside DNNs; I am an exception.

Read the full article, including documentation and Python code for my free, energy-efficient, open-source specialized multi-LLM (local implementation, enterprise version), here.
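The linked article contains the actual documentation and code. Purely as a rough illustration of the general idea of a specialized multi-LLM, here is a minimal sketch of one way queries could be routed to small domain-specific models instead of a single large general-purpose LLM. Every name below (DOMAIN_MODELS, route_query, answer) is hypothetical and not taken from the article’s implementation.

```python
# Illustrative sketch only: a keyword-based router that sends each query to a
# small, specialized model rather than one large general-purpose LLM.
# All names below are hypothetical and do not come from the article's code.

from typing import Callable, Dict

# Hypothetical registry of specialized sub-models, keyed by domain.
# In a real system each entry would wrap a small fine-tuned model;
# here they are stubbed as plain functions so the sketch stays runnable.
DOMAIN_MODELS: Dict[str, Callable[[str], str]] = {
    "finance": lambda q: f"[finance model] answer to: {q}",
    "legal":   lambda q: f"[legal model] answer to: {q}",
    "general": lambda q: f"[general fallback model] answer to: {q}",
}

# Crude keyword lists standing in for a proper intent classifier.
DOMAIN_KEYWORDS = {
    "finance": {"revenue", "stock", "portfolio", "tax"},
    "legal": {"contract", "liability", "compliance", "clause"},
}

def route_query(query: str) -> str:
    """Pick the specialized domain whose keywords best match the query."""
    tokens = set(query.lower().split())
    best_domain, best_score = "general", 0
    for domain, keywords in DOMAIN_KEYWORDS.items():
        score = len(tokens & keywords)
        if score > best_score:
            best_domain, best_score = domain, score
    return best_domain

def answer(query: str) -> str:
    """Route the query, then delegate to the chosen specialized model."""
    domain = route_query(query)
    return DOMAIN_MODELS[domain](query)

if __name__ == "__main__":
    print(answer("How should I report stock portfolio tax gains?"))
    print(answer("Is this contract clause enforceable?"))
```

The point of the sketch is the design choice, not the keyword matching: several small, cheap, specialized models behind a router can replace one expensive general-purpose model for many enterprise workloads.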

Went through the article; I can say the breed is evolving.

Garth Bond

Senior IT Programme | Project Manager | Product Manager | IT Consultant | Change Evangelist | Entrepreneur | Delivering Innovative Solutions with Agile Methodologies

7 months

Fascinating article, Vincent. I have been pondering this for a while, but more from a business point of view: where will this end up, so that startups and SMEs can benefit from AI without the huge investment currently required? I don’t know the answer, hence my interest in your work.

Could a distributed network of LLMs work as a “super mind” of the network, better than the big GPTs and Geminis?

Neil Gentleman-Hobbs

A giver and proven Tech Entrepreneur, NED, Polymath, AI, GPT, ML, Digital Healthcare, Circular Economy, community wealth building and vertical food & energy hubs.

7 months

Great piece, but this shows how we defy all these models: smartR AI deploys on-prem, with rarely any need for upgrades. This ticks the green and private boxes.... and it is a buy-to-own solution your in-house team can operate in a silo across the enterprise.

Saniat Sohrawardi

Ph.D. Researcher @ RIT | DeFake Project Lead | Usability of Digital Media Forensics, HCI, ML, Cybersec and Ethics

7 months

"The problem is compounded by the fact that researchers believe deep neural networks (DNN) are the panacea, with issues simply fixed by using bigger data, multiple transforms to make DNN work, or front-end patches such as prompt engineering, to address foundational back-end problems." I mostly agree and this really resonates with me, many of us have been talking about how prompt-engineering is a bug in the system and not a career path. Moreover, the whole prompt-based interface the way it is now, is definitely not the endgame. Having said that, I can't hate on researchers and industry devs focusing on prompt engineering as it will help us understand various ways we can improve these interfaces.
