Open source approaches to Generative AI and LLMs

Off the back of my series on AI, and more specifically OpenAI, I wanted to explore the age-old problem in product management: Buy vs Build vs Partner.

Looking back on most of the adventures in my career, this is a problem I have bumped into again and again. I probably leaned more towards build in the early days and more towards partner in recent times.

As companies race to embed (or even build around) AGI capabilities in their workforce tools, SaaS products and services, they will need to decide which technologies will support their requirements, and Open Source vs proprietary solutions will feature heavily in that decision-making process.

But this "AI thing" isn't new, right?

Correct, tools of this nature have been around for a number of years. Prior to the late 2022/2023 "AI arms race" that we are all swept up in right now, there was already a raft of technologies in the AI/ML space. Let's explore where they came from:

  • Creator teams within large product organisations, such as Google Brain, Facebook AI Lab, OpenAI and Kaiser Permanente, who have built their own private platforms and in some cases released them as libraries.
  • Libraries you can run yourself, such as TensorFlow, PyTorch and Keras, some of which are open source and released by those creator teams.
  • Platforms such as Azure Cognitive Services, IBM Watson and Salesforce Einstein, some of which are proprietary or productised implementations of open source libraries.
  • Products such as ChatGPT, Jacquard and Pecan AI, which can be a mixture of open source and proprietary. In many cases they are a unique, productised mix of technologies, typically designed to solve a specific problem.

With the exception of the Products, the technologies above were mostly just that: a technology. Regardless of how you acquired it, you were responsible for the data you put in (training data, operational data), the way you processed it (models and algorithms), how you refined it (augmentation, fine-tuning) and the data you got out (prompting, response quality, accuracy), and so on.
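
To make that responsibility chain concrete, here is a minimal, purely illustrative sketch using Keras (one of the libraries listed above). The toy data, model shape and training settings are all assumptions for illustration, not a recommended setup.

```python
# A minimal sketch of the "toolbox era" responsibility chain: you supply the
# data, define the model, run the training and interpret the output yourself.
# Purely illustrative -- real workloads need real datasets, evaluation and
# infrastructure. Assumes TensorFlow/Keras and NumPy are installed.
import numpy as np
from tensorflow import keras

# 1. Data in: your training data and your problem framing (toy data here).
x_train = np.random.rand(256, 8)
y_train = (x_train.sum(axis=1) > 4).astype("float32")

# 2. The way you process it: architecture and algorithm choices are yours.
model = keras.Sequential([
    keras.Input(shape=(8,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# 3. Refine it: further epochs, tuning and augmentation are also on you.
model.fit(x_train, y_train, epochs=3, verbose=0)

# 4. Data out: you judge the quality and accuracy of the predictions.
print(model.predict(x_train[:3]))
```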

You were also responsible for the hosting, infrastructure and security. This, of course, is a huge task, not to be undertaken lightly, and one that requires professionals.

Sometimes referred to as the "toolbox era", libraries and platforms in the earlier days were narrow in scope, task-specific and, in many respects, choose-your-own-adventure. In the same way that one job on the building site of a house (your use case/product) might require a concrete mixer and pump, the next may require a paint sprayer! While the ultimate goal is to get the house built, there is a lot of work to do between the architect's plans and the finished home.

With AGI (Artificial General Intelligence), which in this context generally refers to LLMs (Large Language Models), the toolbox is in some cases moving towards more of a Swiss army knife analogy.

What is the future of the "Toolbox" era?

Now, I am struggling a little to extend my tools and house-building analogy here.

Sorry!

But I guess it extends to the notion that some people are owner-builders, some are off-the-plan home buyers happy with a limited set of customisations and paint swatches, whereas others will simply prefer to buy a ready-built new home and add a few small tweaks later.

In reality, we see this evolution in most technology areas. The difference I foresee here is the accelerated jump to Platforms and Products, which, in my opinion, is driven by a healthy mix of maturity and natural apprehension about ethical AI use.

The toolbox strategy will continue to exist and in many respects get deeper, but the majority of those with ambitions of embedding AGI tools within their products will need to gravitate toward Platforms and Products. The battle over the next 12-18 months will be won by the Platforms and Products that can satisfy the difficult middle ground of Privacy, Security, Usefulness and Accuracy. I purposefully didn't add cost in; that's a problem for another post!

I get it, but I want to explore doing Open Source myself

OK, well, in your quest to build, run and manage this yourself (which, in fairness, may be a bona fide business strategy) using "tools you don't have to pay for", here are some of the Open Source options you may want to consider:

Disclaimer: this is a large area to cover, so I will focus on a few well-developed tools, mainly AGI technology in the LLM area that gives similar capabilities to ChatGPT. If I have missed an important one, let me know in the comments.

New solutions are being developed weekly, and one of the best resources to follow is Hugging Face – The AI community building the future.

I'll talk more about Hugging Face soon, but if you are considering getting under the hood on AGI, then sign up for their newsletter at Hugging Face (curated.co).
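
To give a flavour of what "getting under the hood" looks like, here is a minimal sketch using the Hugging Face transformers library to run an open-weight model locally. The model name and prompt are illustrative assumptions; swap in whichever open model fits your use case and licence requirements.

```python
# A minimal sketch: pull an open-weight model from the Hugging Face Hub and
# generate text locally. Assumes `transformers` and `torch` are installed.
# "gpt2" is used purely as a small, permissively licensed placeholder model.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

result = generator(
    "Open source approaches to LLMs let you",
    max_new_tokens=40,        # cap on the number of generated tokens
    num_return_sequences=1,   # just one completion for this demo
)
print(result[0]["generated_text"])
```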

Happy AGI'ing.
