The Uberisation of LLMs and GenAI Tools: A Cautionary Tale

Generative AI and large language models (LLMs) are transforming industries, promising efficiency, cost savings, and innovation.

However, a trend similar to Uber's market strategy is likely to emerge, posing potentially significant risks for businesses relying on these technologies.

The Appeal of Affordable GenAI

Businesses are initially drawn to GenAI tools and LLMs for their advanced capabilities, such as natural language processing and automation, at relatively low cost. The appeal is especially strong for startups and small enterprises aiming to streamline operations and enhance customer experiences affordably. This phase mirrors Uber's early days, when affordable rides quickly built a loyal user base. Companies integrate AI tools and enjoy greater efficiency and a competitive edge, but as their dependence grows, so does their exposure to shifts in provider strategy.

The Price Hike Trap

Similar to Uber's fare increases after achieving market dominance, GenAI tool providers may hike prices as businesses become more reliant on their technologies. What were once cost-effective solutions can turn into budget strains, especially for companies deeply integrated with a particular vendor's AI.

Transitioning to different providers or developing in-house solutions can become prohibitively expensive, leaving businesses effectively captive to their AI vendors, with significant long-term financial implications.

Build vs. Buy: Strategic Choices

This emerging trend necessitates a careful decision between building and buying GenAI solutions.

Here are some key considerations:

Initial vs. Long-Term Costs: Building in-house can require significant upfront investment, but it offers more predictable and controlled costs compared with potentially escalating third-party expenses.

Control and Customisation: Proprietary GenAI tools can be tailored to specific needs, providing better integration and control.

Dependency and Flexibility: External providers introduce dependency risks if prices surge or strategic directions change; in-house solutions offer more autonomy and flexibility (a minimal code sketch follows this list).

Scalability and Innovation: While third-party tools provide advanced capabilities immediately, in-house solutions can be designed to scale and evolve with company growth, offering a competitive advantage.
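
To illustrate the dependency point above, here is a minimal sketch of a provider-agnostic text-generation interface. The class and function names are hypothetical, not any vendor's actual SDK; the idea is simply that application code depends on one seam, so a price hike or a move in-house means replacing one implementation rather than rewriting every call site.

```python
# Minimal sketch of a provider-agnostic text-generation seam.
# All names below are illustrative placeholders, not a real vendor SDK.
from abc import ABC, abstractmethod


class TextGenerator(ABC):
    """Single interface between the application and whichever model serves it."""

    @abstractmethod
    def generate(self, prompt: str) -> str:
        ...


class HostedProviderClient(TextGenerator):
    """Wraps a third-party API; only this class changes if the vendor changes."""

    def __init__(self, api_key: str):
        self.api_key = api_key

    def generate(self, prompt: str) -> str:
        # Placeholder: call the chosen vendor's SDK or HTTP endpoint here.
        raise NotImplementedError("Wire up the chosen vendor's SDK here.")


class InHouseModel(TextGenerator):
    """Stand-in for a self-hosted open-weights model behind the same interface."""

    def generate(self, prompt: str) -> str:
        # Placeholder: run local inference here.
        return f"[local model response to: {prompt[:40]}...]"


def answer_customer(query: str, generator: TextGenerator) -> str:
    # Application code depends only on the interface, never on the vendor.
    return generator.generate(f"Answer politely and briefly: {query}")


if __name__ == "__main__":
    print(answer_customer("Where is my order?", InHouseModel()))
```

The point is the seam, not the specific classes: when the bill or the vendor's strategy changes, only one implementation needs to.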

The "Uberisation" of AI tools and LLMs is a cautionary tale. The initial benefits can lead to financial burdens as pricing strategies change. Businesses must carefully evaluate their GenAI strategies, balancing immediate benefits with potential long-term risks and costs. ?

By considering goals, budgets, and risk tolerance, companies can effectively leverage GenAI while mitigating dependency and escalating costs.

If you are thinking of implementing GenAI in your business, why not try our free GenAI Readiness Assessment and discover your score? Take the assessment.

Paul Eastabrook

Product & Technology Leader · Applying GenAI with Highly Experienced Agile Product Engineering · Computer Science AI Degree, Psychology Post Grad · Ex: HSBC, Barclays, RBS, DB, BNP Paribas, BAML, Visa, MWAM, LawDeb, Beazley

3 months ago

Incredibly important point well made Chris Jones. I would add that going all in on a given solution at this stage may well prove a disadvantage shortly thereafter in terms of improving capabilities. There have already been multiple stories of large companies building their own foundation models at vast expense, only for open source models to come out that surpassed those capabilities "out of the box". The same can be said of smaller fine-tuning efforts. Often, carefully crafted agentic workflows and the routing of prompts to one or more SLMs/LLMs, combined with in-context learning and RAG, is all that is required... and is easier to adjust. It also creates options for cost-per-token vs capability trade-offs, amongst other functional and non-functional constraints that can be better balanced. Companies should see these first solutions as the worst AI they are ever going to have, with better solutions just around the corner. Solutions being put in place today should be throwaway or designed with the ability to "upgrade"/replace SLMs/LLMs. Context is of course everything, but agility in the above regard could be an edge or differentiator as the landscape unfolds... we are still very much at the beginning of this journey.
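
As a rough sketch of the routing idea in the comment above: prompts can be sent to a cheaper small model or a larger one behind the same interface, so either can be replaced as better models appear. The length-based routing rule, the class names, and the cost figures here are assumptions for illustration only, not a recommended heuristic.

```python
# Hypothetical sketch: route prompts to a small or large model by a crude rule,
# keeping both behind one shape so either can be swapped as the landscape changes.
from dataclasses import dataclass
from typing import Callable


@dataclass
class ModelEndpoint:
    name: str
    cost_per_1k_tokens: float          # used for cost vs capability trade-off reporting
    generate: Callable[[str], str]     # any callable: vendor SDK, local server, etc.


def route(prompt: str, small: ModelEndpoint, large: ModelEndpoint) -> ModelEndpoint:
    # Toy routing rule: short, simple prompts go to the cheap model.
    # In practice this could be a classifier, keyword rules, or a confidence check.
    if len(prompt) < 200 and "analyse" not in prompt.lower():
        return small
    return large


if __name__ == "__main__":
    small = ModelEndpoint("small-local", 0.0, lambda p: f"[small model] {p[:30]}...")
    large = ModelEndpoint("large-hosted", 0.03, lambda p: f"[large model] {p[:30]}...")

    for prompt in ["Summarise this ticket in one line.",
                   "Analyse the quarterly figures and draft a board summary."]:
        chosen = route(prompt, small, large)
        print(chosen.name, "->", chosen.generate(prompt))
```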
