On-Device LLMs Are Shaping Up To Be A Disruptive Threat To Semiconductor, Browser, Search and OS Incumbents

Here is something I am not sure anyone was expecting (self included):

With the release of Mistral's latest LLM, and the rise of technologies like Ollama and self-operating-computer, it is becoming clear that highly capable open-source LLMs can run on consumer PCs and may have significant repercussions for desktop software.

We are even seeing people running capable LLMs directly on phones.

It is still early, but advances in on-device LLMs strike me as among the biggest potential disruption opportunities and threats to Microsoft, Google, Intel, Apple, NVIDIA, AMD and others that we have seen in many years.

This trend is likely to become a big deal in 2024 with unknown consequences.

Why On-Device LLMs May Threaten Major Software and Semiconductor Incumbents

Web Browser Disruption / Displacement: LLMs running locally on a client PC can scrape the internet, strip out all the advertising and give users only the results they want. No more spam SEO content or deceptive ads in search results. This is a big problem for Google and Chrome, as well as other browsers.

On-device, open-source LLMs have the potential to become "Ad-Block" on steroids, spelling doom for web advertising as a business model.
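As a toy illustration of the filtering step, a local agent could fetch a page and drop ad markup before handing the remaining text to the user. The sketch below uses a simple class-name heuristic as a stand-in for an LLM's judgment; the `AD_HINTS` values and the sample HTML are made up for illustration.

```python
from html.parser import HTMLParser

# Hypothetical class-name heuristics standing in for an LLM's judgment.
AD_HINTS = ("ad", "sponsored", "promo")

class AdStripper(HTMLParser):
    """Collects page text, skipping subtrees that look like ads."""
    def __init__(self):
        super().__init__()
        self.ad_depth = 0   # >0 while inside an ad subtree
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        classes = (dict(attrs).get("class") or "").lower()
        if self.ad_depth or any(hint in classes for hint in AD_HINTS):
            self.ad_depth += 1   # track nesting so the whole subtree is skipped

    def handle_endtag(self, tag):
        if self.ad_depth:
            self.ad_depth -= 1

    def handle_data(self, data):
        if not self.ad_depth and data.strip():
            self.chunks.append(data.strip())

def strip_ads(html: str) -> str:
    parser = AdStripper()
    parser.feed(html)
    return " ".join(parser.chunks)

print(strip_ads('<div class="sponsored-ad">Buy now!</div><p>Actual content.</p>'))
# → Actual content.
```

A real on-device agent would replace the heuristic with model inference, but the pipeline shape (fetch, filter locally, present only what the user asked for) is the same.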

Semiconductor Platform Incumbent Displacement: Right now, the lock-in enjoyed by hardware-acceleration platforms such as CUDA stems partly from the effort required to translate / optimize software written for one platform so it runs on other chipsets.

On-device LLMs of the future may be able to accelerate that optimization / migration / translation and act as "universal compilers / interpreters."

Toolchains like OpenVINO and comparable offerings from AMD may see streamlined translation across platforms, lowering the developer effort needed to create software that behaves equally well across devices.

API / SaaS / Low-Code / No-Code Displacement: We are already seeing Ollama "functions," which offer the ability to trigger APIs directly from the desktop LLM without a web browser.

This means natural language can automatically trigger API calls to notable services, which may dramatically reduce the need to visit dedicated websites. Why go to Yelp if my LLM will give me all the restaurants in a custom-built UI pane?

The need for a UI seems poised to erode steadily.
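Mechanically, tool use like this comes down to the model emitting structured output that a local runtime dispatches to a registered function. A minimal sketch of that dispatch loop is below; the `find_restaurants` function, its arguments, and the canned model reply are all hypothetical.

```python
import json

# Hypothetical local "tool" the model is allowed to call.
def find_restaurants(city: str, cuisine: str) -> list:
    # Stand-in for a real API call to a restaurant-search service.
    return [f"{cuisine.title()} Place in {city}"]

# Registry mapping tool names the model may emit to local functions.
REGISTRY = {"find_restaurants": find_restaurants}

def dispatch(model_reply: str):
    """Parse the model's JSON tool call and invoke the matching function."""
    call = json.loads(model_reply)
    fn = REGISTRY[call["name"]]
    return fn(**call["arguments"])

# A made-up model reply in the JSON shape many tool-calling runtimes use.
reply = '{"name": "find_restaurants", "arguments": {"city": "Portland", "cuisine": "thai"}}'
print(dispatch(reply))  # → ['Thai Place in Portland']
```

The function's return value would then be fed back to the model, which renders it however the user asked, no dedicated website or fixed UI required.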

Universal Kernel / Middle-Ware: If local LLMs continue to see traction, we may see them progressively eat more and more of the functionality of a typical operating system. This has the potential to flatten or eliminate the distinction between chipsets.

Displacement of Operating Systems: Modern operating systems are mired in decades of technical debt that is often impossible to untangle or unwind. In the same way that Chromebooks made an "end-run" around heavier-duty OSes, an LLM-OS may offer dynamically generated UI/UX created on command via a chat interface.

It is still too early to determine how this plays out, but it seems the potential is truly disruptive and whoever embraces it and runs with it will have a major advantage.

Timor Kardum

Co-Founder and Chief Creative Technologist at MAGIG Design + Technologies. Building CASSI - the no-code AI content generation platform on cassi.magig.de. Expert in GenAI, Virtual Production, VFX, XR. Speaker, Consultant

1y

Word ??

Jeremy Wallace

Head of IoT Solutions Architecture, Connected Compute @ AWS

1y

This was a fantastic read: https://medium.com/@ronaldmannak/goodbye-windows-hello-llms-the-future-of-operating-systems-7ba61ea03e8d Interesting to see LLMs evolve into the ultimate operating system.
