LLMs as User Interfaces
There are many ways to think about Large Language Models, AI, and conversational agents like ChatGPT, Claude or Gemini. I am increasingly convinced that they are, in fact, a new kind of user interface rather than a new kind of computer.
You can think of each type of computer as a device that adds new capabilities to humans. For example, a cell phone is a new kind of computer because it lets us be connected at all times, with everything that results from that: the ability to capture content, the ability to be routed from place to place, and so on. Blockchains are also a different kind of computer because they let us collaborate without having to know or trust each other.
User interfaces, on the other hand, do not really increase our individual capabilities; they just make our communication with machines much more efficient. For example, the web browser is an interface that lets us interact with "remote" computers (aka servers). The desktop metaphor of the PC era greatly simplified our interaction with hard drives and the single-user applications running on them.
I would argue that Large Language Models do not really add new capabilities; they improve our ability to leverage the devices we already have. Whether it is on the phone, through voice, in a chat bot, or even more deeply integrated in applications like photo editors, the AI, by "understanding" (please note the quotes) our language, very significantly reduces the time it takes to transfer an idea to the computer. For example, if I had to draw the illustration for this blog post myself, it would take me hours. With a carefully crafted prompt, I can do it in seconds. Similarly, the ability these LLMs have to synthesize information, whether from a single source or from many different ones, is an amazing accelerator in the transfer of information from the computer to our brains.
It's easy to imagine a world in which every single device has its own AI (and this was apparently the theme at this year's CES), hopefully making our experiences with all the machines in our lives smoother and more intuitive.
In that vein, it is even possible to imagine that operating systems will become increasingly specific to their user, moving away from one-size-fits-all interfaces. Who needs shortcuts when the computer understands what you want to achieve next? You can already see this in the way LLMs switch languages based on the one you used in your question, without ever asking you which language they should use (or making you check a box in some settings!)