Conversational AI in the Enterprise - The Grinding of the Plates

GenAI is causing a seismic shift in our industry. The friction between the free-flowing conversations enabled by Large Language Models (LLMs) and the structured needs of enterprise apps is like the grinding of continental plates, and we're caught in the middle. Lessons learned for builders - part 2 of a 3-part series. Part 1 provided a GenAI primer and looked at the strategic impact on our industry. Part 3 will look at risk and trust. I welcome your feedback and comments.

Part 1 had five key GenAI messages for decision makers:

  1. GenAI is driving the cost and time required to create things to zero. Ask "how can I create better work products that matter to our business, at digital speed, while driving the cost of doing this to zero?" Start there.
  2. Virtually every knowledge worker task begins with natural language and user intent, so every app will be affected.
  3. Chat interfaces democratize AI and enable disruptors to test business hypotheses without writing a line of code.
  4. ChatGPT is so easy to use that it seems it would be similarly easy to build a real enterprise application using it. It's not. We need a whole new stack, and hundreds of technologies must be reinvented. As must skills.
  5. If you are managing an innovation portfolio, this is a target-rich environment. You cannot stand at the edge of the pool, waiting for your competitors to dive in first. The new entrants to your field have already jumped in the water.

Here are seven things we have learned from building.

Large Language Models are brilliant

We think of them as idiot-savants. The craziest idea can be implemented with a few sentences of natural language and perform brilliantly. But the simplest thing, for example the FOR loop I learned in the first week of COMP SCI 100, cannot be done reliably. An LLM is "algorithmically frail".

We don't replace what we have coded for 50 years. We selectively leverage LLMs for their unique capabilities, things that were inconceivable just a year ago: their linguistic supremacy, veneer of common sense and reasoning, infinite patience, and hyper-personalization.

give it agency

We give the LLM both considerable responsibility and agency to accomplish a goal, within our orchestration. Some app-building approaches use the LLM as a "stochastic parrot" – a better Natural Language Processing (NLP) model. Why lobotomize it? It was trained to do so much more. This traditional approach wastes the emergent skills represented in its parameter space. It is an incremental 10X improvement, not a 100X-1000X disruption.
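Here is a minimal sketch, not our production framework, of what "agency within orchestration" can look like: the model is handed the goal and a toolbox, while our code keeps the step budget and tool validation. The call_llm stub and the JSON protocol are illustrative assumptions.

```python
import json
from typing import Callable

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for your chat-completion client."""
    raise NotImplementedError("wire up your model provider here")

def run_agent(goal: str, tools: dict[str, Callable], max_steps: int = 5) -> str:
    """Give the model the goal and a toolbox, but keep orchestration
    (step limits, tool validation) in our own code."""
    transcript = [
        f"Goal: {goal}",
        f"Tools: {', '.join(tools)}",
        'Reply with JSON: {"tool": name, "args": {...}} or {"answer": text}.',
    ]
    for _ in range(max_steps):                    # the orchestrator retains control
        step = json.loads(call_llm("\n".join(transcript)))
        if "answer" in step:
            return step["answer"]                 # the model decided it is done
        if step.get("tool") not in tools:
            transcript.append("Observation: unknown tool, try again.")
            continue
        result = tools[step["tool"]](**step.get("args", {}))
        transcript.append(f"Observation: {result}")
    return "Step budget exhausted; hand off to a human."
```

The point is the division of labor: the LLM plans and converses; our code decides what it is allowed to execute and when to stop.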

but don’t sweat the deterministic

My first calculator was an HP-29C, and what excited me was that you could write programs on it. But it's in the calculator museum for a reason; every device I own comes with a calculator built in. Why would I try to do something similar with an LLM? The LLM is the star of the show, with the amazing talents. It does not need to play the bit parts too. You can try to make it do deterministic things, but it won't like it, and its performance will be lackluster.
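As an illustration (a hedged sketch, assuming a generic function-calling setup, not a prescription), deterministic work can be handed to ordinary code via a tool call, so the LLM never has to play the calculator. The tool registry, function, and numbers below are hypothetical.

```python
def compound_interest(principal: float, rate: float, years: int) -> float:
    """A deterministic 'bit part': ordinary code, exact every time."""
    return principal * (1 + rate) ** years

# Hypothetical tool registry the orchestrator exposes to the model.
TOOLS = {"compound_interest": compound_interest}

def handle_tool_call(name: str, args: dict) -> str:
    """Run the deterministic tool and hand the exact result back to the LLM,
    which then phrases the answer conversationally."""
    result = TOOLS[name](**args)
    return f"{result:.2f}"

# e.g. the model emits: {"tool": "compound_interest",
#                        "args": {"principal": 1000, "rate": 0.05, "years": 10}}
print(handle_tool_call("compound_interest",
                       {"principal": 1000, "rate": 0.05, "years": 10}))  # -> 1628.89
```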

learn in-context

We never rely on the general factuality of the embedded "knowledge" the LLM offers. Analogy: does the beach (the LLM's huge parameter space) contain those few grains of sand (particular facts) that we need? Should we back up a fine-tuning dump truck to add a few million more grains of sand (relevant documents) to the beach? Why am I looking on the beach anyway?

We do rely on the model using a broad veneer of knowledge and "common sense" to carry on a conversation. But we augment with curated knowledge, in context, whenever it is required: an In-Context Learning architecture.
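A toy sketch of the pattern: curated facts are retrieved at answer time and placed into the prompt, rather than trusted to live somewhere in the parameter space. The knowledge base and word-overlap retriever below are illustrative stand-ins; a real system would use a governed document store and a vector index.

```python
# Curated, governed facts (illustrative content only).
KNOWLEDGE_BASE = [
    "Premium support contracts include a 4-hour response SLA.",
    "Invoices are issued on the first business day of each month.",
    "Refunds for annual plans are prorated to the day of cancellation.",
]

def retrieve(question: str, k: int = 2) -> list[str]:
    """Toy retriever: rank curated snippets by word overlap with the question."""
    q_words = set(question.lower().split())
    scored = sorted(KNOWLEDGE_BASE,
                    key=lambda doc: len(q_words & set(doc.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(question: str) -> str:
    """Augment the conversation with facts at answer time, in context."""
    context = "\n".join(f"- {snippet}" for snippet in retrieve(question))
    return ("Answer using only the facts below; say so if they are insufficient.\n"
            f"Facts:\n{context}\n\nQuestion: {question}")

print(build_prompt("How are refunds handled for annual plans?"))
```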

set the data free

To resolve the friction between the free-flowing conversations enabled by LLMs and the structured, strongly-typed data that enterprises demand, we've bridged the gap within our framework using a proprietary object store that brings the two continents together.
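The object store itself is proprietary, so the sketch below only illustrates the general shape of the bridge: the LLM summarizes the conversation as JSON, and our code validates it into a strongly-typed record before it touches enterprise systems. The ServiceRequest schema and field names are hypothetical.

```python
import json
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class ServiceRequest:            # the "structured continent"
    customer_id: str
    product: str
    issue: str
    requested_by: date

def to_typed_record(llm_json: str) -> ServiceRequest:
    """Parse and validate the model's JSON before it reaches core systems."""
    raw = json.loads(llm_json)
    return ServiceRequest(
        customer_id=str(raw["customer_id"]),
        product=str(raw["product"]),
        issue=str(raw["issue"]),
        requested_by=date.fromisoformat(raw["requested_by"]),
    )

# The "conversational continent": the LLM is prompted to summarize the chat
# as JSON matching the schema above (prompt omitted here).
example = ('{"customer_id": "C-1042", "product": "Router X2", '
           '"issue": "No uplink after firmware update", "requested_by": "2024-03-01"}')
print(to_typed_record(example))
```

Anything that fails parsing or validation never reaches the system of record; the model is asked to try again or the case is escalated.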

And we've unlocked a new data source in conversations, and a tool to analyze it to "get in the mind of the individual", assembling a Cognitive Digital Twin. The power to surface data previously invisible in conversations will enable a new class of enterprise and societal apps and insights.

use your bench

An Olympic coach once described his players to me as "able to play many positions, but world-class in one". We have a full roster of players, each selected for its ability to efficiently demonstrate a specific LLM skill. The current franchise player, the Large Foundation Model (LFM), is like a medieval trebuchet. It takes an army of hundreds of engineers to haul it into battle and stockpile the huge boulders it flings at the city walls. Many targets can be taken more delicately with smaller models, just as cannons and gunpowder replaced the trebuchet.
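A simplified sketch of the "bench" idea: route each task to the smallest model that is world-class at the needed skill, escalating to the Large Foundation Model only when necessary. The model names and routing rule here are illustrative, not an actual roster.

```python
ROSTER = {
    "classify": "small-intent-model",      # cheap, fast, world-class at one thing
    "extract":  "mid-size-extractor",
    "reason":   "large-foundation-model",  # the trebuchet: powerful but expensive
}

def pick_model(task_type: str, input_tokens: int) -> str:
    """Simple routing policy: skill first, then escalate on size/complexity."""
    model = ROSTER.get(task_type, ROSTER["reason"])
    if input_tokens > 4000 and task_type != "reason":
        model = ROSTER["reason"]           # escalate long, messy inputs to the LFM
    return model

print(pick_model("classify", 300))     # -> small-intent-model
print(pick_model("extract", 12000))    # -> large-foundation-model
```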

put a human in the loop

The persistent explainability problem has been accentuated; we cannot explain how the new models function. Most use cases will require a human in the loop, with the AI agent augmenting human ingenuity, not replacing it. Thus we see the transition from autonomous, predictive-AI autopilots to collaborative GenAI copilots.
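One way to express the copilot pattern in code (an illustrative sketch, not a reference design): the model drafts, and a human approves anything high-impact or low-confidence before it executes. The Draft type, action names, and threshold are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Draft:
    action: str          # e.g. "send_email", "issue_refund"
    content: str
    confidence: float    # model- or heuristic-derived score

def requires_review(draft: Draft, threshold: float = 0.9) -> bool:
    """Low-confidence or high-impact actions always go to a person."""
    high_impact = draft.action in {"issue_refund", "update_contract"}
    return high_impact or draft.confidence < threshold

def copilot_step(draft: Draft, human_approve) -> str:
    """The AI drafts; a human stays in the loop for anything that matters."""
    if requires_review(draft):
        return "executed" if human_approve(draft) else "rejected by reviewer"
    return "executed"

# Usage: plug a real review UI in place of this stand-in callback.
print(copilot_step(Draft("send_email", "Thanks for your patience...", 0.95),
                   human_approve=lambda d: True))   # -> executed (auto path)
```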

wield the new hammer

We now have a tool that was inconceivable even 12 months ago, with emergent skills that its makers cannot yet explain.

You cannot analyze your way through this technological shift; the change is rapid and exponential.

You cannot stand at the edge of the pool, waiting for your competitors to dive in first. The new entrants to your field have already jumped in the water.

Use this new hammer to explore: start banging things in your business to see what rings, what resonates with your business leaders. Chat interfaces have democratized GenAI and made hypotheses very easy to test. Look for glimpses of what might be inconceivable today, or what almost works but is not yet reliable. Invest in becoming conversant with this new era of technology and get ahead of the coming architectural shift. It is an important investment, as the impact of this new era will be felt for the next 20 years.

Nice, Larry. Keeping our eyes open to the potential, reminding us of the complexity, encouraging us to build and explore and learn, along with some best practices to follow.

Adelle Rewerts, UXC/R/M

Principal Consultant | User Experience Architect | Product Strategy

11 months ago

The current moment feels very similar to when I started my career in 1999 (such a n00b compared to some!)... but there's a similar urgency, confusion, and global disruption. The difference is, I think, that we've evolved beyond simply absorbing the technology and understand the nuance required to solve the right problems, well, for and with humans. And I feel like that's the real question: how do we use this technology to amplify humanity?
