Insight of the Week: The 90:10 Rule in Generative AI

By Kerry Robinson

In a recent post on LinkedIn, Dan Miller of Opus Research predicts that when it comes to AI in customer service, 2024 will see us go from Deer in Headlights (DIH) to Do It Yourself (DIY).


We will, he claims, "learn to pick and choose to incorporate it into the critical path between companies and customers, agents and customers and companies and the mass of knowledge they've created".


Miller deftly calls out the three top-level use cases for generative AI:


  • Automation (companies and customers)
  • Assistance (agents and customers)
  • Knowledge (companies and the mass of knowledge they've created)


But it's the DIY bit I want to call out.


There's no doubt that generative AI, and its availability via simple-to-use, pay-as-you-go APIs, has democratized AI. Literally everyone, including my Mum, can and will use Generative AI to do cool stuff.


But there's a big gap between 'cool stuff' and demonstrable ROI.


Between the first 90% of a Generative AI program, which is easy, and the final 10%, which is extremely difficult and somewhat uncharted territory.


The easy bits are:


  • Prompting LLMs like ChatGPT to create, translate and reason.
  • Building GPTs or OpenAI Assistants that let chatbots leverage your APIs and pull knowledge from your documents.
  • Using APIs to extract intents, sentiment, and summaries from interactions (there's a quick sketch of this right after the list).
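
To make that last bullet concrete, here's a minimal sketch of the kind of call involved. It assumes the official openai Python SDK (v1.x) with an OPENAI_API_KEY in the environment; the model name, prompt, and transcript are purely illustrative.

```python
# Sketch: pull intent, sentiment, and a summary out of an interaction
# with a single LLM call. Assumes the openai Python SDK (v1.x) and an
# OPENAI_API_KEY environment variable; the model name is illustrative.
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

transcript = (
    "Customer: My broadband has been down since Tuesday and nobody called me back.\n"
    "Agent: I'm sorry about that. Let me check the line and book an engineer visit."
)

response = client.chat.completions.create(
    model="gpt-4",  # illustrative; use whatever model you have access to
    messages=[
        {
            "role": "system",
            "content": (
                "You analyse customer service interactions. "
                "Reply with JSON containing 'intent', 'sentiment', and 'summary'."
            ),
        },
        {"role": "user", "content": transcript},
    ],
)

# A production system would enforce and validate the JSON (JSON mode,
# function calling, or a retry); the sketch just parses it.
analysis = json.loads(response.choices[0].message.content)
print(analysis["intent"], analysis["sentiment"], analysis["summary"])
```

That really is all it takes to get a demo working, which is exactly the point: the first 90% is this easy.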


That stuff is easy, and FUN! If you haven't played with these things already, stop reading now, go to chat.openai.com, and get started with ChatGPT. Once you've had a little fun there, upgrade to a Plus account, which will give you the extra power of GPT-4 and let you build your own GPTs with a simple interface. If you don't feel you're getting the benefit of the $20-per-month fee, it means you're not using it enough. Remember, this stuff is electricity for knowledge work... you wouldn't turn up to work without a laptop and an internet connection, let alone without access to electricity!


Anyway, that's the easy bit. Not to diminish it. You really have to touch and feel and get to know the capabilities of Generative AI.


But deployment. That's something else. That's the hard bit. Here's a hit-list of 10 of the 101 things you'll need to be thinking about:


  • Which API are you going to use in production? Can you get sufficient usage allowance from your provider?
  • How will you optimize for speed, cost and performance? How will you handle the variable cost of each API call?
  • How will you do quality assurance? Remember, this AI isn't scripted, so unit testing is… interesting!
  • How will you evaluate performance? Again, it's not scripted, so performance evaluation is… different!
  • How will you regression test? Making one thing better often makes something else worse! And remember, QA and eval are… interesting & different! (There's a sketch of one approach after this list.)
  • What happens when the model provider updates the model version? It won't produce the same outputs. You'll need to regression test again!
  • How will you conduct adversarial testing? You need to test for jailbreaking and off-topic discussions!
  • How will you make your security and compliance teams happy? Will your regulator even allow this?
  • What if the API goes down? What's your fallback plan? (There's a graceful-degradation sketch after this list, too.)
  • Who’s going to support this? Does your support team have the depth of knowledge required? Or are you OK getting woken up at 3am?!?
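
To give you a flavour of what QA, eval, and regression testing look like when the output isn't scripted, here's a minimal sketch of a golden-set regression check: each case lists facts the reply must contain and phrases it must never contain, instead of an exact expected string. The golden cases and the stubbed answer() function are hypothetical; it's a pattern, not a framework.

```python
# Sketch: regression testing a non-deterministic LLM feature.
# Exact-match unit tests don't work, so each golden case lists facts the
# reply MUST contain and phrases it MUST NOT contain. All data here is
# hypothetical.
GOLDEN_CASES = [
    {
        "question": "What is your refund window?",
        "must_contain": ["30 days"],
        "must_not_contain": ["60 days", "no refunds"],
    },
    {
        "question": "Do you ship to Ireland?",
        "must_contain": ["Ireland"],
        "must_not_contain": ["we do not ship"],
    },
]

def answer(question: str) -> str:
    # Replace this stub with a real call to your deployed assistant.
    return "Our refund window is 30 days, and yes, we ship to Ireland."

def run_regression() -> None:
    failures = []
    for case in GOLDEN_CASES:
        reply = answer(case["question"]).lower()
        missing = [s for s in case["must_contain"] if s.lower() not in reply]
        forbidden = [s for s in case["must_not_contain"] if s.lower() in reply]
        if missing or forbidden:
            failures.append((case["question"], missing, forbidden))
    if failures:
        raise AssertionError(f"Regression failures: {failures}")

if __name__ == "__main__":
    run_regression()
    print("All golden cases passed.")
```

Run the same suite after every prompt tweak and every provider-side model update, and pin model versions wherever the API lets you; an unpinned model is a dependency that can change under your feet.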

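And on the "what if the API goes down?" question: here's a minimal sketch of degrading gracefully when the LLM API is slow or unavailable, again assuming the openai Python SDK (v1.x). The retry policy, timeout, and canned deflection message are placeholders you'd tune for your own channel.

```python
# Sketch: degrade gracefully when the LLM API is slow or down.
# Assumes the openai Python SDK (v1.x); the retry count, timeout, and
# canned deflection message are placeholders, not recommendations.
import time

import openai
from openai import OpenAI

client = OpenAI()

FALLBACK_MESSAGE = (
    "Sorry, I can't help with that right now. "
    "Let me connect you with a member of our team."
)

def safe_reply(user_message: str, retries: int = 2) -> str:
    for attempt in range(retries + 1):
        try:
            response = client.chat.completions.create(
                model="gpt-4",  # illustrative
                messages=[{"role": "user", "content": user_message}],
                timeout=10,  # don't leave the customer hanging
            )
            return response.choices[0].message.content
        except openai.OpenAIError:
            time.sleep(2 ** attempt)  # simple exponential backoff
    return FALLBACK_MESSAGE  # degrade to a human hand-off, not an error page
```

The code isn't the hard part; the hard part is deciding, before the API has its first bad day, what "degrade gracefully" should mean for your customers and your agents.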

There's another 91 of these to consider, and another 1001 after that. Generative AI has to be one of the most accessible technology innovations, and the easiest with which to build prototypes and demos. But getting from prototype to production, that's as hard as it ever was. In fact, it's probably a lot harder than it ever was. Don't let that put you off. Over the next decade or two, this stuff is going to eat the world, and your industry, and your business, just like software ate the world over the last decade.


So get prototyping. Enjoy the exhilaration of the first 90%. But don't underestimate that last 10%.


Kerry Robinson is an Oxford physicist with a Master's in Artificial Intelligence. Kerry is a technologist, scientist, and lover of data with over 20 years of experience in conversational AI. He combines business, customer experience, and technical expertise to deliver IVR, voice, and chatbot strategy and keep Waterfield Tech buzzing.

Subscribe to Kerry's Weekly AI Insights
