Some perspective on 'natural' conversational AI

So here I am. Back after a two-week detox. A proper detox, where I've hardly lifted my phone and haven't opened LinkedIn, Instagram, YouTube, Gmail or any other app you might consider 'important'.

What I love most about taking a break is the renewed perspective you can develop when you have the chance to come up for air.

On returning, I saw this post from Irakli Beselidze:

Irakli asks "What's missing in conversational AI that'll enable us to have truly natural conversations?"

My first reaction was... LOTS.

This is a roundabout way of getting to the point, but bear with me:

We (the collective 'we' i.e. humanity) don't know what creates consciousness. We don't really know what it is or where it comes from.

With consciousness comes thoughts. And where do they come from? How do they enter (or emerge within) the brain? We don't know.

And so, what is language? Language is simply a way for us to communicate the thoughts we're having, isn't it? It's what we use to process conscious thoughts. Can you think a conscious thought without using language, without talking to yourself in your head? Not according to Steven Pinker.

Spoken language, then, is simply verbalising the stuff of thought.

So, for us to develop a truly 'natural' conversational AI engine, we need to develop something that can think. And if thinking is a byproduct of consciousness, we need to develop that, too.

The human brain has about 85 billion neurons. No one really knows how they all interact with each other. Not really. A 230,000-cell virtual simulation of a mouse's brain is the best we can do (and that's really good). But we're miles away from understanding the 85 billion that make the human brain tick.

And it's these 85 billion neurons that, when working together in whatever way they do, create our consciousness, our thoughts and, ultimately, our language.

But it's more than language. Language is the output. Just as these words you're reading are the output of HTML, CSS, JavaScript, Python and a bunch of chips and silicon all working in harmony to display this screen on your device, so too our language is the output of those 85 billion neurons working in harmony to produce thoughts that you verbalise.

The two things we humans have above all other lifeforms that we know about are language and the ability to think about the future. Add to that immense memory and you have the beginnings of why we're such advanced communicators (relatively speaking): we can instinctively think about the past, imagine the future, and articulate it all with great precision, together.

Most of the 'AI' that most of us use to build our assistants is basically matching a spoken word to a string of text, then matching that string of text to a bucket full of similar-enough strings of text that we've assigned to an 'intent'. The 'dialogue management' is left to the conversation designer in most cases. And in most cases, the conversation designer is creating a few 'branches' of options that exist at every turn.
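For anyone who hasn't peeked under the hood, here's a rough sketch of that pattern in Python. To be clear, this isn't any particular vendor's engine: the intent names, example phrases and branch table are all made up for illustration, and the 'matching' is just naive word overlap standing in for whatever classifier a real platform uses.

```python
# A minimal sketch of "match the utterance to an intent bucket, then follow a
# pre-drawn branch". All names and data below are invented for illustration.

# Intent "buckets": bags of example phrases the designer treats as equivalent.
INTENTS = {
    "check_balance": ["what's my balance", "how much money do I have", "show my balance"],
    "transfer_money": ["send money", "transfer funds", "move money to savings"],
}

# Hand-authored dialogue branches: a canned reply plus the options offered next turn.
BRANCHES = {
    "check_balance": ("Your balance is £1,250.", ["transfer_money"]),
    "transfer_money": ("How much would you like to transfer?", []),
}

def token_overlap(a: str, b: str) -> float:
    """Jaccard-style word overlap between two strings (a crude stand-in for an NLU model)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / max(len(wa | wb), 1)

def classify(utterance: str, threshold: float = 0.3):
    """Return the intent whose example phrases best overlap the utterance, or None."""
    best_intent, best_score = None, 0.0
    for intent, examples in INTENTS.items():
        score = max(token_overlap(utterance, ex) for ex in examples)
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent if best_score >= threshold else None

if __name__ == "__main__":
    utterance = "how much money do I have in my account"
    intent = classify(utterance)
    if intent:
        reply, next_options = BRANCHES[intent]
        print(f"{intent}: {reply} (next branches: {next_options})")
    else:
        print("Sorry, I didn't get that.")  # the inevitable fallback branch
```

Swap the word overlap for embeddings or a trained classifier and you have the core of most intent-based assistants: match the utterance to a bucket someone defined, then follow a branch someone drew in advance.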

Is that how those 85 billion neurons work? Is that how conversations go?

Until we can fully understand how we humans think, we'll be in the land of rule-based, pre-loaded, request-response AI. Which, I might add, works pretty well and is good enough for most of what you want to do with it.

The fact that we have technology that can listen to someone speak, translate it into text, then work out the broad meaning of that text so you know what to do next is absolutely f*cking amazing. And while I'm excited (and scared) to see where it all goes, I don't think 'natural' conversations are needed for most use cases that matter today.

You have all you need. Go build.

---------------------------------------------------------------------------------------------------------------

About Kane Simms

Kane Simms is the front door to the world of AI-powered customer experience, helping business leaders and teams understand why voice, conversational AI and NLP technologies are revolutionising customer experience and business transformation.

He's a Harvard Business Review-published thought-leader, a top 'voice AI influencer' (Voicebot and SoundHound), who helps executives formulate the future of customer experience strategies, and guides teams in designing, building and implementing revolutionary products and services built on emerging AI and NLP technologies.

Find your level of conversational AI maturity

Take my free CAI maturity assessment

Consciousness is the understanding of one's own and others' goals. Only apes have such abilities. The rest of the animals act instinctively; their only goal is to survive. And not by accident, their communication abilities are also limited to a dozen genetically fixed signals like fear, joy, prey, danger, etc. Language has made humanity the dominant species on the planet because it allows us to plan, coordinate and achieve goals that other species cannot even imagine. So there are two main functions of language: getting things done by coordinating the efforts of many, and the social function of connecting individuals more profoundly, to maintain large groups of people over a long time and develop culture. In the context of conversational AI, we are considering language as an interface to help individuals get things done. The main challenge is that machines are still worse than dogs at absorbing and interpreting unstructured data. And with the current command (intent)-based natural language processors, we are limited to 85% accuracy, because the machine will always try to find a command in the utterance, which is not always the case.

Jeff Kinsey, Jonah

Strategic Business Services. EV Maven. TURO Maven. BuySellTrade4EVs.com. 10,000+ Hours EVSE & EVs. Entrepreneur, Author & Educator. Publisher: Print, eBooks, Mags & Apps. USMC Veteran. #IDme

2 years ago

Welcome back. In 400 years, Commander Data will be a "one-off", discounting his brother. So, maybe in 200 years we can make some real progress.
