Maybe AI doesn’t need to feel your pain to help resolve it

As generative AI improves, it raises inevitable questions about what’s going on behind the screen. Do models like ChatGPT understand what they’re saying? Are they capable of sentience?

These questions came up again following the publication of a study that showed ChatGPT’s responses to health questions posted on a subreddit were rated as empathetic 45% of the time, compared with just 5% for responses from real doctors.

This single study could be misleading for all kinds of reasons. And even if a chatbot sounds empathetic, does that mean it really is?

“A chatbot is really just based on a large language model — and large language models don’t actually care about you,” says Simon Corston-Oliver, Director of Machine Learning at Dialpad. “But it can certainly sound like it does.”

Large language models (LLMs) work by identifying patterns in how words and phrases relate to each other, and then making predictions about which words should come next.
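To make that concrete, here is a deliberately tiny sketch of next-word prediction. Real LLMs use neural networks trained on vast token corpora, not word counts; this toy counts which word follows which in a sample sentence and predicts the most frequent follower.

```python
from collections import Counter, defaultdict

# Toy corpus, split into words. (Illustrative only; real models
# operate on subword tokens and learn weights, not raw counts.)
corpus = "i understand your frustration . i understand the problem .".split()

# Count which word follows each word.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word: str) -> str:
    """Return the word most often seen after `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(predict_next("i"))  # -> "understand" (it followed "i" twice)
```

The point of the sketch: the system picks plausible continuations from patterns in data. Nothing in it models feelings, which is why "sounding empathetic" and "being empathetic" come apart.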

So the question of whether or not such a system really “has empathy” is a pretty open-and-shut case. (On a related note, a Google engineer was fired from the company last year after claiming that an AI he had been working on was sentient.)

Maybe a more important question is: what are the implications of AI being able to demonstrate empathy? After all, you could argue that the ability to appear empathetic is what matters in human interactions — not what someone is actually feeling.

Take customer interactions. Is the agent you’re on the phone with really feeling your pain after hearing that your order hasn’t arrived or doesn’t work? Who knows. But if they tell you that they understand your frustration — and their actions reflect that — there’s a much better chance the call will go well.

Even people in highly empathetic roles, such as counselors, are taught techniques that help them signal empathy, such as paraphrasing what the client has said back to them.

What’s interesting is that as humans interact more with chatbots in their everyday lives, it’s becoming clear they want a human-like touch and manners, even if they know they’re talking to a machine.

“People interact with computers in similar ways to how they interact with people,” says Simon. “A lot of people like to say ‘please’ and ‘thank you’ to [Amazon virtual assistant] Alexa. They don’t like to interrupt her if she’s going on too long. Many of the same politeness strategies we use with humans we use with computers, and we expect the same thing back.”

And despite lacking genuine empathy themselves, AI chatbots can help humans communicate more empathetically.

Researchers at the University of Washington developed an AI system that suggested changes to messages on a peer-to-peer mental health support platform to increase empathy. They found that 70% of users felt more confident in providing support after using the tool.

Similarly, in interactions with clients, an AI chatbot can suggest responses to a human agent, who chooses which one to send, with or without modifications. Human intervention is especially important in settings such as healthcare and insurance, where the consequences of a wrong answer can be serious.

“There’s a need for a lot of caution because we all know that chatbots can just say things that simply aren’t true,” says Simon, referring to the problem of generative AI tools making up “facts” in their responses to questions.

That’s why — for now, at least — it’s still essential to have humans somewhere in the loop, he says. Any feeling of empathy risks being quickly lost if a chatbot starts giving out incorrect or irrelevant information. “Part of empathy is knowing which customer problems an AI can’t deal with and making sure those get routed to a human.”
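One common way to implement that routing is a simple escalation rule: hand off to a human when the topic is sensitive or the model's confidence is low. A minimal sketch, assuming a hypothetical topic label and confidence score supplied upstream (the topic list and threshold here are illustrative, not from the article):

```python
# Illustrative values; a real deployment would tune these.
SENSITIVE_TOPICS = {"medical", "insurance claim", "legal"}
CONFIDENCE_THRESHOLD = 0.8

def route(query_topic: str, model_confidence: float) -> str:
    """Decide whether the AI answers or the query goes to a human."""
    if query_topic in SENSITIVE_TOPICS:
        return "human"  # high-stakes domains always get a person
    if model_confidence < CONFIDENCE_THRESHOLD:
        return "human"  # uncertain answers are escalated, not guessed
    return "ai"
```

A rule this simple already captures Simon's point: part of appearing empathetic is declining to answer when a wrong answer would do harm.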

For many businesses, it can be tempting to over-rely on AI in conversations with customers and remove humans from the equation altogether. But it’s important to first have a thorough understanding of both AI’s power and limits when it comes to empathy.

That said, AI is improving rapidly in its ability to mimic empathy and other aspects of human communication, making it an increasingly valuable tool to deploy on the front lines of customer experience.

