AI robots with empathy?
Anna A. Tavis, PhD
Department Chair, NYU SPS | Clinical Professor, Human Capital Management
A look at artificial intelligence built with the blueprint of human emotions (narrated by Rachel Joshi of People Matter Magazine)
“Can artificial intelligence have empathy?”
A few brave souls raised their hands when New York University professor Dr. Anna Tavis asked her audience.
The more sophisticated AI tools today don’t just simulate how humans think – they can also decode how humans feel.
Think of facial recognition software that can interpret a person’s sentiments based on their micro-expressions. Or chatbots that can infer a person’s mood while they are typing – based primarily on their keyboard strokes. These advancements signal the beginning of what Dr. Tavis calls the age of emotion AI.
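For readers who want a concrete picture of the text side of this idea, the sketch below shows what a minimal mood check on incoming chat messages might look like. It assumes the open-source Hugging Face transformers library and a general-purpose sentiment model; it is an illustration of the concept, not the specific software Dr. Tavis describes.

```python
# A minimal sketch of text-based mood inference (illustrative only).
# Assumes the Hugging Face `transformers` library is installed.
from transformers import pipeline

# Load a general-purpose sentiment classifier; an emotion-specific model
# could be substituted for finer-grained labels.
classifier = pipeline("sentiment-analysis")

messages = [
    "I've rewritten this report three times and it still isn't right.",
    "Thanks so much, that suggestion saved me hours!",
]

for text in messages:
    result = classifier(text)[0]  # e.g. {'label': 'NEGATIVE', 'score': 0.99}
    print(f"{result['label']:>8}  ({result['score']:.2f})  {text}")
```

A production emotion-AI system would draw on far richer signals – tone, timing, facial cues – but the basic loop is the same: observe a behavioural trace, estimate an emotional state, and adapt the response.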
More than a co-pilot, AI is a confidant
No longer just a co-pilot in everyday work, this type of intelligence is designed to read the language of human emotions – observing expressions and patterns of behaviour and responding to them appropriately.
AI built with the blueprint of human emotions can serve as a task assistant, companion and even confidant for some. It's taking the leap from artificial intelligence to artificial empathy.
As Dr. Tavis, a clinical professor and chair of the NYU SPS Human Capital Management department, puts it: emotion AI delves into what makes us fundamentally human and “different from those machines” – that is, our jumble of emotions, desires, hopes, fears, insecurities and frustrations – broken down and translated into programming language.
Of course, most headlines about AI tend to pit humans against machines. Yet the more optimistic stories often depict human capabilities – such as the capacity to feel, ruminate and hope – as far more sublime than, and superior to, the computing power of AI.
However, AI development is also reaching the point where human emotions can be simulated.
How robots are programmed to show empathy
“When we say empathy, we think, ‘I have to be walking in your shoes, understanding how you feel’. That’s one level of it: cognitive empathy,” said Dr. Tavis, who recently published her latest book, The Digital Coaching Revolution.
“But there’s another level of empathy where – based on how I see you behave – I can actually intellectually figure out how you feel. And that’s where the machines are going,” she said.
AI developers are “targeting the recognition of the signs of emotions and programming them into these tools. [Robots] don’t feel, but they act like they feel. They don’t understand what you’re going through. However, they can respond in a way that would look like they do,” Dr. Tavis said.
In fact, one branch of computer science – called affective computing – specialises in the development of machines, especially AI systems, that can “look at this language, recognise, understand, and simulate human emotion”.
The point of affective computing is to advance the well-being of humans by mapping out their emotions more precisely.
After all, “how can a machine effectively communicate information if it doesn’t know your emotional state?” Dr. Tavis asked. “If it doesn’t know how you feel, and if it doesn’t know how you’re going to respond to specific content? That is the question front and centre for the people who are designing this new technology.”
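To make the idea of “mapping out” emotions more concrete, one common representation in affective computing places emotional states on a valence–arousal plane (how pleasant a feeling is, and how activated). The sketch below is a simplified, hypothetical illustration of that mapping; the labels and coordinates are made up for the example and are not drawn from the article.

```python
# A simplified, hypothetical illustration of mapping emotions onto a
# valence-arousal plane (valence = unpleasant..pleasant, arousal = calm..activated).
# Coordinates are illustrative only.
from math import dist

EMOTION_MAP = {
    "content":    ( 0.6, -0.3),
    "excited":    ( 0.8,  0.7),
    "frustrated": (-0.6,  0.5),
    "bored":      (-0.3, -0.6),
}

def nearest_emotion(valence: float, arousal: float) -> str:
    """Return the labelled emotion closest to an estimated (valence, arousal) point."""
    return min(EMOTION_MAP, key=lambda label: dist(EMOTION_MAP[label], (valence, arousal)))

# e.g. a point estimated upstream from facial expression or typing cadence
print(nearest_emotion(-0.5, 0.4))  # -> "frustrated"
```

The interesting design work, of course, lies upstream: estimating that point from noisy human signals, and deciding how a system should respond once it thinks it knows how you feel.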
AI tools today are no longer just about making us more efficient or productive. “That is the price of admission,” Dr. Tavis said. Nowadays, “they’re targeting something very sacred to who we are as humans.”
"Machines and programs are not sentient! They may appear to be sentient, but that’s not the same." Torben Riise
Creator of Coaching 5.0 | Industry 5.0 Training | AI Enhanced Team Building & Employee Flourishing | Clarifying Policy on AI, Ethics, Diversity & Regulation | TEDx speaker on Mental Health AI/VR Visualisation+Guidance.
9 个月Will machines ever provide genuine, authentic, ‘felt sense’ empathy? A major perspective I’ve emphasised about the machines supposedly engaging with emotion, is they are simulating not emulating, inferring not detecting, making maps not meaning, calculating correlations rather than establishing causation. They are limited by the stigma and bias that developers have around exploring their own mental health. Until human beings compensate for stigma around mental health, how on Earth can they have the fidelity of compassion necessary to embody empathy with sufficient sensitivity and depth? If developers are afraid of dealing with their own demons how can they have the necessary integrity and lived experience to encode and embody empathic features into algorithms, models and the datasets needed to help other human beings address their shortcomings of agency, courage and self-esteem? My upcoming talk, posted on yesterday, will be the first exploration on the mysterious link between Social Stigma on Mental Health and how it might be limiting innovation to expand AIs capacities and capabilities. I’ll look at, not just how coaching is being reshaped by AI, but how Coaching may revolutionise AI if it addresses that Social Stigma.