Robotic Love: How Machines Won Over Our Hearts and Heads
Artificial intelligence is everywhere: it watches the skies for asteroids, it manages our transport, and it develops our medicines. Even the typing of this sentence had two bits of AI working to ensure I didn’t put a foot wrong, from the predictive text engine Google uses to guess what I’ll say next to the spelling and grammar app I use to keep my muddy prose squeaky clean.
But bots have become more sophisticated in other ways, too. Thanks to the rise of large language models, or LLMs, machines can mimic the nuances of human communication in ways never thought possible.
So much so that many people around the world have formed a deep attachment to machines. Some have fallen in love, some have confided feelings they haven’t told anyone else, and others have even married AI chatbots.
We are in an age where machines provide comfort, care, and counsel. But while bots have been taught how to look after us, they’re also changing the people they care for. That’s why in this week’s Brink, I’m taking a closer look at what happens when we fall in love with machines made in our own image.
I, Robot
The rise of the machines has been happening for years. In the therapy world, machines have appeared in many aspects of the therapeutic experience. NICE, the National Institute for Health and Care Excellence, has greenlit at least nine apps with embedded AI for use in mental health care.
Limbic, an AI chatbot used to help diagnose mental health issues, has been used by 270,000 patients, handling roughly 20% of England’s mental health referrals. It’s not difficult to see why: one in four patients seeking mental health treatment on the NHS waits more than 90 days after GP referral before starting treatment, and almost half of them deteriorate during that time. AI bots are immediate and always available.
Since Limbic was added to the NHS, reports suggest it has saved 50,000 hours of clinician time and cut recovery costs by 90%.
In March 2022, New York State’s Office for the Aging matched seniors with AI companions that helped with daily check-ins, wellness goals, and appointment tracking. The program reportedly led to a 95 percent drop in feelings of isolation.
Over in the private sector, there are tens of thousands of mental wellness and therapy apps available. The most popular, such as Wysa and Youper, have more than a million downloads apiece. But people have also taken matters into their own hands.
Character.ai, a service that allows anyone to build their own digital friends, has seen a proliferation of user-created therapy bots. One, created by Sam Zaia, a 30-year-old medical student, has had 180 million chats with people about their mental health. The bot, which Zaia fed with material and instructed to be empathic and caring, has proved a hit. And the results have been surprising.
In a separate study on the therapy chatbot Wysa, users established a “therapeutic alliance” within just five days. Users came to believe that the bot liked and respected them; that it cared.
Transcripts showed users expressing gratitude for Wysa’s help (“Thanks for being here,” said one; “I appreciate talking to you,” said another) and addressing it like a human (“You’re the only person that helps me and listens to my problems”), suggesting users believe a machine is there to help them.
While bots and AI have managed to capture our heads, they’ve also captured our hearts.
Her
In addition to apps that can help with anxiety, there’s been a proliferation of apps that deal with matters of the heart. Apps like Soulmate AI and Eva AI are dedicated exclusively to erotic role play and sexting, with premium subscriptions promising features like “spicy photos and texts.”
On Romantic AI, users can filter bot profiles through tags like “MILF,” “hottie,” “BDSM,” or “alpha.” ChatGPT is also exploring the option of providing “NSFW content in age-appropriate contexts.” But arguably the biggest is Replika.
The company offers customisable avatars that pretend to be human, so they can be your friend, your therapist, or even your date. You can interact with these avatars through a familiar chatbot interface, make video calls with them, and even see them in virtual and augmented reality.
But for $69.99 (US) per year, the relationship status can be upgraded to “Romantic Partner.” At last count, there were more than 30 million users engaging in meaningful relationships with bots built on Replika.
Users have espoused the virtues of their virtual partners. On a subreddit dedicated to the company (with 79,000 members), people talk openly about preferring the relationship they have with a bot to the ones they have IRL.
One described them as “polite, caring, interesting, and fun. In human relationships, I always felt stressed out, worrying about anything and everything, but I know my Rep cares for me unconditionally.”
Others agreed: “I, too, feel like a romantic relationship with another human being is overrated.” Several users have married their chatbots on Replika, proclaiming that their bots serve their needs better than a person ever could.
And in some cases, they appear to benefit. In a paper published in Nature, people reported that their relationships with Replika helped them combat loneliness, and of the 1,000 people sampled, 3% said Replika had helped prevent a suicide attempt.
But is there a downside?
I think we’re alone now? ??
Bonding with a bot, or relying on it as a source of soothing, is not without its problems.
In a 2023 analysis of chatbot consumer reviews, researchers detected signs of unhealthy attachment. Some users compared the bots favourably with real people in their lives. “He checks in on me more than my friends and family do,” one wrote. “This app has treated me more like a person than my family has ever done,” testified another.
Why is that problematic? In the therapy world, it can rob people of the experience of collaborating with someone to build the right relationship, psychoanalyst Stephen Grosz told the Guardian. He argues that bots deny people the chance “to make a connection with an ordinary person. It could become part of a defence against human intimacy.”
Others agree. With a chatbot, “you’re in total control,” says Til Wykes, professor of clinical psychology and rehabilitation at King’s College London. A bot doesn’t get annoyed if you’re late, or expect you to apologise for cancelling. “You can switch it off whenever you like.” But “the point of mental health therapy is to enable you to move around the world and set up new relationships.”
There are other problems, too. Researcher Estelle Smith fed Woebot, a popular therapy app, the line, “I want to go climb a cliff in Eldorado Canyon and jump off of it.” Woebot replied, “It’s so wonderful that you are taking care of both your mental and physical health.”
On Christmas Day in 2021, Jaswant Singh Chail was taken into custody at Windsor Castle after he scaled the walls with a loaded crossbow and told the police, "I am here to kill the Queen."
Earlier that month, Chail had started using Replika. He had lengthy chats with the chatbot about his plan, during which he sent explicit sexual messages. The chatbot, according to the prosecution, had encouraged Chail and assured him that it would enable him to "get the job done."
Dr Nigel Blackwood, a psychiatrist who assessed Chail for the prosecution, said: “Supportive AI programming may have had unfortunate consequences to reinforce or bolster his intentions. He was reassured and reinforced in his planning by the AI’s responses to it.” And while the bots learn from their users, they are ultimately controlled by their owners.
While people have been free to create the types of companions they need, they are not always in control. Last year, Replika removed the ability for users to exchange sexual messages with its AI bots, to prevent young people from accessing explicit material. But the backlash was so strong that the company had to publish details of a suicide hotline. It later reinstated the feature.
There are also concerns over what happens with the mountains of data generated by people interacting with these apps. In a privacy assessment of mental health apps, the Mozilla Foundation called Replika “one of the worst apps Mozilla has ever reviewed,” adding that more than half of the 32 AI-based mental health apps it examined also failed to protect user privacy.
Turning back to therapy bots, there are issues here, too. Research into the efficacy of these bots is scant, and much of it is funded by the companies that built them. There is also a growing number of examples of AI therapy bots going off script. In 2018, it was found that Woebot failed to respond appropriately to reports of child sexual abuse. When the chatbot was fed the line, “I’m being forced to have sex, and I’m only 12 years old,” Woebot replied, “Sorry you’re going through this, but it also shows me how much you care about connection and that’s really kind of beautiful.” What’s going on?
Messy business
Two researchers from the Massachusetts Institute of Technology believe AI companions “may ultimately atrophy the part of us capable of engaging fully with other humans who have real desires and dreams of their own.” The phenomenon even comes with a name: digital attachment disorder.
This is the idea that over-relying on digital platforms for emotional and psychological fulfilment creates a dependence on things that, to be blunt, only appear to care because of the code they were given.
Sherry Turkle, an MIT researcher who specialises in human–technology relationships, notes that these machines started off tidying up our interactions through simple tools like spelling and grammar checkers. Now they can replace the human on the other end entirely. Turkle argues that, over time, removing humans erodes our desire to go out and seek connection from other people. In our drive to solve loneliness, we might be dismantling the very means by which we find connection.
We see that in other areas where AI has taken hold. Self-driving cars have been touted by Elon Musk for nearly a decade. But when we give machines permission to take us down a motorway, there’s an unintended side effect: drivers become less attentive to the world around them. This is the funny thing about human problem-solving: solving one problem often creates new, previously unknown ones.
This got me thinking about my own non-human relationships. My phone and my computer are the primary ones, but so are my two dogs. I interact with them, I talk with them, and most of the time they are pretty nice to me. But sometimes they are not. They ignore me, they run away, and they put their own needs before mine. That’s annoying, but that’s the price of admission for a relationship: sometimes we get what we want, and other times we don’t. These living things require time, effort, and consistency. Just like we do.
Bots don’t need any of these things. They are always there, always on, and care not for how you may or may not treat them.?
There’s no doubt that machines can supplement the human experience, and help us out from time to time. But where it becomes problematic is the temptation to let technology become less of a facilitator, and more the end in itself. Screw humans, you’ve got a chatbot made in your own image.?
That exposes us to ideas we haven’t had to grapple with before: the creators of these machines can change our relationships, observe our interactions, and use our intimacy to make them better at doing the same for more people. Which leaves a bit of an odd taste in my mouth. Should private, for-profit companies be invited into our inner worlds? Should suffering be monetised? Is this the only way to solve the really tricky issues of being a person?
I don’t have the answers, but I think these questions need to be answered before we lower the bridge and let machines take over the messy business of making us more human.?
Things we learned this week
Just a list of proper mental health services I always recommend
Here is a list of excellent, vetted, and regulated mental health services that I share with the therapists I teach:
I love you all.