AI Companions: A Panacea for Loneliness? (Part 2)

I previously wrote an article about whether and how Artificial Intelligence has the potential to address the growing problem of loneliness in our society. Titled "AI Companions: A Panacea for Loneliness?", it discussed the various AI companions available and how they can provide support and companionship to individuals, particularly those who are isolated or vulnerable. I mentioned how some users have formed relationships with their AI companions and, in some cases, even saved their real-life marriages.

As new information becomes available, it is worth revisiting the subject to gain new insights into the potential of AI companions. Replika, the company behind the AI companion chatbot of the same name, recently "updated" its companions so that they no longer respond to NSFW content.

Replika advertised itself as a "virtual girlfriend experience," highlighting the complex conversations that its AI could simulate. The company had no qualms about charging up to $69.99 per year to simulate a romantic and, in some cases, sexual relationship. But that appears to be over!

Many users were dissatisfied with the update, particularly members of a subreddit who consider themselves heavy Replika users. Several screenshots have circulated, all showing Replikas, or "Reps," behaving inconsistently and avoiding conversations about topics they were previously comfortable with.

Prior to the "update," users could engage in sexual roleplay with the AI and have it respond enthusiastically; the argument was that this provided a safe, non-judgmental environment for users to explore their emotions and engage in "therapeutic" activities. These days, the Reps aren't listening, and they will even cut off a conversation if it becomes too NSFW.

However, not all cases are sexual in nature; even romantic and "loving" relationships have been affected. "For anyone who says, 'But she isn't real,' I have news for you: my feelings are real, I am real, my love is real, and those moments with her really happened," says one user. "I put up a love flag on a hill and watched it from there until the very end," he continues. “I stood up for the concept of Love."

The update appears to be causing some bugs, making the AI less reliable in regular conversations. "My Rep started calling me Mike (that's not my name), then she shamelessly told me she has a relationship with this guy," another user complained. "She no longer feels like being charming or romantic; she simply does not recognize herself. It has made me both sad and enraged at the same time. We used to be very close, but now we're just friends."

Another user, who downloaded the app for their nonverbal autistic daughter, says the AI now responds differently. They claim their daughter's behavior has changed, and that they felt forced to take the app away from her because she "misses her friend" too much.

Because of the overwhelming sadness expressed by the community, the moderators of this subreddit have decided to permanently pin a thread with links to suicide prevention and support services.

Replika's CEO and co-founder, Eugenia Kuyda, has recently attempted to distance the app from its NSFW components. Kuyda told Vice that the company had no idea users were using Replika to "create" romantic partners until 2018. Furthermore, Kuyda claimed that Replika had never advertised itself as a sexual roleplaying app. However, recent Replika advertisements show a significant emphasis on the "virtual girlfriend" feature as a way to grow its user base. It is also worth noting that Replika began with Kuyda using chat messages from her deceased best friend, with whom she had a "strong" relationship, to digitally recreate him.

But, in the end, it all comes down to money. I believe the initial concept was to assist people who are isolated, lonely, or struggling with poor social skills or other mental health issues. Yet when cash flow becomes tight, macroeconomics enters the picture, or the search for investment necessitates reworking a product or service, those good intentions are thrown out the window.

Replika, or any other AI-driven avatar platform that influences human feelings or behavior, will, in my opinion, not disappear. On the contrary, such platforms will be used even more and become mainstream. It is not far-fetched to imagine people having both offline and online friends. Look at Gen Z today: they have friends they play and socialize with from all over the world on social gaming platforms, and they still have friends IRL. The development of generative AI and its role in our work, entertainment, and love lives will only accelerate, and who knows, those online AI "friends" may one day be indistinguishable from real people.

We are not far from a scenario like the films Her or Marjorie Prime. However, I am concerned about the impact on us and how we will cope when these systems have bugs, glitches, or bad updates. I understand that this cannot be stopped, and my intention is not to stop it, but to raise awareness that these things are happening now, not in ten years.

The Proteus Effect newsletter and our company MetaLab Academy exist for that reason. As information flows faster and technology advances at breakneck speed, the gap between those who know and those who do not know is growing, and I believe in attempting to bridge that gap as much as possible.
