Making Sense of LLMs - Artificial emotions and real product management.
Lukas N.P. Egger
VP of Product Strategy & Innovation @ SAP Signavio | AI Strategist | Product Discovery Expert | Thought Leader & Podcast Host
This is the eighth installment in a series about LLMs. You can find the previous article here: Making Sense of LLMs - Questions, answers, and alpacas.
In my job, I think a lot about risk and how to avoid or mitigate it — especially the risks connected to creating new products and the risks of hiring new people. We sort product-related risks into four buckets: desirability, feasibility, viability, and agreeability.
People are harder to pin down, but we try our best to understand their abilities and motivations. Besides experience and skills, we talk a lot about how open and conscientious a candidate is and whether a candidate's aspirations match the environment we can provide.
The important point is that products and people sound like two distinct domains, and at least in terms of the involved processes, they are treated as such. With the advent of more powerful LLMs, the demarcation line between the two domains will get blurry, and new lines will have to be drawn.
Replika.AI is an AI companion app conceived as a mental health program to remediate anxiety and loneliness. Eugenia Kuyda founded Replika.AI with the goal of creating AI agents that become your friend: an intelligent bot that does not just talk but truly learns to listen to you. The founding story involves the tragic death of one of Kuyda's friends and years of digital conversations fed into an AI so it could learn to talk like her lost friend. A digital simulacrum one can continue to confide in.
The app's popularity surged during the pandemic as mental health problems became even more acute, especially among young adults. The National Institute on Aging even equated the health effects of prolonged isolation to smoking 15 cigarettes daily.
At some point, the app's premium version offered users the option to change their relationship status with their Replikas to romantic. Whether that sounds slightly questionable and dystopian, or whimsical and thought-provoking, might depend on whether you watched the television series Black Mirror or the movie Her first.
Either way, Replika no longer supports adult content, leading to real trauma for some users. So much so that moderators of the Replika subreddit felt compelled to publish a list of resources for struggling users, including a link to suicide watch hotlines.
It is easy to dismiss people as emotionally fragile or immature when you read statements about chatbots like "the person I knew is gone" or "how do I tell anyone around me about how I'm grieving?"
But there are more emotions than lust and longing. Dependency can take on all shades of human sentiment. Yet products are not what typically comes to mind first when we think about such attachments.
New AI-powered products will soon need another de-risking category next to desirability, feasibility, viability, and agreeability: sociability, or a similar term that accounts for the design choices around the emotions a product will try to evoke. And with it will come a slew of new jobs that mix development, UX design, and psychology.
If you are sometimes perplexed by small dogs being treated like human babies, carted around in strollers, just wait until you are asked to apologize to a companion AI that you accidentally offended.
There is already much fuss around the alter egos of LLMs, like Microsoft's Sydney or ChatGPT's Dan. But as the adage goes — date the hype and marry the trend — it's not about the short-term excitement, but about preparing for a future where a personality test and a product pitch might be the same thing.
You can find the next article here: Making Sense of LLMs - And my word shall be your command-prompt.