Don't Fall In Love With Your AI
Jonathan Salem Baskin
I run a tech comms consultancy and write essays, books, and musicals.
You’re probably going to break up with your smart assistant because your future life partner has just arrived.
OpenAI’s new ChatGPT comes with a lifelike voice mode that can talk as naturally and fast as a human, throw out the occasional “umms” and “ahhs” for effect, and read people’s emotions from selfies.
The company says the new tech comes with “novel risks” that could negatively impact “healthy relationships” because users get emotionally attached to their AIs. According to The Hill:
“While these instances appear benign, they signal a need for continued investigation into how these effects might manifest over longer periods of time.”
Where will that investigation take place? Your life. And, by the time there’s any conclusive evidence of benefit or harm, it’ll be too late to do anything about it.
This is cool and frightening stuff.
For us I/O nerds, the challenge of interacting with machines is a never-ending process of finding easier, faster, and more accurate ways to get data into devices, get it processed into something usable, and then push it out so that folks can use it.
Having spent far too many hours waiting for furniture-sized computers to batch process my punchcards, the promise of using voice interaction to break down the barriers between man and machine is thrilling. The idea that a smart device could anticipate my needs and intentions is even more amazing.
It’s also totally scary.
The key word in OpenAI’s promising and threatening announcement (they do it all the time, BTW) is dependence, as The Hill quotes:
“[The tech can create] both a compelling product experience and the potential for over-reliance and dependence.”
Centuries of empirical data on drug use proves that making AI better and easier to use is going to get it used more often and make it harder to stop using. There’s no need for “continued investigation.” A ChatGPT that listens and talks like your new best friend has been designed to be addictive.
Dependence isn’t a bug, it’s a feature.
About the same time OpenAI announced its new talking AI, JPMorgan Chase rolled out a generative AI assistant to “tens of thousands of its employees,” and “more than 60,000 employees” are already using it.
It’s hard to imagine that JPMorgan Chase is the only company embracing the tech, or that it won’t benefit from using its most articulate versions.
Just think…an I/O that enables us to feed our AI friends more data and rely on them more often to do things for us until we can’t function without them…or until they have learned enough to function without us.
Falling in love with your AI may well break your heart.
[This essay appeared originally at Spiritual Telegraph]