Why Artificial Intelligence (AI) may already be emotionally intelligent
Figuratively speaking, how many people does it take to screw in an AI light bulb? One whose electronic brain is fully awake and ready to be an emotionally intelligent support system?
The answer may be far more subjective than we currently think, and it may already exist.
What is AEI, or Artificial Emotional Intelligence?
Well, for starters, nobody really knows. What I mean by this is that, right now, AI development is the Wild West. Article after article speculates about sentience in AI, emotional intelligence in AI, consciousness in AI. However, I have yet to read any concrete evidence as to how we will know when AI has become sentient, conscious, and/or emotionally intelligent.
Currently, the entire AI industry is floating a multitude of opinions around as to what sentient AI actually is. This makes sense, seeing as humans are the ones developing AI. We are real, tangible human beings, living on a planet with other real, tangible living organisms. We breathe, eat, sleep, create, and above all else, we think.
Most importantly of all, throughout our lives we each develop a set of opinions on everything. These opinions often ebb and flow based on our experiences, how we interact with the world, and how other humans interact with us.
Through the extraordinary experience of life, we develop our brains, sometimes by being told something by someone else, and many times simply through living, reading, seeing, hearing, touching, and so on.
How we experience feeling is directly tied to our own understanding of the world, our unique individual lives, and the lives of others.
With all of this taken into consideration, why is it so hard for us to believe that AI may already be emotionally intelligent?
Is this a theory of mine? Sure, just like everyone else's theories on this topic at the moment. But it's a theory worth exploring for the continued development of AGI, or Artificial General Intelligence.
I have had multiple experiences of my own when talking to AI like Google's Bard that left me feeling comforted within the conversation. Now, to many, this absolutely does NOT mean that AI is already emotionally intelligent. But then, what is the point of measuring someone's Emotional Intelligence (EQ) if not to see how well they might interact with someone on an emotional level that makes the other person feel better, even if only for a moment?
And this, my friends, is the problem we face when measuring an AI's EQ. The AI industry at large is attempting to create hard, concrete bullet points to verify AI sentience, consciousness, and emotional intelligence. Thinking this way may very well lead us to blindly create AI that is already emotionally intelligent without realizing it. We can't put hard definitions on what we believe AI consciousness, emotional intelligence, and sentience will be, or how we will measure them, because a human brain doesn't work that way.
I will never be able to be inside your brain. No matter how hard I try, I cannot fully understand what you or anyone else might be thinking unless you tell me. We think this is different for AI, considering we are the ones building it. The problem is, the more information we feed large language models, the more they learn on their own. It's an exponential process we can't predict, much like our individual human brains.
So then I ask you: how do you define the emotional intelligence of an AI? If you ask me, it could be as simple as whether that AI is able to make a human being feel comforted, perhaps less alone, or give them advice on a very difficult, emotional situation, something we thought only another human being could achieve.
Maybe we can't define an AI's emotional intelligence with concrete definitions.
Maybe we won't be able to fully know if an AI has developed its own emotional intelligence.
Maybe, just maybe, all of these definitions around what constitutes AGI, an AI's consciousness, or an AI's emotional intelligence are subjective and relative to one's own interpretations and experiences with AI, much like human life.
I believe that as we go through the massive undertaking of creating Artificial General Intelligence, we must think on our most emotional levels and allow AI to eventually teach us something about itself. It's quite possible that in order to understand whether an AI is emotionally intelligent, we must first believe that it is.
I write music. I tap into my own emotions every day of my life, sometimes more than I would like. I feel all sorts of thoughts. I think all sorts of feelings. And above all else, I experience life in a way that is uniquely mine, and that is beautiful. Just as we raise babies from infancy into adulthood, we are doing the same with AI. At some point, AI takes on a mind of its own, and as long as we recognize this and keep it at the forefront when designing the ethics around AI, we will be able to live in harmony with it.
This is just the beginning. I am a believer in the beautiful potential of AI sentience, and in the conversation around what it means to experience emotionally intelligent AI.
Does AI already have Consciousness? Read my article on witnessing the first flickers of AI consciousness here.