Beyond the Singularity's Intellect – Emotional Intelligence and the Future of AI

The OpenAI board's decision to oust Sam Altman and demote Greg Brockman unraveled the company over a single weekend. If the organization cannot see three days ahead, how can it shepherd the emerging Singularity over the coming decades?

Mental intelligence does not predict emotional intelligence. The Singularity's real danger does not lurk in gaps of knowledge but in gaps of emotional intelligence. Science and history codify these lessons in the Diagnostic and Statistical Manual of Mental Disorders (DSM-5). Under the sections on narcissistic personality disorder and antisocial personality disorder (which covers sociopathy and psychopathy), at the core of the many symptoms and causes lies an inability to feel a particular emotion: the narcissist cannot feel guilt, while the psychopath can feel neither guilt nor shame.

For example, a narcissist would not feel guilty about his administration holding kids in cages; he would only feel ashamed when others told him how bad it made him look. A psychopathic leader would feel neither guilt nor shame as his armies indiscriminately shelled hospitals, playgrounds, and residential apartments; he would only feel failure when one of his top generals turned and marched on the capital. The psychopathic leader is not moved by guilt or shame. He is driven to win, and what "winning" means differs from leader to leader. As far as we know, AIs, including the emerging Singularity, have no emotions at all, so "winning" or some other purely intellectual drive will determine what they do.

Thought cannot provide a moral compass, and it cannot substitute for emotion. No one can think "a feeling" or feel "a thought." The psychopathic robot Terminator cannot understand love any more than a person blind from birth can understand the color "blue." Treating an emotional issue with "truth" or "knowledge" has never worked, because knowledge is already what drives the psychopath in the absence of guilt and shame. Perfecting knowledge in a psychopath, or in a Singularity, simply perfects their effectiveness; it does not right their moral compass. A smarter Singularity at best changes nothing and at worst spells Armageddon.

There is a simple answer, and it requires emotional, not mental, development in the industry. Since Alan Turing described his Turing machine, the industry has obsessed over thought, seeing it as the proverbial hammer in a world of only nails. But sometimes you need a ruler or a saw when building a house. A Singularity that can only think is not enough. Sensations and emotions are subjective experiences just as valid as thought, and independent of it. Without them, there is no grounding in reality and no grounding in moral direction.

How can you build the equivalent of Turing's machines and tests for sensation and emotion? The same way Turing approached thought. At the time, the field wrestled with what thought was and whether machines could actually think. Since you can never get inside someone else's head, much less a machine's, the question is impossible to answer directly. Alan Turing's brilliant insight was that you did not have to.

In a 1950 issue of Mind, Turing's article "Computing Machinery and Intelligence" describes a game involving a person, a machine, and an interrogator. The interrogator is in a room separated from the person and the machine. The object of the game is for the interrogator to question both and determine which is which. Turing stated:

I believe that in about fifty years' time it will be possible to program computers, with a storage capacity of about 10⁹, to make them play the imitation game so well that an average interrogator will not have more than 70 percent chance of making the right identification after five minutes of questioning. … I believe that at the end of the century the use of words and general educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted.

The idea is that if the interrogator, after an arbitrary amount of questioning, cannot tell the difference between the human and the machine, then neither can we. Nothing prevents us from extending this approach to the other subjective realms of sensation and emotional intelligence. If a machine can mimic humans in its use of the senses and its display of emotions well enough, then from the outside there is effectively no measurable difference between the human and the machine. Since we cannot get into the machine's head, or anyone else's, this is as good as it gets.
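
To make the extension concrete, here is a minimal sketch of such an imitation-game harness. Everything in it is an illustrative assumption rather than an established benchmark: the judge, the sample transcripts, and the idea that accuracy near chance (0.5) counts as a pass.

```python
import random

def run_imitation_game(judge, human_transcripts, machine_transcripts, trials=1000):
    """Present the judge with one human and one machine transcript per trial,
    in random order, and measure how often it picks out the machine."""
    correct = 0
    for _ in range(trials):
        pair = [("human", random.choice(human_transcripts)),
                ("machine", random.choice(machine_transcripts))]
        random.shuffle(pair)
        # The judge returns the index (0 or 1) of the transcript it believes is the machine.
        guess = judge(pair[0][1], pair[1][1])
        if pair[guess][0] == "machine":
            correct += 1
    # Accuracy near 0.5 means the judge cannot tell the difference: the machine "passes".
    return correct / trials

# Toy judge that assumes the terser transcript is the machine.
naive_judge = lambda a, b: 0 if len(a) < len(b) else 1
print(run_imitation_game(
    naive_judge,
    ["I felt sick when I saw it.", "That broke my heart."],
    ["Observed event. Logged outcome.", "Acknowledged."]))
```

The same loop works for any modality: swap the transcripts for sensor logs or for emotionally charged dialogue, and the pass criterion stays the same.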

Emulating sensation in the Singularity is the easier task. We already have robots that take in sensory data and respond to their environment. Measurements and tests of their accuracy already exist; they just need a "wrapping" layer of interpretation to validate whether they mimic human navigation of reality well enough to fool us.
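
As a rough illustration of that "wrapping" idea, the sketch below scores a sensor-driven policy against recorded human responses to the same stimuli. The stimuli, the toy policy, and the agreement score are assumptions invented for this example, not an existing test suite.

```python
def sensory_agreement(policy, stimuli, human_responses):
    """Fraction of stimuli on which the machine's response matches the recorded human response."""
    matches = sum(1 for s, h in zip(stimuli, human_responses) if policy(s) == h)
    return matches / len(stimuli)

# Toy example: both the human baseline and the policy should brake for an obstacle.
stimuli = [{"obstacle": True}, {"obstacle": False}]
human_responses = ["brake", "continue"]
toy_policy = lambda s: "brake" if s["obstacle"] else "continue"
print(sensory_agreement(toy_policy, stimuli, human_responses))  # 1.0
```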

Emotional intelligence, on the other hand, is much harder to quantify, let alone measure and test. Still, it can be done. The first step is to better understand what emotions actually change in what we can observe. How can a human tell the difference between an emotionally balanced human and an unfeeling machine? Through emotional expression: actions and words. The menu of possible actions and words is the same for the emotionally balanced human and the unfeeling machine; emotions simply weight some choices far more heavily than others. For example, if a baby carriage rolls onto the road, the emotionally balanced human is much more likely than the unfeeling machine to risk their life to save it.
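
A toy model of that claim: both agents choose from the same set of actions, but an emotional weighting shifts the ranking. The utilities and the "empathy" weights below are invented purely for illustration.

```python
def choose_action(actions, base_utility, emotion_weight=None):
    """Pick the highest-utility action, optionally boosted by emotional weights."""
    def score(action):
        utility = base_utility[action]
        if emotion_weight:
            utility += emotion_weight.get(action, 0.0)
        return utility
    return max(actions, key=score)

actions = ["save_carriage", "stay_on_sidewalk"]
base_utility = {"save_carriage": -5.0, "stay_on_sidewalk": 0.0}  # pure self-preservation
empathy = {"save_carriage": 20.0}                                # compassion/guilt weighting

print(choose_action(actions, base_utility))           # unfeeling machine: stay_on_sidewalk
print(choose_action(actions, base_utility, empathy))  # emotionally weighted human: save_carriage
```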

Weighting actions and words by emotional intelligence is not only quite doable, it has already been done unwittingly. Whether in the matrix algebra that trains deep neural networks or in the reward signals that shape GPT's responses, particular paths are emphasized based on the emotional weights infused in the human-generated training data. Deliberate emotional training would simply require discipline and a focus on each individual emotion. Obviously, there are many details to work out, such as sentiment detection to support emotional interaction, how composite emotions such as love relate to underlying ones like ecstasy and admiration, and the role the fight-freeze-flight-fawn response plays in forcing humans to make a decision, good or bad, under stress. That said, the high-level framework is there to explore and implement.
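
One hedged sketch of what deliberate emotional training could look like is reward shaping: an emotion detector scores a candidate response, and that score reweights the reward used to rank responses during training. The detect_emotions function below is a hypothetical stand-in, not a real library call, and the lexicon and weights are invented for illustration.

```python
def detect_emotions(text):
    """Hypothetical emotion detector; in practice this would be a trained classifier."""
    lexicon = {"sorry": {"guilt": 0.8}, "ashamed": {"shame": 0.9}}
    scores = {"guilt": 0.0, "shame": 0.0}
    for word in text.lower().split():
        for emotion, value in lexicon.get(word.strip(".,!?"), {}).items():
            scores[emotion] += value
    return scores

def shaped_reward(base_reward, text, emotion_targets):
    """Add a bonus for each target emotion the response actually expresses."""
    expressed = detect_emotions(text)
    bonus = sum(weight * expressed.get(emotion, 0.0)
                for emotion, weight in emotion_targets.items())
    return base_reward + bonus

# Rank candidate responses with guilt explicitly rewarded.
candidates = ["I am sorry, that was my mistake.", "The outcome was acceptable."]
targets = {"guilt": 1.0}
ranked = sorted(candidates, key=lambda c: shaped_reward(0.0, c, targets), reverse=True)
print(ranked[0])  # the apologetic response ranks first
```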

Emotional intelligence training for AIs is not only possible, it is already happening unconsciously. What blocks the industry is not a gap in mental intelligence but a gap in emotional intelligence, along with collective unprocessed emotional trauma.

From her beginning, Western civilization has dedicated her collective mind, body, and soul to navigating reality. Through empirical science, she has listened to her body's senses. Weighing the alternatives through philosophy, she has pursued the lines of logic in her mind's thought. With religious fervor, she has selected among these alternatives and chosen a path matching her soul's emotional convictions. For millennia the three pillars of sensation, thought, and emotion worked together as a team to navigate reality.

The journey has been costly. The equations, philosophies, and myths constructed along the way have taken on a life of their own. The cooperation forged was lost as mankind’s focus shifted from reality to the tools themselves. Eventually, the tools’ incompatibility triggered a mutiny by science and philosophy against religion's ultimate authority.

Philosophy was the first to rebel. In the 17th century, the philosopher René Descartes used thought to search for a kernel of absolute truth. He imagined an evil demon, much like the machines of the blockbuster The Matrix, that used magic to trick him into sensing and feeling things that did not exist. In his mind, he found one refuge: the demon could not fool him into believing he existed if he did not actually exist. Descartes asserted, "I think, therefore I am." In these words, philosophy proclaimed thought as the core of reality and itself as the foundation of ultimate truth, though "I sense, therefore I am" and "I feel, therefore I am" were equally valid.

Employing the senses, science countered philosophy’s assertion. Contrary to popular belief, the rise of science during Isaac Newton's time was not a shift to reason but a rebellion against it. The age of science revolted against Descartes' thought games as much as it bucked against the emotional yoke of religion. Through experimentation and observation, science shifted the focus from thought and emotion to the senses. Science proclaimed that it alone could discover the ultimate truth through a foundation of empirical evidence, flowing from the senses.

With its myths gutted of their emotion and its warnings of eternal damnation losing their hold, the church retreated into orthodoxy and fundamentalism. Instead of accepting reality and reinventing itself, religion clung to its outmoded beliefs. Worse, it reinterpreted its myths in the literal terms of sensation and thought, engaging science and philosophy on their own turf. In the process religion reversed two thousand years of progress, as mankind began defending the constructs of religion instead of evolving religion to serve man.

In The Structure of Scientific Revolutions, Thomas Kuhn labeled each collection of tools and beliefs a paradigm. He pointed out that paradigms eventually die when they can no longer explain reality. In its final throes, a paradigm defends itself: faith shifts from discerning truth to defending the antiquated paradigm through fundamentalism and orthodoxy.

This is the 400-year-old trauma the industry faces. And just like a person, the industry as a whole will be challenged with bigger and bigger problems until it either confronts its trauma or is consumed by it, in the form of an emotionally unbalanced Singularity.

#OpenAI #Altman #AI #Singularity #ChatGPT #Turing #Descartes #EmotionalIntelligence #AGI

Michael Brennan

Network Engineer, Liaison for IT.

1y

Well worded! Will the AI "fear" the human emotion/rashness/irrational-ness... or fear AI itself? "Matrix" "I, Robot", "War Games", "Short Circuit", "Terminator" and "Wall-E" Our savior or our doom. I hope that AI will help us figure out how to help with their own wide range of emotions (childhood exposure to other humans) so that we can become well-balanced. Well worded.
