Can AI feel, or maybe even Feel?
Most other topics on the development of Artificial Intelligence are child's play compared to this.
Let's start with the Merriam-Webster definitions: "feel" can be a transitive verb, an intransitive verb, and a noun.
When you want to ponder whether an AI can feel, you first have to decide which "feel" you are thinking about.
If I were to describe "human feeling" to an AI, I would have to classify it in four different ways: physical, physical/emotional, emotional, and intuitive.
Touch
Physical feeling is practical. What is the general temperature, shape, or texture of an object? What pressure should be applied to maintain grip on something being lifted/carried? How can objects be fit together?
This is the easiest thing for an AI to learn. With the appropriate technology (sensors and armatures) and coding, AI-enabled robots can already achieve this humanlike trait; robotics was capable of it even before modern AI. The difference now is that AI-powered robots do not need human intervention to adjust their work or to interpret new scenarios. For example, if an AI has "learned" the tolerances of the materials it's made from and encounters a situation it cannot handle while completing a task, it can stop and exit safely, or adjust its approach so it can still complete the task without causing damage.
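To make that concrete, here is a minimal sketch in Python of the kind of learned-tolerance check described above. The sensor readings, limits, and helper functions (read_force, stop_and_exit, and so on) are hypothetical placeholders, not any real robotics API.

```python
# Minimal sketch of a grip-control check that respects learned tolerances.
# Sensor readings, limits, and helper names are hypothetical placeholders,
# not any particular robotics API.

MAX_GRIP_NEWTONS = 40.0    # "learned" tolerance of the gripper's own material
MAX_SURFACE_TEMP_C = 80.0  # beyond this, the robot should not proceed

def handle_object(read_force, read_temp, adjust_grip, stop_and_exit):
    """Hold an object, but never exceed the limits the robot has learned."""
    if read_temp() > MAX_SURFACE_TEMP_C:
        # A situation it cannot handle: stop and exit safely.
        stop_and_exit("surface too hot")
        return False
    if read_force() > MAX_GRIP_NEWTONS:
        # Adjust the approach instead of damaging itself or the object.
        adjust_grip(force_limit=MAX_GRIP_NEWTONS)
    return True

# Toy usage with faked sensor values.
handle_object(
    read_force=lambda: 45.0,
    read_temp=lambda: 25.0,
    adjust_grip=lambda force_limit: print(f"easing grip to {force_limit} N"),
    stop_and_exit=lambda reason: print(f"stopping: {reason}"),
)
```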
Connection
Physical/emotional feeling is tricky but still somewhat formulaic. If someone takes your hand, the situation and the force they use tell you the other person's emotional state and purpose. Are they pulling you away from danger, guiding you, or expressing affection? Do they need help from you? Perhaps they're simply trying to get your attention?
A combination of factors, including situational awareness, knowledge about the person interacting with you, and specifics about the pressure and style of contact, can provide a rough analog to human perception. That analog could guide an AI to respond to the contact appropriately, and even to initiate it under the same circumstances.
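As a toy illustration of combining those factors, the sketch below uses a few invented contact features (pressure, pull speed, context) and made-up thresholds to guess the intent behind a grasp; nothing here reflects a real perception pipeline.

```python
# Toy rule-based guess at the intent behind a hand grasp.
# Feature names and thresholds are invented for illustration only.

def classify_touch(pressure_kpa, pull_speed_m_s, context):
    """Return a rough label for why someone took the robot's hand."""
    if context == "hazard_nearby" and pull_speed_m_s > 1.0:
        return "pulling away from danger"
    if pressure_kpa < 5 and pull_speed_m_s < 0.2:
        return "expressing affection"
    if context == "crowded_room" and pull_speed_m_s > 0.3:
        return "guiding"
    return "getting attention"

print(classify_touch(pressure_kpa=3, pull_speed_m_s=0.1, context="calm"))
# -> expressing affection
```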
Attitude
Emotional feeling is complicated. Human emotions are situational, but they're also chemical.
Human emotion is exceptionally complicated. It's a combination of diet, genetics, sleep patterns, blood chemistry, and environmental factors (air quality, temperature, location, etc.).
Case in point: the same question about a scenario (religion, politics, or another sensitive topic) could evoke different emotions depending on who you're asking. The same question asked of the same person could also cause a different emotional reaction if you change the person's environment, ask at a different time of day, or ask other questions first.
It's easier to code an AI to observe someone and gauge their emotional state than it would be to generate emotions accurately. That isn't to say an AI couldn't mimic a human response. With sufficient training on enough scenarios, you could ask an AI to generate questions or responses "in the style of someone who was feeling <emotional state>". But you could not expect an AI to accurately predict a human's emotional state, nor to emulate a human with all of their various emotions and the way those emotions alter their attitude and interactions throughout an average day.
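A rough sketch of the "in the style of someone who was feeling <emotional state>" idea: build a prompt and hand it to a text model. The generate function below is a stand-in for whatever model call you would actually use; it is not a real API.

```python
# Sketch of asking a model to mimic an emotional style.
# `generate` is a placeholder for a real model call, not an actual API.

def build_prompt(question, emotional_state):
    return (
        f"Answer the following question in the style of someone who is "
        f"feeling {emotional_state}.\n\nQuestion: {question}"
    )

def generate(prompt):
    # Placeholder: in practice this would call a language model.
    return f"[model output for prompt: {prompt!r}]"

print(generate(build_prompt("How was your commute?", "frustrated")))
```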
In fact, an AI would attempt to predict what emotion its user would expect and generate that. I demonstrated that AI works this way in my article where I attempted to have it produce something spontaneous, with amusing results.
Intuition
An AI indicating it has found an alpha or knows tomorrow's weather isn't intuition. It's predictive algorithms assessing reams of data and building a model which can, within a margin of error, determine what comes next.
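That distinction is easy to see in code: a forecast that looks like a hunch is really an extrapolation from recorded data. A minimal example, with made-up temperature readings and a simple least-squares fit:

```python
# Tomorrow's "prediction" is just extrapolation from recorded data.
# The temperature readings below are made up for illustration.
import numpy as np

days = np.array([0, 1, 2, 3, 4, 5, 6])
temps_c = np.array([18.0, 19.5, 19.0, 21.0, 22.5, 22.0, 23.5])

# Fit a straight line (degree-1 polynomial) through the observations.
slope, intercept = np.polyfit(days, temps_c, 1)

tomorrow = 7
print(f"Predicted temperature for day {tomorrow}: "
      f"{slope * tomorrow + intercept:.1f} C (within some margin of error)")
```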
Per Oxford Languages, intuition is
"the ability to understand something immediately, without the need for conscious reasoning."
There are two things in that definition which are an AI's Achilles' heel: "immediately" and "without conscious reasoning".
Artificial sentience would lack these things. Even with reinforcement learning to help an AI become better over time, it still wouldn't be able to generate immediate, unsupported concepts or "feelings", nor would it be able to do so subconsciously.
What makes the hair stand up on someone's neck, or makes them look over their shoulder in a dark alley, would be passed over by an AI without a second thought. If it's not observable, tangible, and documented, it's not a concept an AI model can digest.
Conclusion
Can an AI "feel"?
With the right modalities and sensors, yes.
Can AI "Feel"?
No.
I created the chart above to show how an AI's ability to feel relates to how closely a feeling is tied to the physical world. More concrete and predictable interactions can be codified, whether that is determining the heat of a pan and whether it can hurt the machine touching it, or identifying the subtle differences between holding hands as a warning, for companionship, or for protection.
A robot could generate a very lifelike reflex to pull someone out of harm's way without harming the person it is saving. It could not Feel a certain way about it. Reinforcement learning would allow the AI to recognize it did a very good thing and look for opportunities to repeat it, but it would not be happy about it, nor would it understand the impact it had on a human life.
The further feelings move from something physical, the more significant the gap becomes between a human and an AI.
An AI could mimic human emotions, although the "emotions" it expresses would be what it determines its users expect. It would be neither spontaneous nor genuine.
AI, though, cannot "wake up" one morning and invent or have a "gut feeling" about anything without source data, analysis, and algorithmic iteration. And by definition, whatever it came up with would not be a gut feeling at all, but an analysis of source data and its programming.
So, can an AI "feel" after all?
Perhaps in the most human way to feel about anything, the not-so-simple answer is: "it's complicated."
That said, even with general- and super-intelligence, there will be a difference between being able to learn and function without human interaction and having or understanding the human feelings which served as the motivation to create it.
These are my thoughts only and do not necessarily represent those of my organization.