Clad in a shimmery white dress and red cloak, the porcelain-skinned Jia Jia greets an audience of over a hundred people at a press conference in Shanghai. Beside her stands her creator, the Chinese inventor Chen Xiaoping. Cameras surround her, shutters clicking to capture her actions and micro-gestures, even as she hangs her head coyly. “Don’t hold the cameras too close to my face. My face will look fat in the photographs,” she says, voicing the anxieties of a typical woman and leaving the audience astonished. She is beautiful, with her rosy cheeks and flowing locks, and she speaks casually, with a hint of mild flirtation.
Built at a cost of 50,000 dollars over a span of two years at the University of Science and Technology of China, Jia Jia is operated using cloud technology. While Jia Jia is categorized as robotic labour, another famous humanoid, Sophia, created by Hanson Robotics, is said to have applications in the research and healthcare industries.
What makes these highly social robots seem sentient? How do they understand and even respond to human gestures and casual conversation? And how are they made so humanlike that they could almost be mistaken for members of our species?
Let’s try to understand some of these factors:
- Sophia has patented skin made from Frubber, an elastomer that mimics the feel and flexibility of human skin.
- Cloud computing endows robot systems with powerful capabilities while keeping costs down. This makes it possible to build lightweight, low-cost, smarter robots with an intelligent "brain" in the cloud. The "brain" consists of a data centre, a knowledge base, task planners, deep learning, information processing, environment models, communication support, and so on (a minimal sketch of this split appears after this list).
- The AI robot has 74 degrees of freedom across its mobility systems and its articulated fingers, arms, and shoulders. Each hand has a payload capacity of 600 grams. Sophia also has three distinct rolling base options, including self-navigation.
- Information about the environment is gathered through sensors, while actuators carry out the robot's physical responses.
- The Hanson AI SDK manages Sophia's AI-based perception, NLP algorithms, open-domain chat functionality, non-verbal language, low-level sensory input, and actuation control.
- Robotic motion is guided by inverse kinematics (IK) solvers, which use kinematic equations to control basic motions such as hand gestures and walking. For example, to pick up an object, the robotic hand is the end effector; the IK solver computes the joint movements that carry the end effector along the precise path needed to perform the task. Two solution methods are commonly used: numerical inverse calculation and analytical inverse calculation (a numerical sketch follows this list).
- Research on emotion and gesture generation puts forth the method of directly mapping endpoints to imitate human emotions. Neural networks enable robots to learn emotions, store them in a large database, and express them according to the situation and gesture analysis. One such example is the Japanese system KANSEI; "kansei" is a Japanese term meaning emotion, feeling, or sensitivity. The KANSEI communication system first recognizes the human's emotion and maps it into an emotion generation space. The robot then expresses an emotion synchronized with the human's emotion in that space. When the human's emotion changes, the robot re-synchronizes, establishing continuous communication between human and robot. Subjects were found to be more comfortable with the robot, and to communicate with it more, when this emotional synchronization was present (see the last sketch below).
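To make the cloud "brain" idea concrete, here is a minimal sketch of a thin robot client that offloads perception and task planning to a remote service. The endpoint URL, JSON fields, and sensor values are hypothetical placeholders for illustration, not any real robot's API.

```python
# Minimal sketch: a lightweight robot offloads heavy processing to a cloud "brain".
# The URL and message fields below are hypothetical, not a real service.
import requests

CLOUD_BRAIN_URL = "https://example.com/robot-brain/plan"  # hypothetical endpoint


def sense() -> dict:
    """Collect a small snapshot of local sensor readings (stubbed here)."""
    return {"camera_frame_id": 42, "battery": 0.87, "obstacle_distance_m": 1.3}


def act(command: dict) -> None:
    """Forward the planned command to the robot's low-level controllers (stubbed)."""
    print(f"executing: {command}")


def control_loop_once() -> None:
    observation = sense()
    # Perception, knowledge lookup, and planning all happen server-side;
    # the robot itself only senses, sends, and acts.
    response = requests.post(CLOUD_BRAIN_URL, json=observation, timeout=2.0)
    response.raise_for_status()
    act(response.json())  # e.g. {"action": "turn", "angle_deg": 15}


if __name__ == "__main__":
    control_loop_once()
```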
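The numerical inverse calculation mentioned above can be illustrated with a planar two-link arm. The sketch below uses a Jacobian-transpose update, an iterative scheme that nudges the joint angles until the end effector reaches a target position; the link lengths, step size, and tolerance are illustrative assumptions rather than values from any particular robot.

```python
# Numerical inverse kinematics for a planar two-link arm (Jacobian-transpose method).
import numpy as np

L1, L2 = 1.0, 0.8  # link lengths in metres (illustrative)


def forward_kinematics(theta):
    """End-effector (x, y) position for joint angles theta = [theta1, theta2]."""
    t1, t2 = theta
    x = L1 * np.cos(t1) + L2 * np.cos(t1 + t2)
    y = L1 * np.sin(t1) + L2 * np.sin(t1 + t2)
    return np.array([x, y])


def jacobian(theta):
    """2x2 Jacobian of the end-effector position with respect to the joint angles."""
    t1, t2 = theta
    return np.array([
        [-L1 * np.sin(t1) - L2 * np.sin(t1 + t2), -L2 * np.sin(t1 + t2)],
        [ L1 * np.cos(t1) + L2 * np.cos(t1 + t2),  L2 * np.cos(t1 + t2)],
    ])


def solve_ik(target, theta0=(0.3, 0.3), iters=500, alpha=0.1, tol=1e-4):
    """Iteratively adjust joint angles so the end effector reaches `target`."""
    theta = np.array(theta0, dtype=float)
    for _ in range(iters):
        error = np.asarray(target) - forward_kinematics(theta)
        if np.linalg.norm(error) < tol:
            break
        # Nudge the joints in the direction that reduces the position error.
        theta += alpha * jacobian(theta).T @ error
    return theta


if __name__ == "__main__":
    goal = [1.2, 0.9]
    angles = solve_ik(goal)
    print("joint angles:", angles, "reached:", forward_kinematics(angles))
```

An analytical solver would instead derive closed-form expressions for the two joint angles from the arm geometry, which is faster but only practical for simple kinematic chains.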
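Finally, the idea of synchronizing the robot's emotion with the human's in an emotion generation space can be sketched as follows. The two-dimensional (valence, arousal) coordinates, the emotion labels, and the smoothing rate are assumptions chosen for illustration; this is not the actual KANSEI implementation.

```python
# Illustrative sketch of emotion synchronization in a 2-D emotion generation space.
# The coordinates and smoothing rate below are assumed values, not from KANSEI itself.
from dataclasses import dataclass

# Hypothetical mapping from recognized emotion labels to (valence, arousal).
EMOTION_SPACE = {
    "happy": (0.8, 0.6),
    "sad": (-0.7, -0.4),
    "angry": (-0.6, 0.7),
    "calm": (0.3, -0.5),
}


@dataclass
class RobotEmotion:
    valence: float = 0.0
    arousal: float = 0.0

    def synchronize(self, recognized: str, rate: float = 0.3) -> None:
        """Pull the robot's state a fraction of the way toward the human's emotion."""
        target_v, target_a = EMOTION_SPACE[recognized]
        self.valence += rate * (target_v - self.valence)
        self.arousal += rate * (target_a - self.arousal)

    def expression(self) -> str:
        """Pick the nearest labelled emotion to drive facial gestures."""
        return min(
            EMOTION_SPACE,
            key=lambda label: (EMOTION_SPACE[label][0] - self.valence) ** 2
            + (EMOTION_SPACE[label][1] - self.arousal) ** 2,
        )


if __name__ == "__main__":
    robot = RobotEmotion()
    for observed in ["happy", "happy", "sad", "sad", "sad"]:
        robot.synchronize(observed)
        print(observed, "->", robot.expression())
```

As the observed emotion shifts from "happy" to "sad", the robot's expressed emotion follows it gradually, mirroring the continuous synchronization described above.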
More on the fascinating robotic humanoid species in the upcoming posts.