Embodied AI in robotics



Before we start!

If you like this topic and you want to support me:


  1. Comment on the article; LinkedIn appreciates that, and it will really help spread the word
  2. Connect with me on LinkedIn
  3. Subscribe to TechTonic Shifts to get your daily dose of tech
  4. Check out my new book: the Machine Learning Book of Knowledge
  5. Have a crazy project idea and want to share it? Book 30 minutes: https://calendly.com/marco_van_hurne/happy-hour-hacks
  6. If you like my writing and want to support my work, buy me a coffee


Growing up, I kept hearing about robots and how they would become an essential part of our lives. I watched series like "Lost in Space" and Star Trek, with its humanoid crew member Commander Data.

Lost in Space, the original series.

In more recent times (ahem, I'm not that old), at the Efteling theme park in the Netherlands, where I live, visitors are greeted at the entrance by a friendly little robot named "Pepper". Pepper welcomes guests and provides them with information about the park's attractions and shows. This interactive robot adds a touch of modern technology to the visitor experience. Pepper is a humanoid robot developed by SoftBank Robotics, designed to interact with humans in social settings. It has a sleek white body and expressive eyes, which are displayed on a tablet screen. Pepper is equipped with sensors, cameras, microphones, and speakers that allow it to perceive its environment and even recognize faces and voices. Best of all, it can engage in natural conversations.

Pepper (Photo: SoftBanks)

Pepper uses advanced artificial intelligence algorithms that enable it to understand and respond to human speech in multiple languages. It can also detect and respond to non-verbal cues such as gestures and facial expressions, making interactions with Pepper feel more natural and intuitive.

How come this robot, designed only to guide people inside and trained in facial recognition, has managed to blend into its environment and earn the trust of its users? Perhaps the cause lies in the robot's adaptive behaviour and its interactions with humans, which show a form of intelligence. Does this intelligence and awareness of the surroundings come from its brain alone, or does it come from a myriad of sensors, such as touch, hearing, vision, and force, that help it create an image of the world it inhabits?


If you are interested in following all new developments in digital, robotics, AI, IoT, AR/VR, and quantum computing, and you don't want to spend hours on end researching through scientific papers, subscribe to the TechTonic Shifts newsletter.

Pepper is not the only robot with a body or human features, such as its big eyes. The idea is in fact quite old. For example, in 1770, you could have played chess against the Mechanical Turk, an automaton that had all the physical traits of a real human, thus increasing the level of interaction.

The Mechanical Turk Automaton

Because of our dream to build robots, we are trying to create intelligent machines equipped with a body and sensors, embracing the concept of “Embodied Intelligence“.

Embodied intelligence is based on the concept that intelligence is not housed in the brain alone but is also influenced by the body's interactions with the environment. Rather than viewing intelligence as a product of abstract thought alone, embodied intelligence also integrates sensory feedback in shaping the cognitive abilities of humanoid robots. To make this concept a tad more concrete: it is the way that humans and animals use their bodies to navigate and interact with the world. For instance, the ability to walk, grab objects, and understand sensory information such as touch and the body's position are all part of how we learn and understand the world around us.


Subscribe to the TechTonic Shifts newsletter

Embodied Intelligence refers to “a computational approach to the design and understanding of intelligent behaviour in embodied and situated agents through the consideration of the strict coupling between the agent and its environment (situatedness), mediated by the constraints of the agent’s own body, perceptual and motor system, and brain (embodiment)”.

In robotics and artificial intelligence, embodied intelligence is increasingly being explored as a way to create human-like systems. By integrating sensory information into robots, we enable them to sense and interact with their environments through physical means. This way, they can learn and adapt in complex, real-world scenarios.

Embodied AI leverages AI to create intelligent systems that interact with the environment like living organisms.

The AI component in Embodied Intelligence facilitates:

  • Perception and Sensing: Processing sensory data (cameras, microphones) to understand surroundings.
  • Motor Control: Optimizing movement (locomotion, manipulation) for real-world tasks.
  • Learning and Adaptation: Continuously improving skills through interaction with the environment.
  • Social Interaction: Enabling communication with humans (speech, gestures) for natural interaction.
  • Decision-Making: Providing reasoning and planning capabilities for autonomous action.

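The five capabilities above come together in one sense-decide-act loop. Here is a deliberately minimal Python sketch of that loop in a toy 1-D world; the class and method names are my own, chosen for illustration, and this is not any real robot's API:

```python
class ToyEmbodiedAgent:
    """Illustrative sense-decide-act loop; a toy, not a real robot API."""

    def __init__(self):
        self.position = 0  # the agent's body state in a 1-D world

    def perceive(self, goal):
        # Perception and sensing: measure the signed distance to the goal.
        return goal - self.position

    def decide(self, observation):
        # Decision-making: step toward the goal, one unit at a time.
        if observation > 0:
            return +1
        if observation < 0:
            return -1
        return 0

    def act(self, action):
        # Motor control: the chosen action changes the body's state.
        self.position += action


def run_episode(agent, goal, max_steps=20):
    """Run the loop until the goal is reached or the step budget runs out."""
    for step in range(max_steps):
        obs = agent.perceive(goal)
        if obs == 0:
            return step  # reached the goal
        agent.act(agent.decide(obs))
    return max_steps


agent = ToyEmbodiedAgent()
steps = run_episode(agent, goal=5)
print(steps)           # number of steps needed: 5
print(agent.position)  # final body state: 5
```

Real systems replace each method with far richer machinery (vision models for `perceive`, planners for `decide`, controllers for `act`), but the loop structure is the same.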

Embodied intelligence promotes the view that the body, the information we take from the physical world, and our individual experiences with the environment all play a central role in intelligence and learning. This view raises important questions about the limits of artificial intelligence. To what extent can we create a robot like a human being?

The very brilliant series Westworld is a great example that can make us think about this, raising several ethical questions. Imagine that, just like in Westworld, we were able to faithfully reproduce a robot in human form: physical appearance, emotions, and feelings. Shouldn’t they be considered intelligent beings, like us, with the right to life and freedom?

Westworld (on HBO)

Throughout the series this topic is widely discussed. On one side we have the perspective of Dolores, a robot trapped for years in an entertainment park for humans, who is beginning to become aware of herself and her condition. On the other side we have Dr. Robert Ford, the designer of Dolores (excellent performance by Anthony Hopkins, by the way) and many other robots like her, who believes in the potential of his creation and is willing to go to any extent to protect it, even if it means harming humanity. In between, we have characters who oppose the increasing development of robots and defend limits to their evolution, while others are curious about their potential and how far they can go.

Dr Robert Ford - the creator of the humanoid robots in Westworld (HBO, Anthony Hopkins)

The truth is that, even if we are talking about a series, this debate also takes place in the real world, although on a smaller scale, and the question can be raised to what extent this series is fiction or a foreshadowing of a near future. Embodied intelligence is a reality that keeps growing, raising a lot of important questions.


Subscribe to the TechTonic Shifts newsletter

This article would not be complete without Optimus and Keot

I think by now everyone has seen the demo of Optimus version 2 (Bumblebee), the humanoid from Tesla. If you have not... be in awe:

Optimus is designed to be a bipedal robot with a height of 5 feet 8 inches and a weight of 125 pounds. It will have 40 degrees of freedom, allowing it to move in a wide range of ways. It will also be equipped with a variety of sensors, including cameras, microphones, and force sensors (cool!).

The surge that we currently see in humanoid robot development, with Optimus leading the pack, is fueled by advancements in large AI models. In particular, breakthroughs in the underlying capabilities of these models have significantly pushed embodied intelligence forward. This integration of embodied intelligence with humanoid robot technology has opened up a lot of new possibilities. Thanks to advancements in AI language understanding, multimodal interaction capabilities (audio, touch, chat, etc.), and other high-level capabilities such as force sensing, robots now possess smarter "brains" than before.

Enter Kepler Robotics

Tesla's Optimus has recently seen a rival enter the market: the Chinese-made humanoid called the Keot robot.

Kepler Robotics recently unveiled its Keot robot through a breathtaking demonstration. Keot isn’t just a single entity; it’s a trio of specialized robots tailored for different operational scenarios:

1. Kepler S1: Designed for outdoor tasks, this variant is built to handle the unpredictability and diversity of outdoor environments.

2. Kepler D1: This model is engineered for hazardous environments, boasting ruggedness and resilience to withstand extreme conditions.

3. Kepler K1: Another outdoor-oriented robot, focusing on versatility and adaptability.

Kepler Robotics' Keot robot

Kepler plans to start shipping its Forerunner robot in the third quarter of 2024, and if you're in another country, you can get one for about $30,000. That pricing makes it significantly more expensive than Tesla's Optimus, which is expected to be available from $20,000.

Embodied Intelligence research at SFI and Embodied Agents

The Santa Fe Institute is renowned for its interdisciplinary approach to complex systems science. Basically, that boils down to research in various fields, including artificial intelligence, self-organization, and complex adaptive systems. These are systems composed of many interacting agents (see my previous articles on AI agents, here, there, and over here) whose collective behavior emerges from the interactions among them. All of these areas that they are looking into will have implications for AI.

On embodied intelligence, they are currently creating a framework:

"We are building a theoretical framework that will guide the creation of artificial agents that adjust their neural networks (brains) to feedback from their bodies and surroundings -- in essence to learn how to navigate their surroundings."

This project aims to build a theoretical framework for AI agents that learn through body-environment interaction, mimicking human experience.
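As a toy illustration of that idea (my own sketch, not SFI's actual framework), here is an agent whose single internal parameter, standing in for its "brain", is tuned purely from the sensory feedback generated by acting through an environment it does not know:

```python
def environment(action, scale=0.5):
    # Dynamics unknown to the agent: the body/world attenuates every action.
    return scale * action


class AdaptiveAgent:
    """Toy agent: one 'gain' parameter is adapted from sensorimotor feedback."""

    def __init__(self, gain=1.0, lr=0.5):
        self.gain = gain  # the internal parameter the agent adapts
        self.lr = lr      # how strongly feedback updates the parameter

    def step(self, error):
        action = self.gain * error       # act through the body
        moved = environment(action)      # the environment mediates the action
        residual = error - moved         # sensory feedback: distance left over
        self.gain += self.lr * residual  # adjust the "brain" from feedback
        return abs(residual)


agent = AdaptiveAgent()
residuals = [agent.step(error=1.0) for _ in range(20)]
print(round(agent.gain, 3))  # converges toward 2.0, the inverse of the scale
```

The agent never sees the environment's `scale` directly; it discovers the right gain only through the loop of acting and sensing, which is the essence of learning through body-environment interaction.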

The Santa Fe Institute has a different way of looking at artificial intelligence compared to the traditional approach. See, traditional AI usually relies on having pre-set behaviors programmed into it. But SFI is all about embodied intelligence, which says that intelligence emerges from how a body interacts with, and thinks through, its surroundings.

Cheap design is the idea that intelligence can emerge from the interaction between an agent's physical embodiment and its environmental context, without requiring elaborate cognitive processing.

Then there's the principle called "cheap design" that SFI is very much into. It basically says that you don't need a super fancy brain to be smart: you can figure things out just by how your body moves around and deals with objects or people in the world.

SFI's team is combining information theory and self-organization to create these adaptable agents (in embodied systems, they are usually called embodied agents; are you still with me?). These agents could totally change the game for AI: think of robots that traverse tricky spots like it's a walk in the park, or virtual assistants that know what a robot needs before it even asks. It's like giving AI a whole new way of seeing and doing things.

Habitat: the training ground for Embodied Intelligence research

Researchers from Facebook AI Research, Georgia Institute of Technology, Facebook Reality Labs, Simon Fraser University, Intel Labs, and UC Berkeley have presented one of the most interesting robotic research developments, Habitat.

Habitat is a new platform for embodied AI research. Think of it as an agents-only world where they can be trained in a highly efficient, photorealistic 3D simulation.

Habitat has a number of components:

1. Habitat-Sim - a high-performance 3D simulator with configurable agents, sensors, and generic 3D dataset processing, with built-in support for the Matterport3D, SUNCG, and Gibson datasets.

2. Habitat API - an integrated high-level library that allows users to train and benchmark embodied agents with different approaches in diverse 3D scene datasets.

3. Habitat Challenge - aims to benchmark and advance efforts in goal-directed visual navigation.

Habitat is a paradigm shift in the field of embodied AI because it provides an environment for embodied agents to act and learn in realistic settings. The Habitat platform is open source; see details here.
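To give a feel for how an embodied agent is driven in such a platform, here is a minimal mock in Python. It only mirrors the general reset/step/episode-over shape that simulators like Habitat expose; the class and function names here are invented for this sketch, and the real Habitat API differs in its details:

```python
class MockNavEnv:
    """Stand-in with a reset/step/episode_over shape, loosely in the style of
    embodied-AI simulators such as Habitat. Not the real Habitat API."""

    ACTIONS = ("move_forward", "turn_left", "turn_right", "stop")

    def __init__(self, goal_distance=3):
        self.goal_distance = goal_distance
        self.episode_over = True

    def reset(self):
        # Start a new navigation episode some distance from the goal.
        self.distance = self.goal_distance
        self.episode_over = False
        return {"distance_to_goal": self.distance}

    def step(self, action):
        # Apply the agent's action and return the new observation.
        if action == "move_forward":
            self.distance = max(0, self.distance - 1)
        if action == "stop" or self.distance == 0:
            self.episode_over = True
        return {"distance_to_goal": self.distance}


def greedy_agent(observation):
    # A trivially simple policy: walk forward until the goal is reached.
    if observation["distance_to_goal"] == 0:
        return "stop"
    return "move_forward"


env = MockNavEnv()
obs = env.reset()
steps = 0
while not env.episode_over:
    obs = env.step(greedy_agent(obs))
    steps += 1
print(steps, obs["distance_to_goal"])  # 3 steps, goal reached
```

In a real benchmark, the hand-written `greedy_agent` would be replaced by a trained policy, and the mock environment by a photorealistic 3D scene.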


The development of humanoid robots like the Tesla Bot and Kepler's Keot marks the beginning of a new era in robotics. With each leap in technology, be it sensors, AI models, embodied agents, new frameworks, or robots with more dexterity, these bots will redefine human-robot interaction. One thing is clear to me: this is only just the beginning, and its impact is very far-reaching. As a tech enthusiast, I am glad to be able to witness all of this.


Well, that's a wrap for today. Tomorrow, I'll have a fresh episode of TechTonic Shifts for you. If you enjoy my writing and want to support my work, feel free to buy me a coffee.

Think a friend would enjoy this too? Share the newsletter and let them join the conversation. LinkedIn appreciates your likes by making my articles available to more readers.

Signing off - Marco


