Smart IoT Robot Emulator
Two weeks ago, we introduced the Braccio++ Robot Arm Controller Module. Now we extend the project with an assistant "Head" module (the decision maker), an "Eye" (camera) module, and a "Face" module (interactive UI) to turn it into a complete "Robot".
We want to create a simple IoT robot emulation system that can interact with the environment and with humans, to simulate a smart robot's actions or a smart-manufacturing process. The robot system uses the classic distributed IoT hub-control framework, with the components linked by wireless connections. The whole system consists of 4 main parts :
All the components communicate with each other via WiFi or 4G/5G, so they can work together from different places (or geo-locations). For example, when simulating a smart factory, the robot arms can be located in the production-pipeline room, the eye in the warehouse, and the central controller and info-broadcast nodes in the control room.
Demo video showing the system detecting the QR code and grabbing the box :
Introduction
The Smart Robot Emulation System includes 4 main modules: the robot eye, robot face, robot head and robot body. Each module connects to the head module via a WiFi or 5G wireless connection. The robot system diagram is shown below:
In each system, one head node controls several eye, face and arm nodes. The main function of each node is shown below :
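As a concrete illustration of the hub-control idea, the Python sketch below shows how a node might announce itself to the head node over UDP. The port number, JSON message fields and function names are assumptions made for this example, not the project's actual wire protocol:

```python
import json
import socket

# Hypothetical head-node address; the real port is defined in the project config.
HEAD_ADDR = ("127.0.0.1", 3001)

def build_register_msg(node_type, node_id):
    """Build a registration message a node would send to the head node."""
    return json.dumps({"act": "register", "type": node_type, "id": node_id}).encode()

def parse_msg(data):
    """Decode a UDP payload back into a dict."""
    return json.loads(data.decode())

# Example: an eye node announces itself to the head node.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(build_register_msg("eye", "eye-01"), HEAD_ADDR)
sock.close()
```

The head node would keep a table of registered eye, face and arm nodes and route messages between them.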
System Demo
We provide two simple demos to show how the robot emulation system interacts with humans and the environment.
Demo 1: Detect human and show arm control
The robot detects whether someone is watching it. If it identifies that a human is watching, it waves its hand to attract the person's attention, then tries to grab a small box next to it and hand the box over to the person. (As shown in the video below.)
Detailed demo steps :
Demo 2 : Detect QR-code position and grab the box
The robot detects the position of the QR code in the video frame. If a box (with a QR code on it) is detected, the robot tries to grab the box and transfer it to a user-preset location. (As shown in the video below.)
Detailed demo steps :
System Design
The modules of the IoT smart robot system work with each other as shown in the diagram below :
Design of Robot Eyes Module
The robot eye module consists of two main parts (a camera detection module and a UDP host) :
The detailed workflow of the robot eye module is shown below :
Currently the detector provides two types of detection :
Human face detection detects human faces so the robot can interact with people :
QR code detection detects QR codes so the robot can interact with the environment :
If the detection-result display flag is enabled, the program opens a CV image window showing the detection result in real time, as shown below:
Detailed program doc : Robot Eye Module ( Camera Detector ) Design document
Hardware needed : Web camera
Design of Robot Face Module
The Robot Face is the interactive interface that lets the robot communicate with the user: it accepts the user's control commands and reports the robot's current state. Three main user interfaces are provided : the Robot Arm Controller UI, the Robot Face Emoji UI and the Robot Eye Detection UI.
Robot Arm Controller UI : the interactive interface for the user to remotely control the robot arm through a wired connection (serial comm) or a wireless connection (WiFi TCP/UDP comm). It provides the following functions to the user :
Robot Face Emoji UI : the interactive interface (an emoji and a chat text field) for the robot to tell the user its current state (for example, whether it is free to receive new commands or is busy performing an action). During the face detection demo, it also shows instructions that guide the user through the demo steps.
Robot Eye Detection UI : the interface that shows the current eye-module detection result. (It highlights the detected object and shows the object's position in the image.)
The face module UI is shown below (detecting a human and reacting) :
Detailed program doc : Robot Face Module Design document
Hardware needed : Full HD screen
Design of Robot Head Module
The robot head is the main controller hub that links all the other modules together: it fetches information from the eye module, analyzes the situation, then asks the face and body modules to perform the related actions. Each robot can have only one head; if there is more than one robot in a subnet, each robot head can decide which eye, face and body nodes to connect to. We also allow two robots to "share" the same arm (one master and many slaves): as long as the master head node is not giving the arm a task, a slave node can control it to finish other tasks. Once the master head requests its arm to do a task, the arm controller clears all the slaves' queued tasks from its task queue.
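The shared-arm rule above (slaves may queue tasks, a master request preempts them) can be sketched as a small task queue. The class and method names here are assumptions for illustration, not the project's actual code:

```python
import collections

class ArmTaskQueue:
    """Per-arm task queue with one master head and any number of slave heads."""

    def __init__(self, master_id):
        self.master_id = master_id
        self.queue = collections.deque()

    def submit(self, head_id, task):
        if head_id == self.master_id:
            # Master preempts: drop every queued slave task first.
            self.queue.clear()
        self.queue.append((head_id, task))

    def next_task(self):
        """Pop the next (head_id, task) pair, or None if the queue is empty."""
        return self.queue.popleft() if self.queue else None

# Example: two slave tasks are queued, then a master request clears them.
arm = ArmTaskQueue(master_id="head-A")
arm.submit("head-B", "sort parts")    # slave task queued
arm.submit("head-C", "move box")      # another slave task queued
arm.submit("head-A", "grab QR box")   # master request clears the slave tasks
```

After the master's request, only the master's task remains in the queue.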
The head module consists of one decision maker and three types of clients that communicate with the other three modules via UDP.
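The decision maker's core job is to map eye-module detection reports to actions for the face and body clients. A minimal sketch is shown below; the message fields and action names are assumptions for this example, not the project's actual protocol:

```python
def decide(detection):
    """Map one detection report from the eye module to (module, action) commands."""
    if detection.get("type") == "face":
        # A human is watching: greet on the face UI, wave the arm.
        return (("face", "show_greeting"), ("body", "wave_hand"))
    if detection.get("type") == "qrcode":
        # A QR-tagged box is visible: show busy state, grab the box.
        return (("face", "show_busy"), ("body", "grab_box"))
    # Nothing detected: stay idle.
    return (("face", "show_idle"),)

# Example: a QR code detected at pixel position (120, 80).
print(decide({"type": "qrcode", "pos": [120, 80]}))
```

In the real head module, each returned command would be sent through the matching UDP client to the face or body node.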
Detailed program doc : Robot Head Module Design document
Hardware needed : N/A
Design of Robot Body Module
The robot body "Braccio_Plus_Robot_Arm_Controller" consists of two parts :
Its workflow is shown below :
Braccio ++ Arduino firmware
The Braccio ++ Arduino firmware is loaded/burned into the Arduino Nano RP2040 board on the Braccio++ robot arm's Arduino carrier. It consists of 2 main parts :
Braccio ++ Controller UI
The user uses the Braccio ++ Controller UI to remotely control the robot arm. The program runs on the user's computer with 2 parallel threads :
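The two-thread layout can be sketched as a UI thread and a communication thread exchanging commands through a thread-safe queue. The names below are illustrative assumptions; the real controller would send each command to the arm over serial or UDP instead of recording it:

```python
import queue
import threading

cmd_queue = queue.Queue()
sent_log = []

def comm_thread():
    """Drain queued commands and 'send' them to the arm (simulated here)."""
    while True:
        cmd = cmd_queue.get()
        if cmd is None:          # sentinel value: shut the thread down
            break
        sent_log.append(f"sent:{cmd}")

worker = threading.Thread(target=comm_thread, daemon=True)
worker.start()

# The UI thread enqueues commands as the user interacts with the controls:
cmd_queue.put("move joint1 90")
cmd_queue.put("gripper close")
cmd_queue.put(None)
worker.join()
print(sent_log)
```

Keeping communication off the UI thread keeps the interface responsive even while a command is still being transmitted to the arm.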
The user interface design detail is shown below:
Detailed program doc : Robot Body Module Design document
Hardware needed : Braccio++ robot arm
Program Setup and Execution
To set up each subsystem of the robot, please follow the readme.md file in each module's source-code folder :
Future Work
1. Possibly integrate AI into the decision maker to make it "smart" when interacting with people.
2. Add more sensors to the eye module.
Project Source Code and Setup
To check the source code, please refer to the link :
License type :
MIT License