2D Indoor CQB Robot Simulation Project
This week we introduce the 2D visualization project we developed to simulate the use of an autonomous robot in a Close Quarters Battle (CQB) scenario.
Program Design Purpose: The integration of robots in Close Quarters Battle (CQB) represents a significant advancement in modern military and law enforcement tactics. These robots, designed to navigate tight spaces, gather real-time intelligence, and engage threats, are invaluable assets in high-stakes scenarios. Our goal is to develop a 2D tactical board simulation system, similar to a computer game, that can load building floor blueprints, display CQB squad (robot) positions and enemy locations, and simulate the CQB robot's enemy search progress as it would unfold in the real world. This program allows users (the attack squad) to plan the robot's enemy search strategies and improve its enemy prediction within a controlled environment.
The project demo video is shown below:
# Created: 2024/07/30
# Version: v_0.1.2
# Copyright: Copyright (c) 2024 LiuYuancheng
# License: MIT License
Introduction
Robots are employed in Close Quarters Battle (CQB) to minimize the risks faced by human soldiers and officers by handling the most hazardous tasks. Equipped with advanced sensors, cameras, and communication systems, CQB robots provide operators with a comprehensive understanding of their environment. Their ability to navigate narrow corridors, stairwells, and cluttered rooms makes them ideal for urban combat and building searches. By relaying live video and audio feeds back to the control center, these robots enable real-time decision-making and seamless coordination with attack squads.
Modern CQB robots are further enhanced by artificial intelligence (AI) and machine learning algorithms, which boost their autonomous capabilities. These technologies allow robots to recognize and respond to threats, navigate complex environments, and communicate effectively with both robotic and human team members.
The 2D Indoor CQB Robot Simulation program is a simulation tool designed to configure various CQB scenarios, aiding in the improvement of the robot's autopilot, enemy search, and prediction algorithms. The program consists of two main components: the CQB Scenario Tactical Board Editor and the Situation Simulation Viewer. The Tactical Board Editor allows users to create and configure CQB scenarios, while the Situation Simulation Viewer simulates how the CQB robot utilizes its sensors for environmental visualization, enemy search, and prediction in real-world situations. The program's main UI is shown below:
CQB Scenario Tactical Board Editor Introduction
The CQB Scenario Tactical Board Editor allows users to create and configure CQB scenarios with the following steps:
After finishing the configuration of a CQB scenario, users can save it to a file for future use, allowing them to load and modify it as needed.
CQB Situation Simulation Viewer Introduction
The Situation Simulation Viewer replicates real-world conditions as the robot follows the defined enemy search path. The viewer supports both autonomous robot operation and manual control, enabling users to simulate different operational scenarios. It generates real-time sensor data based on the floor blueprint and enemy configuration, such as:
The viewer also visualizes the robot's enemy prediction results. During the simulation, users can step forward or backward through the scenario to refine the enemy search path and improve the robot's performance.
Use Case and Future Work
In the future, we plan to integrate AI into the enemy strategy configuration, making enemy actions and interactions with the environment more realistic and "human-like." Additionally, we aim to use this program to train AI models to enhance enemy prediction and optimize search paths. This could have applications in computer games or even real-world CQB combat decision-making.
System Design
The program consists of several subsystems, each with key features that contribute to the overall simulation. This section introduces and details the design of these subsystems, including the CQB environment simulation, CQB robot sensor simulation, enemy detection, and prediction algorithm design.
CQB Environment Simulation Design
Before simulating the CQB robot's operations, it is essential to accurately build the environment from the building's floor blueprint. This enables the robot's sensors to "interact" with the environment as they would in the real world. This section explains how we construct the environment using the building blueprint and convert it into a map matrix through image visualization analysis. There are three main steps involved in this process:
Step 1: Establish the Floor Blueprint Coordinate System Using UWB Position Amplifiers
Typically, the attack squad deploys three UWB (Ultra-Wideband) position amplifiers in a right-angled triangle configuration to cover the building area. We set the positions of these three UWB amplifiers as the origin (0,0), (max(x), 0), and (0, max(y)) of our blueprint matrix map. By scaling the loaded blueprint image to fit within this coordinate system, we ensure that the robot's location identification and the building environment are aligned within the same 2D coordinate system. The steps workflow is shown below:
This alignment allows for precise interaction between the robot's sensors and the simulated environment.
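To make this step concrete, below is a minimal sketch of how the blueprint image could be scaled into the UWB coordinate frame with OpenCV. The function name, file path, and grid size are illustrative assumptions, not the project's actual implementation:

import cv2

def align_blueprint(image_path, max_x, max_y):
    """Load a floor blueprint and scale it to the UWB coordinate system."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    # Resize so one pixel corresponds to one unit of the UWB coordinate frame
    # anchored at (0,0), (max_x, 0), and (0, max_y) by the three amplifiers.
    return cv2.resize(img, (max_x, max_y), interpolation=cv2.INTER_NEAREST)

# Example: a floor covered by a 400 x 250 UWB coordinate grid.
blueprint = align_blueprint('floorBluePrint/floor1.png', 400, 250)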
Step 2: Construct the Indoor Environment Map Matrix
Once the blueprint is correctly positioned and scaled within the coordinate system, we use computer vision (CV) techniques to convert the floor blueprint into a 2D matrix for simulation purposes, as illustrated below:
In this 2D matrix, different numerical values represent various materials or spaces within the environment (with material values ranging from 1 to 255). For example:
This matrix format enables the simulation to distinguish between different types of obstacles and open areas, allowing for accurate robot-environment interaction.
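As a hedged illustration of this conversion, the sketch below maps grayscale intensity bands of a white-background blueprint image to material codes with NumPy. The thresholds and material values are assumptions chosen for demonstration, not the project's actual mapping:

import numpy as np

# Illustrative material codes (materials use 1-255; 0 marks open space).
OPEN, GLASS, WOOD, WALL = 0, 50, 120, 255

def build_map_matrix(gray_img):
    """Convert a grayscale blueprint into a 2D material-value matrix."""
    matrix = np.full(gray_img.shape, OPEN, dtype=np.uint8)
    matrix[gray_img < 60] = WALL                            # darkest strokes
    matrix[(gray_img >= 60) & (gray_img < 140)] = WOOD      # medium strokes
    matrix[(gray_img >= 140) & (gray_img < 220)] = GLASS    # light strokes
    return matrix                 # pixels >= 220 (background) stay OPEN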
Step 3: Simulate Robot and Environment Interaction
After constructing the environment matrix, we develop an interaction module that simulates how the robot's sensors interact with the environment, mimicking real-world scenarios. An example of this interaction is shown below:
When the robot encounters different elements such as glass doors, wooden furniture, or walls, the interaction manager module traces the sensor's detection line from the robot's position, checking the material values in the matrix along the sensor's path. The detection continues until it encounters a material value that the sensor cannot penetrate, based on its settings. For instance, a sensor configured to see through glass will trace past a glass door's material value but stop when it reaches a wall's value.
This system allows for detailed and realistic simulation of sensor interactions, critical for testing and refining CQB robot strategies.
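A minimal sketch of this ray trace is shown below: the sensor line is stepped cell by cell through the matrix and stops at the first material value above the sensor's (assumed) penetration threshold. The function and parameter names are illustrative:

import math
import numpy as np

def trace_sensor_ray(matrix, x0, y0, angle_deg, max_range, penetrable_max):
    """Return (x, y, material) of the first blocking cell, or None if clear."""
    dx, dy = math.cos(math.radians(angle_deg)), math.sin(math.radians(angle_deg))
    for step in range(1, max_range + 1):
        x, y = int(x0 + dx * step), int(y0 + dy * step)
        if not (0 <= y < matrix.shape[0] and 0 <= x < matrix.shape[1]):
            return None                      # ray left the map
        material = matrix[y, x]
        if material > penetrable_max:        # sensor cannot see through this
            return (x, y, int(material))
    return None                              # nothing blocking within range

# Demo: an open 200 x 200 map with one wall segment (material value 255).
demo = np.zeros((200, 200), dtype=np.uint8)
demo[100, 120:160] = 255
print(trace_sensor_ray(demo, 100, 100, 0, 150, penetrable_max=60))
# -> (120, 100, 255): the ray stops at the wall 20 cells ahead.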
CQB Robot Sensor Simulation Design
The sensor system in Close Quarters Battle (CQB) robots is critical for navigating, detecting threats, and providing real-time intelligence in confined and potentially hostile environments. Typically, a CQB robot is equipped with eight types of sensors: Optical Sensors, Thermal Imaging Sensors, Proximity and Obstacle Detection Sensors, Environmental Sensors, Audio Sensors, Motion and Vibration Sensors, Communication and Signal Sensors, and Multispectral and Hyperspectral Sensors. These sensors enable the robot to map the environment, identify potential dangers, and make informed decisions.
In our system, we simulate five key types of sensors used on the robot, as shown below:
The usage and display of these sensors in the 2D scenario viewer are illustrated below:
The robot's enemy detection data processor integrates information from multiple sensors to form a comprehensive understanding of the environment. This processor analyzes sensor fusion data, providing the robot control team with both confirmed and predicted enemy positions, enhancing the accuracy of combined visual recognition and decision-making.
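As an illustration only, one possible shape for such a processor is sketched below, keeping confirmed camera + LIDAR fixes separate from audio-based predictions; the class and field names are assumptions, not the project's data model:

from dataclasses import dataclass, field

@dataclass
class EnemyTrack:
    pos: tuple          # (x, y) on the map matrix
    confirmed: bool     # True: camera + LIDAR fix; False: audio estimate
    confidence: float   # 0.0 - 1.0

@dataclass
class DetectionProcessor:
    tracks: list = field(default_factory=list)

    def add_camera_lidar_fix(self, pos):
        self.tracks.append(EnemyTrack(pos, confirmed=True, confidence=0.95))

    def add_audio_estimate(self, pos):
        self.tracks.append(EnemyTrack(pos, confirmed=False, confidence=0.5))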
Enemy Detection and Prediction Design
Our system simulates the process of enemy detection and prediction during the robot's enemy search.
Detecting the Enemy Position
To detect enemy positions on the map matrix, we utilize the camera and LIDAR sensors. The camera continuously scans the area in front of the robot to identify any enemy pixels. Once the camera detects an enemy, it sends the direction data to the detection data processor module. Since the camera cannot measure distance, the processor then instructs the LIDAR to scan in that direction to determine the distance to the detected object (enemy). Using the robot's own position, enemy direction, and distance data, the processor calculates the enemy's precise location on the map.
The workflow for enemy detection is illustrated below:
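To complement the workflow, here is a minimal sketch of the final position calculation: the LIDAR range is projected along the camera bearing from the robot's own position. The names and example numbers are illustrative assumptions:

import math

def locate_enemy(robot_pos, bearing_deg, lidar_distance):
    """Project the LIDAR range along the camera bearing from the robot."""
    rx, ry = robot_pos
    ex = rx + lidar_distance * math.cos(math.radians(bearing_deg))
    ey = ry + lidar_distance * math.sin(math.radians(bearing_deg))
    return (ex, ey)

# Example: robot at (120, 80), enemy bearing 30 degrees, LIDAR range 45 units.
print(locate_enemy((120, 80), 30, 45))   # -> (~158.97, ~102.5)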
Predicting the Enemy Position
To predict the enemy’s position, we use the 360° low-frequency sound microphone array. This audio sensor captures the direction of the sound source. As the robot moves, the system records its trajectory and, combined with the sound source direction data, the enemy data processor estimates the enemy's approximate position, even if they are behind obstacles.
The enemy prediction workflow is shown below:
During the prediction calculation process, we have the robot's trajectory distance (X) from Time-T0 to Time-T1, the enemy sound direction angle (a) at Time-T0 relative to the robot's position (Pos-0), and the enemy sound direction angle (b) at Time-T1 relative to the robot's position (Pos-1).
tan(a) = Z/(X+Y)
tan(b) = Z/Y
Solving these two equations gives Y = X*tan(a) / (tan(b) - tan(a)) and Z = Y*tan(b). Combined with the robot's position and heading at Pos-1, these distances determine the enemy's predicted position.
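The triangulation can be sketched in a few lines of Python, assuming the robot moves in a straight line from Pos-0 to Pos-1 and both angles are measured relative to the direction of travel; the function name is illustrative:

import math

def predict_enemy_offsets(X, a_deg, b_deg):
    """Return (Y, Z): along-track and lateral offsets of the enemy from Pos-1."""
    ta, tb = math.tan(math.radians(a_deg)), math.tan(math.radians(b_deg))
    if abs(tb - ta) < 1e-9:
        raise ValueError("Bearings too close: position is not resolvable")
    Y = X * ta / (tb - ta)    # from tan(a)*(X+Y) = tan(b)*Y
    Z = Y * tb
    return Y, Z

# Example: the robot travels X = 10 units; the sound bearing is 30 degrees
# at Pos-0 and 60 degrees at Pos-1 -> the enemy is 5.0 units ahead of Pos-1
# and about 8.66 units to the side.
print(predict_enemy_offsets(10, 30, 60))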
System Setup
For the system setup, please refer to the setup section in the project README file: https://github.com/LiuYuancheng/2D_Indoor_CQB_Simulator/blob/main/README.md
Program Execution and Usage
This section explains how to execute and use the program.
Program Execution
Before running the program, the user needs to set up the configuration file. Start by renaming Config_template.txt to Config.txt, then follow the comments within the file to set the required parameters, as shown in the example below:
# This is the config file template for the module <2DCQBSimuRun.py>
# Set up the parameters with the format below (every line follows the <key>:<val>
# format; the key cannot be changed):
#-----------------------------------------------------------------------------
# Test mode:
TEST_MD:True
# Init the building floor blueprint directory
BP_DIR:floorBluePrint
# Init the prediction heat map directory
HM_DIR:heatmap
TEST_HM:transparent_image.png
# Scenario file directory
SC_DIR:scenario
# Flag to scale the image or not
SCALE_IMG:False
Once the configuration is complete, you can run the program by either double-clicking runApp.bat or navigating to the src folder and executing the program via the command line:
python 2DCQBSimuRun.py
The program's start UI will appear as shown below:
Program Usage
After launching the program, the user interface will appear as described in the introduction. For detailed usage instructions, please refer to the User Manual Document (https://github.com/LiuYuancheng/2D_Indoor_CQB_Simulator/blob/main/doc/UserManual.md). The UI contains 12 functional panels, as shown below:
Each panel has a specific function:
Reference
Project link: https://github.com/LiuYuancheng/2D_Indoor_CQB_Simulator
Thanks for taking the time to read this article. If you have any questions or suggestions, or if you find any program bugs, please feel free to message me. Comments and improvement advice are always welcome so we can make our work better ~