How to build the SensorBox - Hardware Assembly (2)
Take a look at how everything fits into this small box.
We are developing a Visual AI research platform that we call the SensorBox™.
R&D (Research and Development) of a Visual AI (Artificial Intelligence) system requires a flexible embedded sensor suite that allows for experimentation and can be used in real environments. Available benchmark datasets can only get you so far, and may not even be representative of your particular application. Building such a sensor suite can be quite complex, and arguably is not really part of developing a software application.
Our goal is to build a research platform that can be used to develop state estimation, mapping, and scene understanding applications. Our sensor suite consists of stereo RGB cameras, an RGB-depth camera, a thermal camera, an ultrasonic range finder, a GNSS (Global Navigation Satellite System) receiver, IMUs (Inertial Measurement Units), a pressure sensor, a temperature sensor, and a power sensor. Our embedded processing platform consists of an Arduino Zero microcontroller and an NVIDIA Jetson Xavier NX. Our embedded power source is a USB-C power bank.
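In a split like this, the microcontroller typically samples the low-rate sensors and streams readings to the Jetson for processing. As a purely hypothetical sketch (the line format and sensor names below are illustrative assumptions, not the actual SensorBox protocol; see the GitHub repository for the real design), the Arduino Zero could emit one CSV line per reading over USB serial, which the Jetson side parses like this:

```python
# Hypothetical sketch: parsing CSV telemetry lines that a microcontroller
# (e.g. the Arduino Zero) might stream over USB serial to the Jetson.
# The "NAME,v1,v2,..." line format is an assumption for illustration only,
# not the actual SensorBox protocol.

def parse_telemetry(line: str) -> dict:
    """Parse one 'NAME,v1,v2,...' line into a tagged reading."""
    name, *values = line.strip().split(",")
    return {"sensor": name, "values": [float(v) for v in values]}

# Example: a hypothetical accelerometer reading (m/s^2)
reading = parse_telemetry("IMU,0.02,-0.01,9.81\n")
print(reading)  # {'sensor': 'IMU', 'values': [0.02, -0.01, 9.81]}
```

On real hardware, the lines would arrive from a serial port (for example via the pyserial library) rather than from a string literal; the parsing step stays the same.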
As a research platform, we aim for maximum flexibility and re-use. We leverage existing, popular ecosystems for robotics, computer vision, and artificial intelligence, in particular the NVIDIA Jetson embedded processing platform, SparkFun's microcontrollers and sensors, and Arducam's suite of cameras.
We provide a GitHub repository with all design and source code files, detailed documentation with step-by-step instructions, and YouTube videos on how to build the SensorBox™.
Here is our second video on how to assemble the SensorBox™.
aiWerkstatt™ provides the SensorBox™ as an open source product in the hope that it will give Visual AI researchers a jump start and that, when used by many, it will enable collaborations and create synergies.