
VSLAM, Navigation and Sensor fusion using ADI ToF and IMU with Nvidia AGX Orin


About

This site is meant for robotics developers who want to perform autonomous mobile robot (AMR) navigation on their ROS2-enabled robots using ROS2 packages, the EVAL-ADTF3175D-NXZ Time of Flight sensor, the ADIS16470 Inertial Measurement Unit, and the NVIDIA Jetson AGX Orin Developer Kit. It contains information on hardware and software requirements, setup instructions for the individual components, instructions for interfacing the sensors with the AGX Orin kit, and tutorials on robot navigation using RTAB-Map and sensor fusion.

Table of Contents

[Overview System](https://github.com/ArrowElectronics/Robotics_AMR/wiki/Home/#overview-system) briefly describes the system block diagram, which shows how the various parts are interconnected.

Before running the full navigation pipeline, one must set up the prerequisites for the AGX Orin kit, the ADTF3175 module, and the ADIS16470 IMU. Information on the ROS2 setup and the required software dependencies can be found in Software Requirements; we advise following this step carefully to avoid dependency errors later. Setup for ToF describes the standard method of flashing an image onto the ADTF3175. Connecting the ADIS16470 inertial measurement unit to the AGX Orin over USB requires an additional connection; Setup for IMU provides the specific setup instructions for the ADIS16470.

To access the IR and depth data from the ADTF3175 on the AGX Orin, follow the instructions in Interfacing ToF With Jetson Orin Kit. The data is exposed as ROS2 topics, making it simple to integrate with ROS2 software stacks.

Similarly, by following the instructions in Interfacing IMU With Jetson Orin Kit, the ADIS16470 IMU data can be accessed as ROS2 topics.
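Once both drivers are set up, a quick subscriber can confirm that the sensor data is flowing. Below is a minimal rclpy sketch; the topic names (`/cam1/depth_image`, `/cam1/ir_image`, `/imu`) are assumptions and should be replaced with the names reported by `ros2 topic list` on your system.

```python
# Minimal sketch: verify that ToF and IMU data are visible as ROS2 topics.
# Topic names below are assumptions; adjust them to your configuration.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image, Imu


class SensorCheck(Node):
    def __init__(self):
        super().__init__('sensor_check')
        # Depth and IR frames published by the ADTF3175 ROS node (bridged to ROS2)
        self.create_subscription(Image, '/cam1/depth_image', self.on_depth, 10)
        self.create_subscription(Image, '/cam1/ir_image', self.on_ir, 10)
        # Inertial data published by the ADIS16470 driver node
        self.create_subscription(Imu, '/imu', self.on_imu, 10)

    def on_depth(self, msg):
        self.get_logger().info(f'Depth frame: {msg.width}x{msg.height}')

    def on_ir(self, msg):
        self.get_logger().info(f'IR frame: {msg.width}x{msg.height}')

    def on_imu(self, msg):
        self.get_logger().info(f'IMU angular rate z: {msg.angular_velocity.z:.3f}')


def main():
    rclpy.init()
    rclpy.spin(SensorCheck())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```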

Creating a map of its surroundings is critical for an AMR to perform autonomous navigation. In our case, we used the well-known RTAB-Map algorithm to create a map from the Time-of-Flight sensor data. The RTAB-Map library is available as ROS2 packages. Follow the instructions in section 6 to run RTAB-Map with only the ADTF3175 sensor.
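For illustration only, the following Python launch sketch shows how an RTAB-Map node might be started against the bridged ToF topics. The package and executable names, parameters, and topic remappings are assumptions that vary with the installed rtabmap_ros version; follow the wiki section above for the exact setup.

```python
# Minimal launch sketch (assumptions): start RTAB-Map with ToF depth + IR input
# and external odometry. Names and remappings depend on your rtabmap_ros version.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        Node(
            package='rtabmap_slam',          # 'rtabmap_ros' on older releases
            executable='rtabmap',
            output='screen',
            parameters=[{
                'frame_id': 'base_link',
                'subscribe_depth': True,
                'approx_sync': True,
            }],
            remappings=[
                ('rgb/image', '/cam1/ir_image'),        # assumed ToF IR topic
                ('depth/image', '/cam1/depth_image'),   # assumed ToF depth topic
                ('rgb/camera_info', '/cam1/camera_info'),
                ('odom', '/odom'),                      # external odometry source
            ],
        ),
    ])
```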

Navigation With ToF Tutorial on AMR and Sensor Fusion and Navigation With ToF Tutorial on AMR provide comprehensive tutorials on autonomous navigation performed on the EIC-developed AMR. This AMR is controlled by the AGX Orin, with the ADTF3175 and ADIS16470 connected as external sensors.

Customers who have their own custom AMR will find a dedicated guide in Working With Non-EIC AMR on how to integrate their AMR with the system described here. That section also discusses the software changes needed to customize the system to their requirements.

Overview System

Figure 1 shows a high-level system block diagram of a ToF-based AMR navigation system. The entire software stack is built on the Robot Operating System (ROS). In our AMR, the AGX Orin kit serves as the core processor: it runs all the ROS2 Humble nodes inside a Docker container and controls the AMR's motion. The ADTF3175 and ADIS16470 are connected to the AGX Orin as external sensors. The ADTF3175 is equipped with an i.MX8 embedded board that includes the sensor driver and ROS node packages; it is connected to the AGX Orin over USB and publishes all of its topics in ROS1 format.

Figure 1: AMR Navigation system block diagram

Because the rest of the system was developed in ROS2, we use the ros1_bridge package on the AGX Orin to bridge the ToF's ROS1 topics into ROS2. The other ROS2 Humble nodes used for navigation, including the RTAB-Map node, the robot_localization node, the Nav2 node, and the IMU driver node, run inside the ROS2 Humble Docker container, as shown in Figure 1.
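As a reference, a minimal launch sketch for starting the bridge is shown below. It assumes the ros1_bridge package is installed in the container and that a ROS1 master is reachable through ROS_MASTER_URI.

```python
# Minimal sketch: start ros1_bridge so the ToF node's ROS1 topics become
# visible in ROS2. Bridges every topic with matching message types on both sides.
from launch import LaunchDescription
from launch.actions import ExecuteProcess


def generate_launch_description():
    return LaunchDescription([
        ExecuteProcess(
            cmd=['ros2', 'run', 'ros1_bridge', 'dynamic_bridge',
                 '--bridge-all-topics'],
            output='screen',
        ),
    ])
```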

In the EIC-developed AMR, low-level motor control and wheel odometry are handled through micro-ROS. A micro-ROS agent running on the Jetson AGX Orin communicates with a microcontroller over a serial interface; the microcontroller forwards the control signals from the Jetson to the motor driver and returns wheel encoder feedback to the Jetson. This setup applies only to the EIC-developed AMR; for a custom AMR, the mechanism for low-level motor control and wheel encoder feedback may differ. For details, refer to the blog: https://www.einfochips.com/gtc2023/control-your-robot-using-micro-ros-and-nvidia-jetson-kit/
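For the EIC AMR, starting the agent can look roughly like the sketch below; the serial device path and baud rate are assumptions and must match your microcontroller setup.

```python
# Minimal sketch (EIC AMR only): launch the micro-ROS agent over a serial link
# to the motor-control microcontroller. Device path and baud rate are assumptions.
from launch import LaunchDescription
from launch.actions import ExecuteProcess


def generate_launch_description():
    return LaunchDescription([
        ExecuteProcess(
            cmd=['ros2', 'run', 'micro_ros_agent', 'micro_ros_agent',
                 'serial', '--dev', '/dev/ttyACM0', '-b', '115200'],
            output='screen',
        ),
    ])
```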

Please note that RTAB-Map takes external odometry as input for its SLAM (Simultaneous Localization and Mapping) algorithm. In the EIC-developed AMR, we used wheel odometry as the external odometry source and improved it through sensor fusion, combining the IMU data with the wheel odometry using the robot_localization (EKF) package.
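A minimal sketch of such an EKF configuration is shown below. The topic names (`/odom`, `/imu`) and the fused-state selections are assumptions; consult the robot_localization documentation and the sensor fusion tutorial above for the values used on the EIC AMR.

```python
# Minimal sketch: fuse wheel odometry with IMU data using the robot_localization
# EKF node. Topic names and the boolean state-selection arrays are assumptions.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        Node(
            package='robot_localization',
            executable='ekf_node',
            name='ekf_filter_node',
            output='screen',
            parameters=[{
                'frequency': 30.0,
                'two_d_mode': True,            # planar robot: ignore roll/pitch/z
                'publish_tf': True,
                'odom_frame': 'odom',
                'base_link_frame': 'base_link',
                'world_frame': 'odom',
                # Wheel odometry: fuse x, y position and yaw
                'odom0': '/odom',
                'odom0_config': [True, True, False,
                                 False, False, True,
                                 False, False, False,
                                 False, False, False,
                                 False, False, False],
                # IMU: fuse yaw, yaw rate, and x acceleration
                'imu0': '/imu',
                'imu0_config': [False, False, False,
                                False, False, True,
                                False, False, False,
                                False, False, True,
                                True, False, False],
            }],
        ),
    ])
```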
