The ability to localize itself and map its surroundings at the same time is crucial for any autonomous robot that must adapt to its environment. In robotics, this technique is known as SLAM (Simultaneous Localization And Mapping), and it can be implemented with several algorithms for 2D or 3D environments, using different types of input sensors (lasers, sonars, odometry, webcams, etc.).

Although the Q.bo robot is already equipped with two HD webcams and can carry up to four ultrasound sensors (two in the front and two in the rear), our team wanted to go much further by mounting ASUS’s Xtion Pro Live sensor on top of Q.bo’s head. This is accomplished via an adapter created for Q.bo by Thecorpora’s engineers to our own measurements. We chose this ASUS sensor over similar ones mainly because of its small size and low weight, which let it fit Q.bo accurately once the adapter was made.

This sensor produces a 3D point cloud that, combined with the robot’s odometry and its built-in gyroscope, enables Q.bo to build maps, model objects in 3D, and localize itself autonomously in real time. This system provides a more accurate and sophisticated form of visual perception than the stereoscopic cameras or the ultrasound sensors alone. However, using all of these systems together (ultrasounds, webcams and Xtion) yields much more complete information than using any of them separately.
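
For readers who want to experiment, the sketch below shows one way to consume that point cloud from ROS with a minimal Python node. The topic name /camera/depth/points is an assumption (it is the usual default of the ROS OpenNI drivers); your launch files may remap it.

```python
#!/usr/bin/env python
# Minimal sketch of reading the Xtion's point cloud in ROS.
# The topic name /camera/depth/points is an assumption: it is the
# usual default of the ROS OpenNI drivers, but setups may remap it.
import rospy
from sensor_msgs.msg import PointCloud2
import sensor_msgs.point_cloud2 as pc2

def on_cloud(cloud):
    # read_points yields (x, y, z) tuples; skip_nans drops invalid returns
    points = list(pc2.read_points(cloud, field_names=("x", "y", "z"),
                                  skip_nans=True))
    rospy.loginfo("Received a cloud with %d valid points", len(points))

if __name__ == "__main__":
    rospy.init_node("xtion_cloud_listener")
    rospy.Subscriber("/camera/depth/points", PointCloud2, on_cloud,
                     queue_size=1)
    rospy.spin()
```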

The video can be divided into the following chapters, three of which show experiments made with Q.bo and the Xtion Pro Live sensor:

Chapter I:

Mounting the Xtion Pro Live sensor over a prototype mold designed by Thecorpora’s team.

Chapter II:

Real-time 3D visualization of the point cloud produced by the Xtion Pro Live. We used the ROS visualization tool RViz to view Q.bo’s 3D model on a desktop machine with an NVIDIA GeForce GTX 295 GPU.

Updated (January 26th, 2017)

An RViz/Gazebo model built by our friend Vincent Foucault (elpimous12) is ready to download here:

qbo robot by elpimous12

Chapter III:

SLAM (Simultaneous Localization And Mapping), in which the robot builds a 2D map of its environment using a laser scan derived from the depth data of the Xtion Pro Live sensor. For the SLAM algorithm we used the ROS package “gmapping”, developed by Giorgio Grisetti, Cyrill Stachniss and Wolfram Burgard.
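
In practice the Xtion does not output a laser scan directly: a package such as depthimage_to_laserscan or pointcloud_to_laserscan typically converts its depth data into the /scan topic that gmapping expects. While gmapping runs, it publishes the growing map on /map, and a small node like the hedged sketch below can monitor mapping progress. Topic and message names are the standard ones for gmapping; treat this as an illustration, not our exact setup.

```python
#!/usr/bin/env python
# Sketch: watch the occupancy grid that gmapping publishes on /map
# and report how much of it has been explored so far.
import rospy
from nav_msgs.msg import OccupancyGrid

def on_map(grid):
    known = sum(1 for cell in grid.data if cell != -1)  # -1 means unknown
    total = len(grid.data)
    rospy.loginfo("Map %dx%d @ %.3f m/cell, %.1f%% explored",
                  grid.info.width, grid.info.height,
                  grid.info.resolution, 100.0 * known / total)

if __name__ == "__main__":
    rospy.init_node("map_progress")
    rospy.Subscriber("/map", OccupancyGrid, on_map, queue_size=1)
    rospy.spin()
```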

Chapter IV:

Autonomous navigation, reusing the 2D map stored after the SLAM run. The initial location and the goal position of the Q.bo robot are set using the RViz visualization tool. For autonomous localization we used the “amcl” ROS package, developed by Brian P. Gerkey, which implements a particle filter-based localization algorithm that exploits the laser scan obtained from the Xtion Pro Live together with the 2D map. For the movement commands we used the ROS package “move_base”, developed by Eitan Marder-Eppstein, which implements global and local 2D motion planners that use the laser scan (derived from the Xtion Pro Live) to detect nearby obstacles.
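
Besides clicking goals in RViz, goals can also be sent to move_base programmatically through its actionlib interface. The following sketch sends a single goal one metre ahead in the map frame; the coordinates are arbitrary example values, not part of our experiment.

```python
#!/usr/bin/env python
# Sketch: send one navigation goal to move_base via actionlib.
# The 1 m forward goal in the "map" frame is an arbitrary example value.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

if __name__ == "__main__":
    rospy.init_node("send_nav_goal")
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = "map"
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = 1.0     # 1 m ahead in the map frame
    goal.target_pose.pose.orientation.w = 1.0  # facing along +x

    client.send_goal(goal)
    client.wait_for_result()
    rospy.loginfo("Navigation finished with state %d", client.get_state())
```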