For those who are not too familiar with robotics, these three words may sound a bit strange. However, they come up constantly in mobile robot platforms whose location must be known within a given space. Qbo has two 170 rpm motors, each fitted with an encoder that lets us read data from that motor. The encoders tell us, for example, how many complete turns each wheel makes, which allows the mobile platform to drive completely straight by means of a PID algorithm. The PID algorithm corrects the error between the value read by the encoder and the desired value. Thus, if we want the robot to drive perfectly straight, we read the values from both encoders, compute the correction, and send the corrected commands to the motors.
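To make the idea concrete, here is a minimal sketch of such a PID correction loop. The gains, the tick counts, and the sign convention are all illustrative assumptions, not values from Qbo's actual firmware.

```python
# Minimal PID sketch for driving straight from encoder feedback.
# Gains and tick values below are hypothetical, not Qbo's real ones.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        """Return a correction for the current error, dt seconds after the last call."""
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Error = left encoder ticks minus right encoder ticks over one control cycle.
# A positive correction would slow the left wheel / speed up the right one
# (convention assumed here for illustration).
pid = PID(kp=0.5, ki=0.1, kd=0.05)
correction = pid.update(error=4.0, dt=0.02)
```

In a real control loop `update` would be called at a fixed rate and the correction added to one wheel's speed command and subtracted from the other's.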
With the encoders we can also estimate the position of the mobile platform within a given area as it navigates through it. The study of this estimation is called odometry. Its implementation cost is low compared to other kinds of technology, but its accuracy degrades the longer the robot navigates, whether because the wheel diameters differ slightly, the wheels are misaligned, or simply because the ground is slippery.
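The estimation itself can be sketched with the standard dead-reckoning model for a differential-drive base. The wheel radius, wheel base, and encoder resolution below are placeholder values, not Qbo's specifications:

```python
import math

# Standard differential-drive odometry update from encoder tick deltas.
# All physical constants are illustrative placeholders.
WHEEL_RADIUS = 0.05    # metres (assumed)
WHEEL_BASE = 0.30      # distance between the two wheels, metres (assumed)
TICKS_PER_REV = 1000   # encoder resolution (assumed)

def update_pose(x, y, theta, d_ticks_left, d_ticks_right):
    """Advance the pose (x, y, theta) by one encoder reading."""
    dl = 2 * math.pi * WHEEL_RADIUS * d_ticks_left / TICKS_PER_REV
    dr = 2 * math.pi * WHEEL_RADIUS * d_ticks_right / TICKS_PER_REV
    d = (dl + dr) / 2.0              # distance travelled by the centre point
    dtheta = (dr - dl) / WHEEL_BASE  # change in heading
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    return x, y, theta + dtheta

# One full wheel revolution on each side: the robot moves straight ahead
# by one wheel circumference.
x, y, theta = update_pose(0.0, 0.0, 0.0, 1000, 1000)
```

Errors in this model (unequal wheel diameters, slip) accumulate with every update, which is exactly why the accuracy decreases over time as described above.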
SLAM (Simultaneous Localization and Mapping) is a technique that allows a robot to build a map of an unknown environment using sensors or artificial vision.
Qbo performs SLAM using stereoscopic vision with the two high-resolution cameras integrated in the robot, aided by the values obtained from odometry.
In the video you can see Qbo navigating in a completely autonomous way around the office. The blue arrow corresponds to the robot's position estimated using only odometry, and the yellow arrow represents the robot's position after correction by the Kalman filter. The images are those captured by both webcams: each green circle corresponds to a detected mark, and the green line joins two associated marks, which together represent a point in three-dimensional space. These points are added to the state of the Kalman filter to build the map of the environment.
Qbo's SLAM algorithm is divided into three parts.
The first task is to calculate the movement of the robot. To do that we use our robot's driver, which publishes an Odometry message.
The second task is to detect natural features in the images and estimate their positions in three-dimensional space. The algorithm used to detect the features is the GoodFeaturesToTrackDetector function from OpenCV. We then extract SURF descriptors for those features and match them with the BruteForceMatcher algorithm, also from OpenCV.
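What BruteForceMatcher does internally is easy to sketch in pure NumPy: for every query descriptor it finds the training descriptor at minimum distance. The tiny 2-dimensional descriptors below are toy data standing in for real SURF descriptors (which are 64- or 128-dimensional):

```python
import numpy as np

# Pure-NumPy sketch of brute-force descriptor matching: each query
# descriptor is paired with its nearest training descriptor by L2
# distance, which is what OpenCV's BruteForceMatcher does in C++.
def brute_force_match(query, train):
    """query: (N, D) array, train: (M, D) array -> list of (query_idx, train_idx)."""
    dists = np.linalg.norm(query[:, None, :] - train[None, :, :], axis=2)
    idx = dists.argmin(axis=1)    # best training match for each query row
    return list(zip(range(len(query)), idx.tolist()))

# Toy 2-D descriptors: each query row is closest to the "swapped" train row.
q = np.array([[0.0, 1.0], [1.0, 0.0]])
t = np.array([[1.0, 0.1], [0.1, 1.0]])
matches = brute_force_match(q, t)   # [(0, 1), (1, 0)]
```

In practice one would also apply a ratio or cross-check test to reject ambiguous matches before feeding them to the rest of the pipeline.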
We also track the matched points with the sparse iterative version of the Lucas-Kanade optical flow in pyramids, and we avoid looking for new features in places where we are already tracking one.
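The "avoid looking where we already track" idea is typically implemented with a detection mask: a small disc around every tracked point is blanked out so the detector only searches untracked regions. A minimal sketch, with the suppression radius chosen arbitrarily:

```python
import numpy as np

# Build a mask that forbids feature detection near points already being
# tracked. The 10-pixel radius is an assumption, not Qbo's actual value.
def detection_mask(shape, tracked_points, radius=10):
    """shape: (height, width); tracked_points: list of (x, y) pixel coords."""
    mask = np.full(shape, 255, dtype=np.uint8)     # 255 = detection allowed
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    for (px, py) in tracked_points:
        # Zero out a disc of the given radius around each tracked point.
        mask[(xs - px) ** 2 + (ys - py) ** 2 <= radius ** 2] = 0
    return mask

mask = detection_mask((100, 100), [(50, 50)], radius=10)
```

A mask like this can be passed as the `mask` argument of OpenCV's goodFeaturesToTrack, so new corners are only proposed in regions the tracker is not already covering.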
This node receives synchronized image messages and publishes a PointCloud message containing the position of each feature, its covariance along the three coordinates, and its SURF descriptor.
The third task is to implement an Extended Kalman Filter together with a data association algorithm based on the Mahalanobis distance between the PointCloud seen from the robot and the PointCloud of the map. To do that we read the Odometry and PointCloud messages, and as output we publish an Odometry message and a PointCloud message with the position of the robot and the features included in the map.
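The data-association step can be sketched as Mahalanobis gating: each observed point is matched to the map landmark whose innovation is smallest in Mahalanobis terms, and observations that fall outside the gate spawn new landmarks. The 9.21 gate (the 99% chi-square value for 2 degrees of freedom) is a common textbook choice, assumed here rather than taken from Qbo's code, and the 2-D points stand in for the full 3-D case:

```python
import numpy as np

# Mahalanobis-distance data association with chi-square gating.
# S is the innovation covariance from the EKF; the 9.21 gate and the
# toy 2-D landmarks are illustrative assumptions.
def associate(observation, map_points, S, gate=9.21):
    """Return the index of the matching landmark, or None to start a new one."""
    S_inv = np.linalg.inv(S)
    best, best_d2 = None, gate
    for i, m in enumerate(map_points):
        v = observation - m               # innovation (observed - predicted)
        d2 = float(v @ S_inv @ v)         # squared Mahalanobis distance
        if d2 < best_d2:
            best, best_d2 = i, d2
    return best

S = np.diag([0.04, 0.04])                 # assumed innovation covariance
landmarks = [np.array([1.0, 2.0]), np.array([3.0, 1.0])]
match = associate(np.array([1.05, 2.1]), landmarks, S)   # -> 0
```

Associated observations then drive the EKF update, while unassociated ones are appended to the filter state as new map features, which is how the map in the video grows over time.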