The objective of this project was to build a 3D map of the fifth floor of Upson Hall using a lidar, a ZED stereo camera, and the Jackal robot. We used RTAB-Map to integrate the odometry, depth, and RGB data from the different sensors, and visualized the result in RViz.
This diagram explains how the data from each component gets combined into a 3D map.
The system can be divided into three main components: the lidar, the Jackal, and the ZED camera. The ZED camera was added for loop-closure detection based on its RGB-D images, but it did not improve the system significantly. The mapping is primarily done by combining the pose and odometry published by the Jackal with the point clouds published by the lidar. RTAB-Map integrates the data from these two sources (or three, including the ZED) into a 3D map, as shown in the demonstration video.
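The sensor fusion described above can be sketched as a launch file that includes rtabmap_ros's standard `rtabmap.launch` and remaps its inputs. This is a minimal illustration, not the project's actual launch file: the topic names (`/odometry/filtered`, `/velodyne_points`, and the ZED topics) and the `base_link` frame are assumptions that would need to match the robot's actual configuration.

```xml
<!-- Illustrative only: fuse Jackal odometry, lidar point clouds, and the ZED
     RGB-D stream in RTAB-Map. Topic and frame names are placeholders. -->
<launch>
  <include file="$(find rtabmap_ros)/launch/rtabmap.launch">
    <arg name="rtabmap_args"         value="--delete_db_on_start"/>
    <arg name="frame_id"             value="base_link"/>
    <!-- Pose/odometry published by the Jackal -->
    <arg name="odom_topic"           value="/odometry/filtered"/>
    <!-- Point clouds published by the lidar -->
    <arg name="subscribe_scan_cloud" value="true"/>
    <arg name="scan_cloud_topic"     value="/velodyne_points"/>
    <!-- RGB-D images from the ZED, used for loop-closure detection -->
    <arg name="rgb_topic"            value="/zed/zed_node/rgb/image_rect_color"/>
    <arg name="depth_topic"          value="/zed/zed_node/depth/depth_registered"/>
    <arg name="camera_info_topic"    value="/zed/zed_node/rgb/camera_info"/>
    <!-- Sensors are not hardware-synchronized, so use approximate sync -->
    <arg name="approx_sync"          value="true"/>
  </include>
</launch>
```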
Many parameters in the ZED stereo camera and RTAB-Map launch files were tuned to improve the quality of the mapping process. The list of tuned parameters and the effects of the changes can be found here:
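As an example of the kind of tuning involved, RTAB-Map exposes its internal parameters as string-valued ROS parameters. The names below are real RTAB-Map parameters, but the values shown are illustrative defaults-style examples, not the values used in this project:

```xml
<!-- Illustrative RTAB-Map parameter overrides (values are examples only) -->
<param name="Reg/Strategy"          type="string" value="1"/>    <!-- 0=visual, 1=ICP, 2=visual+ICP -->
<param name="Grid/CellSize"         type="string" value="0.05"/> <!-- occupancy-grid resolution in meters -->
<param name="RGBD/ProximityBySpace" type="string" value="true"/> <!-- find local loop closures by position -->
```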
"LINK-will be added soon"
http://wiki.ros.org/rtabmap_ros/Tutorials/HandHeldMapping
https://github.com/introlab/rtabmap/wiki/Tutorials
http://wiki.ros.org/rtabmap_ros/Tutorials/SetupOnYourRobot