This paper extends a monocular visual simultaneous localization and mapping (SLAM) system to utilize two cameras with non-overlapping fields of view (FOVs), and uses it to enable autonomous navigation of a micro aerial vehicle (MAV) in unknown environments. The methodology behind this system can easily be extended to multi-camera rigs if the onboard computational capability allows. We analyze the iterative optimizations used for pose tracking and map refinement of the SLAM system in the multi-camera case, which ensures the soundness and accuracy of each optimization update. Our method is more resistant to tracking failure than conventional monocular visual SLAM systems, especially when MAVs fly in complex environments, and it brings more flexibility to the configuration of multiple cameras onboard MAVs. We demonstrate its efficiency with both autonomous flight and manual flight of a MAV, and evaluate the results by comparison with ground-truth data provided by an external tracking system.
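As a minimal sketch of the multi-camera pose-tracking objective the abstract refers to (our notation, not taken from the record): assuming a body pose $T_{WB}$, fixed known extrinsics $T_{BC_i}$ for camera $i \in \{1, 2\}$, world map points $p_j$ observed at pixel locations $u_{ij}$, a per-camera projection function $\pi_i$, and a robust kernel $\rho$, each iterative tracking update would solve

\[
T_{WB}^{*} = \arg\min_{T_{WB}} \sum_{i=1}^{2} \sum_{j \in \mathcal{O}_i} \rho\!\left( \left\lVert u_{ij} - \pi_i\!\left( T_{BC_i}^{-1}\, T_{WB}^{-1}\, p_j \right) \right\rVert^2 \right),
\]

where $\mathcal{O}_i$ denotes the set of map points visible in camera $i$. Map refinement would additionally optimize over the points $p_j$, as in bundle adjustment. Because the two FOVs do not overlap, each point contributes a residual in only one camera, but all residuals jointly constrain the single body pose $T_{WB}$.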
Visual SLAM for autonomous MAVs with dual cameras
2014-05-01
1746019 bytes
Conference paper
Electronic Resource
English
Similar items:
Visual SLAM for autonomous navigation of MAVs (TIBKAT | 2016)
Highly Accurate SLAM for Rotary Wing MAVs (British Library Conference Proceedings | 2012)
Visual SLAM for Autonomous Ground Vehicles (Tema Archive | 2011)