This article describes a generic approach to acquiring navigation capabilities for the standard platform of the IMAV indoor competition, the Parrot AR.Drone. Our development is partly based on simulation, which requires both a realistic sensor model and a realistic motion model. The AR.Drone simulation model is described and validated. Furthermore, this article describes how a visual map of the indoor environment can be built, including the effect of sensor noise. This visual map consists of a texture map and a feature map: the texture map is used for human navigation, while the feature map is used by the AR.Drone to localize itself. To this end, a localization method is presented. An experiment demonstrates how well the localization works under the circumstances encountered during the IMAV competition.
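The abstract's dual-map idea (a texture map for people, a feature map for the drone) can be illustrated with a minimal sketch. The class names, the descriptor matching, and the averaging step below are illustrative assumptions, not the article's actual method, which is detailed in the full text.

```python
# Minimal sketch of a visual map with a texture layer and a feature layer,
# plus a toy feature-based localization step. Purely illustrative; the
# article's own localization method differs.
import numpy as np

class VisualMap:
    def __init__(self):
        self.texture = None   # stitched image mosaic, used for human navigation
        self.features = []    # list of (2D map position, descriptor) pairs

    def add_feature(self, xy, descriptor):
        self.features.append((np.asarray(xy, float), np.asarray(descriptor, float)))

def localize(visual_map, observed):
    """Estimate a 2D map position from observed features.

    `observed` holds (camera-frame offset, descriptor) pairs. Each observation
    is matched to its nearest map descriptor, and the position estimate is the
    mean of (map position - camera offset) over all matches.
    """
    votes = []
    for offset, desc in observed:
        # nearest-neighbour descriptor match against the feature map
        dists = [np.linalg.norm(desc - d) for _, d in visual_map.features]
        map_xy, _ = visual_map.features[int(np.argmin(dists))]
        votes.append(map_xy - np.asarray(offset, float))
    return np.mean(votes, axis=0) if votes else None
```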
Integrating Sensor and Motion Models to Localize an Autonomous AR.Drone
International Journal of Micro Air Vehicles, Vol. 3, No. 4, pp. 183-200
01.12.2011
18 pages
Journal article
Electronic resource
English