This paper proposes a method of using omnidirectional images as visual input to the ORB-SLAM2 system. With omnidirectional input, the system can track ORB feature points across the full 360° field of view, so the result is much more robust than that of traditional monocular SLAM systems because more ORB feature points can be extracted. The main task in implementing this scheme lies in processing the distorted omnidirectional image. In this paper, we describe an algorithm for turning omnidirectional images into correct inputs for ORB-SLAM2, comprising ORB feature extraction on the sphere surface, area selection, and image transformation. Additionally, to reduce the algorithm's time cost, we add further methods such as pre-rotation estimation and an optimization of feature point matching.
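
The abstract gives no implementation details, but the general idea of re-projecting a distorted omnidirectional image into an undistorted view before ORB feature extraction can be sketched roughly as follows. This is only an illustrative sketch, assuming an equirectangular panorama and an OpenCV-based pipeline; the function and parameter names (equirect_to_perspective, fov_deg, yaw_deg, pitch_deg, out_size) and the input file are hypothetical and are not taken from the paper.

import numpy as np
import cv2

def equirect_to_perspective(pano, fov_deg=90.0, yaw_deg=0.0, pitch_deg=0.0, out_size=512):
    # Back-project the pixels of a virtual pinhole camera onto the unit sphere,
    # then sample the equirectangular panorama at the corresponding angles.
    h, w = pano.shape[:2]
    f = 0.5 * out_size / np.tan(np.radians(fov_deg) / 2.0)   # pinhole focal length
    x, y = np.meshgrid(np.arange(out_size) - out_size / 2.0,
                       np.arange(out_size) - out_size / 2.0)
    rays = np.stack([x, y, np.full_like(x, f)], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)      # unit viewing rays
    # Rotate the rays toward the requested viewing direction (yaw about Y, pitch about X).
    Ry, _ = cv2.Rodrigues(np.array([0.0, np.radians(yaw_deg), 0.0]))
    Rx, _ = cv2.Rodrigues(np.array([np.radians(pitch_deg), 0.0, 0.0]))
    rays = rays @ (Ry @ Rx).T
    # Spherical angles -> equirectangular pixel coordinates.
    lon = np.arctan2(rays[..., 0], rays[..., 2])              # [-pi, pi]
    lat = np.arcsin(np.clip(rays[..., 1], -1.0, 1.0))         # [-pi/2, pi/2]
    map_x = ((lon / np.pi + 1.0) * 0.5 * w).astype(np.float32)
    map_y = ((lat / (np.pi / 2.0) + 1.0) * 0.5 * h).astype(np.float32)
    return cv2.remap(pano, map_x, map_y, cv2.INTER_LINEAR, borderMode=cv2.BORDER_WRAP)

pano = cv2.imread("panorama.jpg")                 # hypothetical equirectangular input
view = equirect_to_perspective(pano, yaw_deg=45.0)
gray = cv2.cvtColor(view, cv2.COLOR_BGR2GRAY)
orb = cv2.ORB_create(nfeatures=1000)
kps, des = orb.detectAndCompute(gray, None)       # ORB features that could feed ORB-SLAM2

The paper's own pipeline (sphere-surface ORB extraction, area selection, pre-rotation estimation) would replace this naive per-view re-projection; the sketch only shows how distorted omnidirectional pixels can be mapped onto the unit sphere and back to an undistorted view that an ORB front end can consume.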


    Title:

    Research on omnidirectional ORB-SLAM2 for mobile robots


    Contributors:

    Wang, Yule (author) / Yang, Xubo (author)


    Publication date:

    2018-08-01


    Size:

    464272 bytes


    Type of media:

    Conference paper


    Type of material:

    Electronic Resource


    Language:

    English



    Dynamic Omnidirectional Vision For Mobile Robots

    Cao, Zuo L. / Oh, Sung J. / Hall, Ernest L. | SPIE | 1985


    Omnidirectional Position Location For Mobile Robots

    Ehtashami, Mohammad / Oh, Sung J. / Hall, Ernest L. | SPIE | 1985


    Application of an Improved Fast Corner Detection Algorithm in ORB-SLAM2

    Huo, Xiaochuang / Zhang, Lei / Liu, Guitao et al. | Springer Verlag | 2022


    Localization of Mobile Robots with Omnidirectional Cameras

    Tatsuya Kato / Masanobu Nagata / Hidetoshi Nakashima et al. | BASE | 2014