Autonomous rendezvous and docking (AR&D) is a critical need for manned spaceflight, especially in deep space, where communication delays essentially leave crews on their own for critical operations like docking. Previously developed AR&D sensors are large, heavy, and power-hungry, and some (e.g., Flash LiDAR) still require further development. Other approaches to vision-based navigation are not computationally efficient enough to run quickly on slower, flight-like computers. The key technical challenge for visual odometry is adapting it from the terrestrial applications it was designed for to the harsh lighting conditions of space. This effort leveraged Draper Laboratory's considerable prior development and expertise, benefiting both parties. The algorithm Draper has created differs from other pose estimation efforts in that it has a comparatively small computational footprint (suitable for use onboard a spacecraft, unlike alternatives) and potentially offers the accuracy and precision needed for docking. This presents a solution to the AR&D problem that requires only a camera, which is much smaller, lighter, and far less power-hungry than competing AR&D sensors. We have demonstrated the algorithm's performance and its ability to process 'flight-like' imagery formats on a 'flight-like' trajectory, positioning us to readily process flight data from the upcoming 'ISS Selfie' activity and then compare the algorithm's quantified performance on that data against its performance on the simulated imagery. This will bring visual odometry beyond TRL 5, proving its readiness for demonstration as part of an integrated system. Once beyond TRL 5, visual odometry will be poised for an in-space demonstration as part of a system where relative pose is critical, such as Orion AR&D, ISS robotic operations, asteroid proximity operations, and more.
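
The abstract does not detail Draper's algorithm, but the camera-only relative-pose idea it describes can be illustrated with textbook frame-to-frame visual odometry. The sketch below is a generic illustration, not Draper's flight code: it assumes OpenCV, two hypothetical consecutive camera frames (frame0.png, frame1.png), and a placeholder intrinsic matrix K, and it recovers the relative rotation and translation direction between frames by matching features and decomposing a RANSAC-estimated essential matrix.

    # Illustrative frame-to-frame visual odometry sketch (generic textbook
    # approach; not Draper's algorithm). Filenames and intrinsics are
    # placeholders.
    import numpy as np
    import cv2

    # Placeholder pinhole camera intrinsics; a real system uses calibrated values.
    K = np.array([[800.0,   0.0, 320.0],
                  [  0.0, 800.0, 240.0],
                  [  0.0,   0.0,   1.0]])

    # Two hypothetical consecutive frames from the navigation camera.
    img0 = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)
    img1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)

    # Detect and describe features; ORB keeps the computational footprint
    # small, in the spirit of running on slower, flight-like computers.
    orb = cv2.ORB_create(nfeatures=1000)
    kp0, des0 = orb.detectAndCompute(img0, None)
    kp1, des1 = orb.detectAndCompute(img1, None)

    # Match descriptors between frames; cross-checking culls bad matches,
    # which matters under harsh, high-contrast space lighting.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des0, des1), key=lambda m: m.distance)
    pts0 = np.float32([kp0[m.queryIdx].pt for m in matches])
    pts1 = np.float32([kp1[m.trainIdx].pt for m in matches])

    # Estimate the essential matrix with RANSAC and decompose it into the
    # relative rotation R and unit translation direction t between frames.
    E, mask = cv2.findEssentialMat(pts0, pts1, K, method=cv2.RANSAC,
                                   prob=0.999, threshold=1.0)
    _, R, t, mask = cv2.recoverPose(E, pts0, pts1, K, mask=mask)

    print("Relative rotation:\n", R)
    print("Translation direction:\n", t.ravel())

Note that a single monocular image pair yields translation only up to scale; a flight implementation would have to recover scale from additional information (e.g., known target geometry or other navigation measurements) before such a pose could be used for docking.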



    Title:

    Visual Odometry for Autonomous Deep-Space Navigation Project


    Contributors:

    Robinson, Shane / Pedrotty, Sam


    Conference:

    JTWG IRAD Meeting ; 2016 ; Houston, TX, United States


    Publication date:

    2016-11-02


    Type of media:

    Conference paper


    Type of material:

    No indication


    Language:

    English




    Similar titles:

    Visual Odometry for Autonomous Deep-Space Navigation Project

    Robinson, Shane / Pedrotty, Sam | NTRS | 2016


    Visual Odometry for Autonomous Deep-Space Navigation

    Robinson, Shane / Pedrotty, Sam | NTRS | 2016


    Autonomous Navigation Using Visual Odometry

    Liao, Miao / Li, Ming / Hong, Soonhac | European Patent Office | 2017 | Free access


    Fusing Optimal Odometry Calibration and Partial Visual Odometry via a Particle Filter for Autonomous Vehicles Navigation

    S.A. Ávila-Martínez / J.C. Martínez Romo / F.J. Luna Rosas et al. | BASE | 2021 | Free access