SENSOR FUSION FOR VISUAL-INERTIAL ODOMETRY IN AUTONOMOUS SYSTEMS AND APPLICATIONS

In various examples, sensor fusion for visual-inertial odometry in autonomous and semi-autonomous systems and applications is described herein. Systems and methods are disclosed that split processing into at least two components. For example, a first component may be configured to process incoming frames, execute one or more perspective-n-point (PnP) techniques to determine states of a machine, update states associated with one or more inertial measurement unit (IMU) sensors of the machine, and add new frames to a map. A second component may be configured to adjust states (e.g., poses) associated with the machine using one or more sparse bundle adjustment (SBA) techniques, adjust points within an environment, and adjust IMU-related parameters using a history of camera states. In some examples, the PnP technique and/or the SBA technique may be selected based on states associated with the IMU sensor(s).
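To make the division of labor concrete, below is a minimal structural sketch in Python of how such a two-component pipeline could be organized. It is an illustration under stated assumptions, not the patented implementation: the names (VioMap, Frontend, Backend, select_pnp_flag), the use of OpenCV's solvePnPRansac as the PnP solver, the bias-convergence rule for choosing the PnP flag, and the stubbed bundle-adjustment step are all assumptions introduced here; the abstract does not specify these details.

    import queue
    import threading

    import cv2
    import numpy as np


    class VioMap:
        # Shared container for the history of camera states (illustrative).
        def __init__(self):
            self.lock = threading.Lock()
            self.camera_states = []  # list of (frame_id, rvec, tvec)


    def select_pnp_flag(imu_bias_converged):
        # Stand-in for "selecting the PnP technique based on IMU states"
        # (assumed rule): once bias estimates have converged, prefer an
        # iterative solver; before that, fall back to EPnP with no prior.
        return cv2.SOLVEPNP_ITERATIVE if imu_bias_converged else cv2.SOLVEPNP_EPNP


    class Frontend:
        # First component: per-frame PnP, IMU state update, map insertion.
        def __init__(self, vio_map, backend_queue, camera_matrix):
            self.map = vio_map
            self.backend_queue = backend_queue
            self.K = camera_matrix
            self.imu_bias_converged = False

        def process_frame(self, frame_id, pts_3d, pts_2d, imu_samples):
            flag = select_pnp_flag(self.imu_bias_converged)
            ok, rvec, tvec, inliers = cv2.solvePnPRansac(
                pts_3d, pts_2d, self.K, None, flags=flag)
            if not ok:
                return None
            self.update_imu(imu_samples)
            with self.map.lock:
                self.map.camera_states.append((frame_id, rvec, tvec))
            self.backend_queue.put(frame_id)  # hand the new state to component 2
            return rvec, tvec

        def update_imu(self, imu_samples):
            # Placeholder: a real system would propagate velocity and bias
            # estimates from the raw IMU samples here.
            self.imu_bias_converged = len(imu_samples) > 10


    class Backend(threading.Thread):
        # Second component: refine poses, points, and IMU parameters.
        def __init__(self, vio_map, backend_queue):
            super().__init__(daemon=True)
            self.map = vio_map
            self.queue = backend_queue

        def run(self):
            while True:
                frame_id = self.queue.get()
                if frame_id is None:  # shutdown sentinel
                    return
                with self.map.lock:
                    window = list(self.map.camera_states[-10:])
                # Placeholder: sparse bundle adjustment over this window would
                # jointly adjust camera poses, 3D map points, and IMU-related
                # parameters using the history of camera states.


    if __name__ == "__main__":
        K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
        vio_map, work = VioMap(), queue.Queue()
        Backend(vio_map, work).start()
        frontend = Frontend(vio_map, work, K)

        # Synthetic frame: random 3D points in front of the camera and their
        # projections under an identity pose, so PnP should recover ~zero motion.
        rng = np.random.default_rng(0)
        pts_3d = rng.uniform([-1, -1, 4], [1, 1, 6], (12, 3)).astype(np.float32)
        proj = (K @ pts_3d.T).T
        pts_2d = (proj[:, :2] / proj[:, 2:]).astype(np.float32)
        print(frontend.process_frame(0, pts_3d, pts_2d, imu_samples=[]))
        work.put(None)  # stop the backend

Running the frontend and the backend on separate threads with a queue between them mirrors the split described above: the per-frame component stays lightweight while the refinement component works over a window of past camera states.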
05.12.2024
Patent
Electronic Resource
English
IPC: G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING / B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION / G01C MEASURING DISTANCES, LEVELS OR BEARINGS
Similar items:
Uncertainty-Aware Attention Guided Sensor Fusion For Monocular Visual Inertial Odometry | Deutsches Zentrum für Luft- und Raumfahrt (DLR) | 2020
Stereo Visual Inertial Odometry for Unmanned Aerial Vehicle Autonomous Flight | Springer Verlag | 2019
Stereo Visual Inertial Odometry for Unmanned Aerial Vehicle Autonomous Flight | British Library Conference Proceedings | 2020