This project investigates applications of sensor fusion to spacecraft navigation in non-cooperative scenarios, with a general focus on vision-based techniques. Specifically, four distinct scenarios are considered and analyzed.

The masses of asteroids and other small bodies in the solar system may be estimated by measuring the disturbances their gravity causes on the trajectory of a nearby spacecraft performing a flyby. For low-mass bodies, however, this requires the spacecraft to be in close proximity to the target, which can put the spacecraft in danger. A method that allows accurate mass estimates without endangering the flyby spacecraft is therefore investigated. The approach involves ejecting a number of probes prior to the encounter and tracking them from the host spacecraft as they pass by the target. Visual observations are combined with radiometric tracking to simultaneously estimate the trajectories of the target, the spacecraft, and the probes, as well as the mass of the target. It is found that the mass can be extracted with better accuracy than is achievable with conventional methods, with the best precision obtained when the probes are fitted with internal radio beacons.

When a spacecraft operates in close proximity to a celestial body, its position may be determined using a combination of Earth-based tracking and observations of the target. In the unfortunate event that the spacecraft suddenly loses communication with the ground, it needs a fast and reliable way of keeping track of its location until normal operations can be resumed. As a potential solution, a method for spacecraft positioning based on observations of the horizon and terminator of the target object is investigated. By fitting the observations to predictions based on various surface models and combining them with attitude information from a star tracker, it is found that fairly accurate positioning is achievable, especially when multiple cameras are employed. More importantly, the method ...
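As a rough illustration of the flyby mass-estimation principle summarized above, the following minimal sketch recovers a small body's mass from the gravitational deflection of a probe's hyperbolic flyby. This is not the thesis's joint visual/radiometric estimator, which solves simultaneously for the probe, spacecraft, and target trajectories; the single-deflection-angle measurement model, the noise level, and all numerical values below are assumptions chosen purely for illustration.

```python
import numpy as np

# Illustrative sketch only: for a hyperbolic flyby, the turn angle delta of the
# probe's asymptotic velocity vector satisfies
#     tan(delta / 2) = mu / (b * v_inf**2),
# so a measured deflection yields mu = b * v_inf**2 * tan(delta / 2),
# and the target mass follows from mass = mu / G.

G = 6.674e-11          # gravitational constant [m^3 kg^-1 s^-2]
mu_true = 6.0e5        # assumed gravitational parameter of the target [m^3/s^2]
v_inf = 200.0          # assumed probe approach speed relative to target [m/s]
b = 50.0e3             # assumed impact parameter [m]

# True deflection angle of the probe's asymptotic velocity vector
delta_true = 2.0 * np.arctan(mu_true / (b * v_inf**2))

rng = np.random.default_rng(0)
sigma = 1e-6           # assumed angular measurement noise [rad]
n_probes = 5           # several probes ejected ahead of the encounter

# Noisy deflection measurements, e.g. from optical tracking of each probe
measured = delta_true + sigma * rng.standard_normal(n_probes)

# Invert the deflection relation for each probe and average the estimates
mu_est = b * v_inf**2 * np.tan(measured / 2.0)
mass_est = mu_est.mean() / G

print(f"true mass      : {mu_true / G:.3e} kg")
print(f"estimated mass : {mass_est:.3e} kg")
```

In the thesis itself the target mass is estimated jointly with the trajectories in a fused visual/radiometric solution rather than from isolated deflection angles; the sketch only conveys why a probe passing close to the body carries information about its mass.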





    Title:

    Multi-sensor Data Fusion for Spacecraft Navigation



    Publication date:

    2020-01-01


    Remarks:

    Christensen, L. A. M. 2020, Multi-sensor Data Fusion for Spacecraft Navigation. Technical University of Denmark, Kgs. Lyngby.


    Type of media:

    Book


    Type of material:

    Electronic Resource


    Language:

    English


    Classification:

    DDC: 621




    Improved AUV Navigation through Multi-Sensor Data Fusion

    Rigby, P. / Pizarro, O. / Williams, S. | British Library Online Contents | 2007


    Spacecraft autonomous navigation technologies based on multi-source information fusion

    Wang, Dayi / Li, Maodeng / Huang, Xiangyu et al. | TIBKAT | 2021


    Integrated Navigation System Based on MCDMA and Data Fusion for Spacecraft

    Tang, L. / Shen, G. / Beijing University of Aeronautics and Astronautics | British Library Conference Proceedings | 1994