Ultrasonic sensors are widely used on vehicles to detect obstacles in the scene. They measure the distance to an object directly, at low cost, and even in harsh environments. However, because the information from a single ultrasonic sensor is very limited, such sensors have not been used to recover the detailed 3D structure and semantic labels of the scene, unlike in-vehicle cameras or LiDARs. In this paper, we therefore propose a method for recovering the dense 3D structure and semantic labels of the scene from a single moving ultrasonic sensor mounted on a vehicle. Our method takes the raw profiles of the ultrasonic sensor signals and learns the relationship between these raw signals and the 3D scene using multi-task learning. As a result, it can recover dense 3D structure and semantic labels similar to those obtained with cameras and LiDARs, using just a single moving ultrasonic sensor. The effectiveness of the proposed method is evaluated on both synthetic and real sensor data.
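
The abstract does not describe the network in detail, but the multi-task setup it outlines can be illustrated with a minimal sketch: a shared encoder over the raw echo profile feeding two heads, one regressing a dense depth map and one predicting per-pixel semantic labels, trained with a joint loss. The layer sizes, output resolution, class count, and loss weights below are illustrative assumptions, not the authors' implementation.

    # Hypothetical sketch (not the authors' architecture): shared 1D-conv encoder
    # over a raw ultrasonic echo profile, with depth and semantic-label heads.
    import torch
    import torch.nn as nn

    class UltrasonicMultiTaskNet(nn.Module):
        def __init__(self, num_classes=5, out_hw=(32, 32)):
            super().__init__()
            self.out_hw = out_hw
            self.num_classes = num_classes
            # Shared encoder: 1D convolutions over the time axis of the echo profile.
            self.encoder = nn.Sequential(
                nn.Conv1d(1, 32, kernel_size=7, stride=2, padding=3), nn.ReLU(),
                nn.Conv1d(32, 64, kernel_size=5, stride=2, padding=2), nn.ReLU(),
                nn.Conv1d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),
            )
            h, w = out_hw
            # Task-specific heads: dense depth regression and semantic segmentation.
            self.depth_head = nn.Linear(128, h * w)
            self.seg_head = nn.Linear(128, num_classes * h * w)

        def forward(self, echo):                      # echo: (B, 1, T) raw signal profile
            feat = self.encoder(echo).flatten(1)      # (B, 128) shared feature
            h, w = self.out_hw
            depth = self.depth_head(feat).view(-1, 1, h, w)                 # (B, 1, H, W)
            logits = self.seg_head(feat).view(-1, self.num_classes, h, w)   # (B, C, H, W)
            return depth, logits

    # Joint multi-task loss: L1 on depth plus cross-entropy on semantic labels.
    def multitask_loss(depth_pred, depth_gt, logits, labels, w_depth=1.0, w_seg=1.0):
        depth_loss = nn.functional.l1_loss(depth_pred, depth_gt)
        seg_loss = nn.functional.cross_entropy(logits, labels)
        return w_depth * depth_loss + w_seg * seg_loss

In this kind of setup the two tasks share the encoder, so gradients from both the depth and the label objectives shape the same signal representation; the weights w_depth and w_seg are placeholders for however the trade-off between the tasks is balanced.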


Title: Seeing Nearby 3D Scenes using Ultrasonic Sensors

Contributors:

Publication date: 2022-06-05

Size: 1,277,326 bytes

Type of media: Conference paper

Type of material: Electronic Resource

Language: English