Besides interacting correctly with other vehicles, automated vehicles should also be able to react safely to vulnerable road users like pedestrians or cyclists. For a safe interaction between pedestrians and automated vehicles, the vehicle must be able to interpret the pedestrian's behavior. Common environment models, however, do not contain information such as body poses, which is needed to understand the pedestrian's intent. In this work, we propose an environment model that includes the position of the pedestrians as well as their pose information. We only use images from a monocular camera and the vehicle's localization data as input to our pedestrian environment model. We extract the skeletal information from the image with a neural network human pose estimator. Furthermore, we track the skeletons with a simple tracking algorithm based on the Hungarian algorithm and ego-motion compensation. To obtain the pedestrians' 3D positions, we aggregate the data from consecutive frames in conjunction with the vehicle's position. We demonstrate our pedestrian environment model on data generated with the CARLA simulator and on the nuScenes dataset. Overall, we reach a relative position error of around 16% on both datasets.
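
The tracking step described in the abstract, associating detected skeletons across consecutive frames with the Hungarian algorithm, can be illustrated with a minimal Python sketch. This is not the authors' implementation: the function name match_skeletons, the mean joint-distance cost, and the max_cost gate are illustrative assumptions, and both the ego-motion compensation and the pose estimator itself are omitted. SciPy's linear_sum_assignment serves as the Hungarian solver.

import numpy as np
from scipy.optimize import linear_sum_assignment


def match_skeletons(tracks, detections, max_cost=50.0):
    """Associate tracked skeletons with newly detected skeletons.

    tracks, detections: lists of (K, 2) arrays of 2D joint coordinates.
    Returns matched (track, detection) index pairs plus the indices of
    unmatched tracks and unmatched detections.
    """
    if not tracks or not detections:
        return [], list(range(len(tracks))), list(range(len(detections)))

    # Cost: mean Euclidean distance between corresponding joints.
    cost = np.zeros((len(tracks), len(detections)))
    for i, t in enumerate(tracks):
        for j, d in enumerate(detections):
            cost[i, j] = np.linalg.norm(np.asarray(t) - np.asarray(d), axis=1).mean()

    # Hungarian assignment minimizing the total joint-distance cost.
    row_idx, col_idx = linear_sum_assignment(cost)

    # Gate the assignment: reject implausible pairs with too large a cost.
    matches = [(i, j) for i, j in zip(row_idx, col_idx) if cost[i, j] <= max_cost]
    matched_t = {i for i, _ in matches}
    matched_d = {j for _, j in matches}
    unmatched_tracks = [i for i in range(len(tracks)) if i not in matched_t]
    unmatched_dets = [j for j in range(len(detections)) if j not in matched_d]
    return matches, unmatched_tracks, unmatched_dets


# Example usage (synthetic data): two tracked skeletons with 17 joints each,
# observed again with small perturbations in the next frame.
# tracks = [np.random.rand(17, 2) * 100 for _ in range(2)]
# detections = [t + np.random.randn(17, 2) for t in tracks]
# matches, lost_tracks, new_dets = match_skeletons(tracks, detections)

In such a scheme, unmatched detections would start new tracks, while tracks that stay unmatched for several frames could be dropped; the gating threshold prevents skeletons of different pedestrians from being merged when someone leaves or enters the image.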





    Title: Pedestrian Environment Model for Automated Driving

    Publication date: 2023-09-24

    Size: 501,705 bytes

    Type of media: Conference paper

    Type of material: Electronic Resource

    Language: English



    Similar titles:

    Pedestrian Environment Model for Automated Driving
    Holzbock, Adrian / Tsaregorodtsev, Alexander / Belagiannis, Vasileios | ArXiv | 2023

    Analysing pedestrian-vehicle interaction to derive implications for automated driving
    Ackermann, Claudia / Technische Universität Chemnitz | TIBKAT | 2019

    A pedestrian movement model for 3D visualization in a driving simulation environment
    Neubauer, Maximilian / Ruddeck, Géraldine / Schrab, Karl et al. | DataCite | 2021