Real-time detection of vehicles and traffic lights is essential for intelligent and connected vehicles, especially in urban environments. In this paper, a new vehicle and traffic light dataset is established, and a real-time detection model for vehicles and traffic lights based on the You Only Look Once (YOLO) network is presented. A joint training method for target classification and detection based on YOLOv3 is proposed, aiming to balance detection accuracy and speed. The YOLOv3 network also places lower demands on hardware than other target detection algorithms such as Faster R-CNN. Experimental analysis of images captured in urban environments shows that the designed model not only satisfies real-time requirements but also improves the accuracy of vehicle and traffic light detection.
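The paper's own dataset and trained model are not reproduced here, but the general YOLOv3 detection pipeline the abstract refers to can be illustrated with a minimal sketch. The example below uses OpenCV's DNN module with publicly available Darknet files (yolov3.cfg, yolov3.weights, and a COCO class list whose labels include "car", "bus", "truck" and "traffic light"); the file names, thresholds and test image are assumptions for illustration, not the authors' implementation.

# Minimal YOLOv3 inference sketch (assumed pretrained COCO files, not the paper's model)
import cv2
import numpy as np

net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
classes = open("coco.names").read().splitlines()
wanted = {"car", "bus", "truck", "traffic light"}   # vehicle and traffic light classes

img = cv2.imread("urban_scene.jpg")                 # hypothetical test image
h, w = img.shape[:2]
blob = cv2.dnn.blobFromImage(img, 1 / 255.0, (416, 416), swapRB=True, crop=False)
net.setInput(blob)
outputs = net.forward(net.getUnconnectedOutLayersNames())

boxes, confidences, class_ids = [], [], []
for output in outputs:
    for det in output:
        scores = det[5:]
        cid = int(np.argmax(scores))
        conf = float(scores[cid])
        if conf > 0.5 and classes[cid] in wanted:
            cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
            boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
            confidences.append(conf)
            class_ids.append(cid)

# Non-maximum suppression removes overlapping detections of the same object
keep = cv2.dnn.NMSBoxes(boxes, confidences, 0.5, 0.4)
for i in np.array(keep).flatten():
    x, y, bw, bh = boxes[i]
    print(f"{classes[class_ids[i]]}: {confidences[i]:.2f} at ({x}, {y}, {bw}, {bh})")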


    Title:

    Real-time Detection of Vehicle and Traffic Light for Intelligent and Connected Vehicles Based on YOLOv3 Network


    Contributors:
    Du, Luyao (author) / Chen, Wei (author) / Fu, Shuaizhi (author) / Kong, Haiyang (author) / Li, Changzhen (author) / Pei, Zhonghui (author)


    Publication date:

    2019-07-01


    Format / Extent:

    597203 bytes


    Media type:

    Conference paper


    Format:

    Electronic resource


    Language:

    English



    Similar items:

    YOLOv3 Tiny Vehicle high-speed vehicle real-time detection network and method

    CUI YANI / REN JIA / HAO QIUSHI et al. | European Patent Office | 2023

    Free access

    YOLOv3 tiny vehicle: a new model for real-time vehicle detection

    Ma, Xiuxin / Zhuang, Huaiyu / Deng, Jiaxian et al. | SPIE | 2022

    Free access

    Real-Time Pedestrian Detection and Tracking Based on YOLOv3

    Li, Xingyu / Hu, Jianming / Liu, Hantao et al. | TIBKAT | 2022


    Real-Time Pedestrian Detection and Tracking Based on YOLOv3

    Li, Xingyu / Hu, Jianming / Liu, Hantao et al. | ASCE | 2022