A lane-detection system is an important component of many intelligent transportation systems. We present a robust lane-detection-and-tracking algorithm to deal with challenging scenarios such as lane curvature, worn lane markings, lane changes, and emerging, ending, merging, and splitting lanes. We first present a comparative study to find a good real-time lane-marking classifier. Once detection is done, the lane markings are grouped into lane-boundary hypotheses. We group left and right lane boundaries separately to effectively handle merging and splitting lanes. A fast and robust algorithm, based on random-sample consensus (RANSAC) and particle filtering, is proposed to generate a large number of hypotheses in real time. The generated hypotheses are evaluated and grouped based on a probabilistic framework. The suggested framework effectively combines a likelihood-based object-recognition algorithm with a Markov-style tracking process and can also be applied to general part-based object-tracking problems. Experimental results on local streets and highways show that the suggested algorithm is highly reliable.
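
The hypothesis-generation step described in the abstract can be illustrated with a short sketch. The following is a minimal, hypothetical example (not the paper's implementation) of RANSAC-style lane-boundary hypothesis generation: quadratic curves x = a*y^2 + b*y + c are repeatedly fit to random samples of detected lane-marking pixels, and every candidate with sufficient inlier support is kept as a hypothesis for a later probabilistic evaluation stage. All function and parameter names here are illustrative assumptions.

```python
# Minimal sketch (illustrative only, not the paper's code) of
# RANSAC-style lane-boundary hypothesis generation. A quadratic
# model x = a*y^2 + b*y + c is fit to random 3-point samples of
# detected lane-marking pixels; every candidate with enough inlier
# support is kept as a hypothesis for later probabilistic grouping.
import random
import numpy as np

def ransac_lane_hypotheses(points, n_iters=200, tol=3.0, min_inliers=30):
    """points: (N, 2) array of (x, y) lane-marking pixel positions.
    Returns a list of (coeffs, inlier_count) hypotheses, best first."""
    pts = np.asarray(points, dtype=float)
    hypotheses = []
    for _ in range(n_iters):
        # Sample the 3 points that minimally determine a quadratic.
        sample = pts[random.sample(range(len(pts)), 3)]
        if len(set(sample[:, 1])) < 3:
            continue  # degenerate sample: y values must be distinct
        # Solve x = a*y^2 + b*y + c exactly for (a, b, c).
        coeffs = np.polyfit(sample[:, 1], sample[:, 0], 2)
        # Count points within `tol` pixels of the candidate curve.
        residuals = np.abs(np.polyval(coeffs, pts[:, 1]) - pts[:, 0])
        inliers = int((residuals < tol).sum())
        if inliers >= min_inliers:
            hypotheses.append((coeffs, inliers))
    return sorted(hypotheses, key=lambda h: -h[1])
```

Keeping many scored hypotheses per frame, rather than a single best fit, is what would let left and right boundaries be grouped independently and tracked through merges and splits, as the abstract describes.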


    Title:

    Robust Lane Detection and Tracking in Challenging Scenarios


    Contributors:

    ZuWhan Kim (author)


    Publication date:

    2008-03-01


    Size:

    791,169 bytes


    Type of media:

    Article (Journal)


    Type of material:

    Electronic Resource


    Language:

    English





    Similar titles:


    Robust Lane Detection and Tracking in Challenging Scenarios

    Kim, Z. / Institute of Electrical and Electronics Engineers | British Library Conference Proceedings | 2008


    A Robust Approach for Lane Detection in Challenging Illumination Scenarios

    Manoharan, Kodeeswari / Daniel, Philemon | IEEE | 2018


    Vision-Based Robust Lane Detection and Tracking under Different Challenging Environmental Conditions

    Sultana, Samia / Ahmed, Boshir / Paul, Manoranjan et al. | arXiv | 2022
