This paper presents a method for finding and tracking road lanes. The method extracts and tracks lane boundaries for vision-guided vehicle navigation by combining the Hough transform and the "active line model" (ALM). The Hough transform can extract vanishing points of the road, which serve as a good estimate of the vehicle heading. For a curved road, however, this estimate may be too crude for navigation. Therefore, the Hough transform is used to obtain an initial estimate of the positions of the lane boundaries on the road. The line snake (the ALM) then refines this initial approximation into an accurate configuration of the lane boundaries. Once the line snake is initialized in the first image, it tracks the road lanes using external and internal forces computed from the images and a proposed boundary refinement technique. Specifically, the image region is divided into a few sub-regions along the vertical direction. The Hough transform is then performed in each sub-region to extract candidate lines for the road lanes. Among the candidate lines, the most prominent line is found by the ALM, which minimizes a defined snake energy. The external energy of the ALM is a normalized sum of image gradients along the line. The internal deformation energy ensures continuity between neighboring lines, using the angle difference and the distance between two adjacent lines. A search method based on dynamic programming reduces the computational cost. The proposed method provides a unified framework for detecting, refining, and tracking road lanes. Experimental results on images of a real road scene are presented.
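The candidate-selection step described in the abstract lends itself to a short sketch. The following Python code is a minimal illustration, not the paper's implementation: the segment parameterization, the helper names (sample_line, fit_line_snake), and the weights alpha, beta, and lam are assumptions. It selects one candidate line per horizontal sub-region by minimizing an external gradient term plus an internal continuity term via dynamic programming, mirroring the energy structure the abstract describes.

```python
import numpy as np

def sample_line(seg, n=20):
    """Sample n integer pixel coordinates along a segment ((x0,y0),(x1,y1))."""
    (x0, y0), (x1, y1) = seg
    xs = np.linspace(x0, x1, n).round().astype(int)
    ys = np.linspace(y0, y1, n).round().astype(int)
    return list(zip(xs, ys))

def external_energy(grad_mag, seg):
    """Negative normalized sum of image gradient magnitude along the line,
    so that lines lying on strong edges have low energy."""
    h, w = grad_mag.shape
    vals = [grad_mag[y, x] for x, y in sample_line(seg)
            if 0 <= x < w and 0 <= y < h]
    return -float(np.mean(vals)) if vals else 0.0

def internal_energy(seg_a, seg_b, alpha=1.0, beta=0.05):
    """Continuity between two vertically adjacent segments: the angle
    difference plus the gap between seg_a's lower endpoint and seg_b's
    upper endpoint.  alpha and beta are illustrative weights."""
    (ax0, ay0), (ax1, ay1) = seg_a
    (bx0, by0), (bx1, by1) = seg_b
    da = np.arctan2(ay1 - ay0, ax1 - ax0) - np.arctan2(by1 - by0, bx1 - bx0)
    da = abs(np.arctan2(np.sin(da), np.cos(da)))  # wrap to [0, pi]
    gap = np.hypot(bx0 - ax1, by0 - ay1)
    return alpha * da + beta * gap

def fit_line_snake(candidates, grad_mag, lam=1.0):
    """candidates[i] is the list of candidate segments in sub-region i
    (top to bottom).  Returns one segment per sub-region minimizing the
    total snake energy with a dynamic-programming search."""
    n = len(candidates)
    ext = [[external_energy(grad_mag, s) for s in row] for row in candidates]
    cost, back = [ext[0][:]], [[-1] * len(candidates[0])]
    for i in range(1, n):
        row_cost, row_back = [], []
        for j, seg in enumerate(candidates[i]):
            best_k = min(
                range(len(candidates[i - 1])),
                key=lambda k: cost[i - 1][k]
                + lam * internal_energy(candidates[i - 1][k], seg),
            )
            row_cost.append(
                cost[i - 1][best_k]
                + lam * internal_energy(candidates[i - 1][best_k], seg)
                + ext[i][j]
            )
            row_back.append(best_k)
        cost.append(row_cost)
        back.append(row_back)
    # Backtrack from the best final candidate.
    j = int(np.argmin(cost[-1]))
    chosen = [j]
    for i in range(n - 1, 0, -1):
        j = back[i][j]
        chosen.append(j)
    chosen.reverse()
    return [candidates[i][chosen[i]] for i in range(n)]
```

In practice the per-sub-region candidates could come from a probabilistic Hough transform (e.g., OpenCV's cv2.HoughLinesP) run on an edge map of each horizontal strip; the dynamic-programming pass then costs only O(number of sub-regions times candidates squared) rather than an exhaustive search over all combinations.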
Finding and tracking road lanes using "line-snakes"
1996-01-01
1209563 bytes
Conference paper
Electronic Resource
English
Finding and Tracking Road Lanes Using "Line-Snakes"
British Library Conference Proceedings | 1996
AN ADVANCED ROAD-LANES FINDING SCHEME FOR SELF-DRIVING CARS
TIBKAT | 2020
Finding multiple lanes in urban road networks with vision and lidar
British Library Online Contents | 2009