Self-driving cars are the next milestone of the automation industry. To achieve the level of autonomy expected of a self-driving car, the vehicle must be fitted with an assortment of sensors that help it perceive its 3D environment, which in turn enables better decision making and vehicle control. Sensor fusion combines the complementary strengths of the different sensors and improves the accuracy of the overall information. In real-time operation, uncertainty in the factors that affect the vehicle's motion can cause the estimated parameters to overshoot; to avoid this, an estimation filter is used to predict and update the fused values. This paper focuses on sensor fusion of lidar and camera measurements followed by estimation with a Kalman filter, and shows how the estimation filter significantly improves the accuracy of tracking an obstacle's path.
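As a rough illustration of the predict/update cycle described in the abstract, the sketch below fuses lidar and camera obstacle detections with a linear Kalman filter. It is not the paper's implementation: the constant-velocity motion model, the assumption that both sensors report (x, y) positions in a common frame, and all noise covariances are illustrative choices.

```python
# Minimal sketch (not the paper's exact implementation) of fusing lidar and
# camera obstacle detections with a linear Kalman filter. Assumptions: a
# constant-velocity motion model, both sensors reporting (x, y) position in a
# common frame, and illustrative noise covariances chosen for clarity.
import numpy as np


class KalmanFusion:
    """Tracks an obstacle state [x, y, vx, vy] from position measurements."""

    def __init__(self, dt=0.1):
        self.x = np.zeros(4)                        # state estimate
        self.P = np.eye(4)                          # state covariance
        self.F = np.array([[1, 0, dt, 0],           # constant-velocity model
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],            # both sensors observe position only
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * 0.01                   # process noise (assumed)

    def predict(self):
        """Propagate the state and covariance one time step ahead."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z, R):
        """Correct the prediction with measurement z and its noise covariance R."""
        y = z - self.H @ self.x                     # innovation
        S = self.H @ self.P @ self.H.T + R          # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)    # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P


# Illustrative measurement noise: the lidar position is assumed to be more
# precise than the camera-derived position.
R_LIDAR = np.eye(2) * 0.05
R_CAMERA = np.eye(2) * 0.30

kf = KalmanFusion(dt=0.1)
for lidar_z, camera_z in [(np.array([1.0, 2.0]), np.array([1.1, 1.9])),
                          (np.array([1.2, 2.1]), np.array([1.3, 2.0]))]:
    kf.predict()                                    # predict step
    kf.update(lidar_z, R_LIDAR)                     # update with lidar detection
    kf.update(camera_z, R_CAMERA)                   # update with camera detection
    print("fused position estimate:", kf.x[:2])
```

Feeding both sensors' measurements through the same filter in each cycle is what produces the fused estimate; the relative sizes of the measurement noise covariances determine how much each sensor is trusted.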
Sensor Fusion of Camera and Lidar Using Kalman Filter
Series: Algorithms for Intelligent Systems
Published: 2021-07-22
Length: 17 pages
Format: Article/Chapter (Book); Electronic Resource
Language: English
Related items:
Implementation of Vision and Lidar Sensor Fusion Using Kalman Filter Algorithm (BASE | 2021)
Fuzzy state noise-driven Kalman filter for sensor fusion (SAGE Publications | 2009; British Library Conference Proceedings | 2010)