Traffic congestion is one of the biggest problems facing major cities. To decide on traffic control techniques and management strategies, it is crucial to classify current traffic conditions: because so much time is lost in congestion, it has a harmful effect on society and has to be managed. By classifying traffic, we can determine which lanes are congested, investigate the causes of that congestion, and make the decisions needed to optimize flow. Video surveillance footage is a rich source of data for such analysis. In this study, road traffic is monitored from video surveillance data using a convolutional neural network (CNN). Compared with traditional computer vision methods, CNNs require minimal preprocessing: they avoid labour-intensive steps such as manual feature engineering, in which human experts hand-design features from images, and instead learn relevant features directly from raw pixel data, saving time and improving adaptability. They also remove the need for colour-space conversion, since they handle multi-channel input and preserve colour information; they cope with a variety of pixel-value distributions without extensive normalization; and they learn to segment images without explicit thresholding. This streamlines the workflow, making CNNs efficient and effective for tasks such as object detection, image classification, and segmentation. The proposed model can analyze live streaming video, classify each frame, and rate the amount of traffic in each lane, all of which is useful for traffic monitoring.
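As a rough illustration of this pipeline, the sketch below classifies each frame of a surveillance clip into a coarse traffic-density class with a small CNN. It is a minimal sketch under stated assumptions: the architecture, the three density labels, and the file name traffic.mp4 are hypothetical, since the abstract does not specify the paper's actual network or rating scheme. A per-lane rating, as described above, could be obtained by applying the same classifier to cropped lane regions of each frame.

```python
# Hypothetical sketch of per-frame traffic-density classification with a small CNN.
# The architecture, class labels, and video file name are illustrative assumptions,
# not the model described in the paper.
import cv2
import torch
import torch.nn as nn

# Assumed density labels; the paper's actual rating scheme may differ.
CLASSES = ["low", "medium", "high"]

class TrafficCNN(nn.Module):
    """Small CNN mapping a 3-channel 224x224 frame to a density class."""
    def __init__(self, num_classes: int = len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 224 -> 112
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 112 -> 56
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),              # global average pool -> (64, 1, 1)
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

def classify_frame(model: nn.Module, frame) -> str:
    """Classify one OpenCV frame. Only resizing and scaling to [0, 1] are
    applied: the CNN learns features from the raw multi-channel pixels, and
    the frame stays in BGR order (assumed to match the training data)."""
    resized = cv2.resize(frame, (224, 224))
    tensor = torch.from_numpy(resized).permute(2, 0, 1).float().unsqueeze(0) / 255.0
    with torch.no_grad():
        logits = model(tensor)
    return CLASSES[int(logits.argmax(dim=1))]

if __name__ == "__main__":
    model = TrafficCNN().eval()             # in practice, load trained weights here
    cap = cv2.VideoCapture("traffic.mp4")   # hypothetical surveillance clip
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        print(classify_frame(model, frame))
    cap.release()
```

Note that the only per-frame preparation is a resize and a rescale to [0, 1], reflecting the minimal-preprocessing point made above; no hand-crafted features, colour-space conversion, or thresholding step is needed.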
Real-Time Vehicular Traffic Flow Monitoring and Classification of Vehicles for Indian Road Scenario
07.12.2023
Conference paper
Electronic resource
English
Indian SUMO traffic scenario-based misbehaviour detection dataset for connected vehicles | DOAJ | 2025
Lateral Placement of Vehicles Under Mixed Traffic in Indian Urban Scenario | Springer Verlag | 2022