In this paper, we analyzed passengers' electroencephalography (EEG) signals to distinguish between emergency and non-emergency road events. 64-channel EEG signals were collected from 9 participants while they watched a simulated driving video, recorded from the front passenger seat point of view, in which pedestrians stood on the right side of the road and suddenly crossed the street. Event-related potential (ERP) and machine learning techniques were used to analyze and classify the signals of the two road events. Results show that the EEG responses occurred 454 ± 234 ms before the participants' reactions, and the average recognition accuracy of the regularized linear discriminant analysis (RLDA) classifier reached 95.81%. We also verified our findings in a real-car automatic emergency braking (AEB) experiment. This is the first study to investigate a passenger's EEG signals during emergency situations in simulated and real-world autonomous driving experiments. Overall, the results illustrate that EEG-based human-centric driving assistance systems have the potential to be deployed in high-level autonomous vehicles to enhance the safety of passengers and overall public safety.
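As a minimal illustrative sketch (not from the paper) of the kind of regularized LDA classification the abstract describes, the Python snippet below trains a shrinkage-regularized LDA on flattened EEG epochs. The data shapes, random placeholder arrays, labels, and the choice of scikit-learn are assumptions made for illustration only.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Placeholder epoched data: n_trials x n_channels x n_samples.
    # Labels: 1 = emergency event, 0 = non-emergency event (synthetic here).
    rng = np.random.default_rng(0)
    X_epochs = rng.standard_normal((200, 64, 128))
    y = rng.integers(0, 2, size=200)

    # Flatten each epoch into a single feature vector (channels x time).
    X = X_epochs.reshape(len(X_epochs), -1)

    # Shrinkage of the covariance estimate ('lsqr' + 'auto') is one common
    # way to regularize LDA for high-dimensional EEG features.
    rlda = make_pipeline(
        StandardScaler(),
        LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto"),
    )

    scores = cross_val_score(rlda, X, y, cv=5)
    print(f"Mean cross-validated accuracy: {scores.mean():.2%}")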
Decoding Passenger's EEG Signals From Encountering Emergency Road Events
08.10.2022
1040349 bytes
Article (Conference)
Electronic resource
English
METHOD FOR CONTROLLING PASSENGER'S POSTURE AND DEVICE FOR CONTROLLING PASSENGER'S POSTURE
Europäisches Patentamt | 2021
Flying from a passenger's viewpoint
Engineering Index Backfile | 1929