
A Multi‑Camera Deep Neural Network for Detecting Elevated Alertness in Drivers


TRI Authors: Simon Stent, Luke Fletcher
All Authors: John Gideon, Simon Stent, Luke Fletcher

We present a system for detecting elevated levels of driver alertness in driver-facing video captured from multiple viewpoints. This problem is important in automotive safety, both as a feedback signal for gauging driver engagement and as a means of automatically flagging anomalous driving events. We generated a dataset of videos from 25 participants, each overseeing an hour of simulated driving consisting of a mixture of normal and near-miss events. Our proposed system is a deep neural network that fuses information from three driver-facing cameras to estimate moments of elevated driver alertness. A novel aspect of the system is that it learns to actively re-weight the importance of the camera inputs depending on their content. We demonstrate that this approach is not only resilient to dropped or occluded frames, but also performs significantly better than a system trained on any single stream.
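The abstract does not spell out how the re-weighting works; one plausible reading is an attention-style fusion, where a learned scoring function assigns each camera stream a weight based on its current features, and unavailable streams are masked out. The sketch below illustrates that idea in NumPy with hypothetical names (`fuse_cameras`, `w_score`); it is a minimal illustration, not the paper's implementation.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def fuse_cameras(features, w_score, available):
    """Attention-style fusion of per-camera feature vectors.

    features:  (n_cams, d) array, one feature vector per camera
    w_score:   (d,) hypothetical learned scoring vector
    available: (n_cams,) boolean mask; False marks dropped/occluded streams
    Returns (weights, fused) where weights sum to 1 over available cameras.
    """
    scores = features @ w_score          # content-dependent score per camera
    scores = np.where(available, scores, -np.inf)  # mask missing streams
    weights = softmax(scores)            # normalized importance weights
    fused = weights @ features           # weighted sum of camera features
    return weights, fused
```

Masking with `-inf` before the softmax sends a dropped camera's weight to zero while the remaining weights renormalize, which mirrors the resilience to dropped or occluded frames described above.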

Citation: Gideon, John, Simon Stent, and Luke Fletcher. "A Multi-Camera Deep Neural Network for Detecting Elevated Alertness in Drivers." In 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 2931-2935. IEEE, 2018.