Details
Original language | English |
---|---|
Pages (from-to) | 2078-2082 |
Number of pages | 5 |
Journal | Proceedings of the Human Factors and Ergonomics Society Annual Meeting |
Volume | 63 |
Issue number | 1 |
Publication status | Published - 20 Nov 2019 |
Event | Human Factors and Ergonomics Society Annual Meeting 2019, Seattle, United States. Duration: 28 Oct 2019 → 1 Nov 2019 |
Abstract

We tested whether head movements under automated driving can be used to classify a vehicle occupant as either situation-aware or unaware. While cornering manually, an active driver’s head tilt correlates with the road angle, which serves as a visual reference, whereas an inactive passenger’s head follows the g-forces. Transferred to partial/conditional automation, the question arises whether aware occupants’ head movements are comparable to those of drivers and whether this can be used for classification. In a driving-simulator study (n = 43, within-subject design), four scenarios were used to generate or degrade situation awareness (manipulation checked). Recurrent neural networks were trained on the resulting head movements. Inferential statistics were used to extract the discriminating feature, ensuring explainability. Very accurate classification was achieved, and the mean side rotation rate was identified as the most discriminating factor. Aware occupants behave more like drivers. Therefore, head movements can be used to classify situation awareness not only in experimental settings but also in real driving.
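The abstract outlines the method at a high level: recurrent neural networks trained on occupants’ head-movement time series to make a binary aware/unaware classification. The sketch below illustrates that kind of pipeline in PyTorch; it is not the authors’ code, and the feature layout (per-step rotation rates), network size, training loop, and all names are illustrative assumptions.

```python
# Minimal sketch of an RNN classifier over head-movement sequences,
# in the spirit of the method described in the abstract. All
# dimensions and hyperparameters are assumptions, not the paper's.
import torch
import torch.nn as nn

class HeadMovementClassifier(nn.Module):
    def __init__(self, n_features=3, hidden=32):
        super().__init__()
        # n_features: e.g. roll/pitch/yaw rotation rates per time step
        self.rnn = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # binary: aware vs. unaware

    def forward(self, x):          # x: (batch, time, n_features)
        _, h = self.rnn(x)         # final hidden state summarizes the sequence
        return self.head(h[-1])    # raw logit; apply sigmoid for probability

# Synthetic stand-in data: 64 sequences of 100 time steps each.
x = torch.randn(64, 100, 3)
y = torch.randint(0, 2, (64, 1)).float()

model = HeadMovementClassifier()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for epoch in range(5):             # a few steps, just to show the loop
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
```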
ASJC Scopus subject areas
- Social Sciences (all)
- Human Factors and Ergonomics
Cite this
Occupant Monitoring in Automated Vehicles. / Schewe, Frederik; Cheng, Hao; Hafner, Alexander; Sester, Monika; Vollrath, Mark. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Vol. 63, No. 1, 20.11.2019, p. 2078-2082.
Research output: Contribution to journal › Conference article › Research › peer review
TY - JOUR
T1 - Occupant Monitoring in Automated Vehicles
T2 - Human Factors and Ergonomics Society Annual Meeting 2019
AU - Schewe, Frederik
AU - Cheng, Hao
AU - Hafner, Alexander
AU - Sester, Monika
AU - Vollrath, Mark
N1 - Publisher Copyright: © 2019 by Human Factors and Ergonomics Society.
PY - 2019/11/20
Y1 - 2019/11/20
N2 - We tested whether head movements under automated driving can be used to classify a vehicle occupant as either situation-aware or unaware. While cornering manually, an active driver’s head tilt correlates with the road angle, which serves as a visual reference, whereas an inactive passenger’s head follows the g-forces. Transferred to partial/conditional automation, the question arises whether aware occupants’ head movements are comparable to those of drivers and whether this can be used for classification. In a driving-simulator study (n = 43, within-subject design), four scenarios were used to generate or degrade situation awareness (manipulation checked). Recurrent neural networks were trained on the resulting head movements. Inferential statistics were used to extract the discriminating feature, ensuring explainability. Very accurate classification was achieved, and the mean side rotation rate was identified as the most discriminating factor. Aware occupants behave more like drivers. Therefore, head movements can be used to classify situation awareness not only in experimental settings but also in real driving.
AB - We tested whether head movements under automated driving can be used to classify a vehicle occupant as either situation-aware or unaware. While cornering manually, an active driver’s head tilt correlates with the road angle, which serves as a visual reference, whereas an inactive passenger’s head follows the g-forces. Transferred to partial/conditional automation, the question arises whether aware occupants’ head movements are comparable to those of drivers and whether this can be used for classification. In a driving-simulator study (n = 43, within-subject design), four scenarios were used to generate or degrade situation awareness (manipulation checked). Recurrent neural networks were trained on the resulting head movements. Inferential statistics were used to extract the discriminating feature, ensuring explainability. Very accurate classification was achieved, and the mean side rotation rate was identified as the most discriminating factor. Aware occupants behave more like drivers. Therefore, head movements can be used to classify situation awareness not only in experimental settings but also in real driving.
UR - http://www.scopus.com/inward/record.url?scp=85099878690&partnerID=8YFLogxK
U2 - 10.1177/1071181319631048
DO - 10.1177/1071181319631048
M3 - Conference article
VL - 63
SP - 2078
EP - 2082
JO - Proceedings of the Human Factors and Ergonomics Society Annual Meeting
JF - Proceedings of the Human Factors and Ergonomics Society Annual Meeting
SN - 1541-9312
IS - 1
Y2 - 28 October 2019 through 1 November 2019
ER -