Details
Original language | English
---|---
Pages (from-to) | 2078-2082
Number of pages | 5
Journal | Proceedings of the Human Factors and Ergonomics Society Annual Meeting
Volume | 63
Issue number | 1
Publication status | Published - 20 Nov 2019
Event | Human Factors and Ergonomics Society Annual Meeting 2019 - Seattle, United States. Duration: 28 Oct 2019 → 1 Nov 2019
Abstract

We tested whether head-movements under automated driving can be used to classify a vehicle occupant as either situation-aware or unaware. While manually cornering, an active driver’s head tilt correlates with the road angle which serves as a visual reference, whereas an inactive passenger’s head follows the g-forces. Transferred to partial/conditional automation, the question arises whether aware occupant’s head-movements are comparable to drivers and if this can be used for classification. In a driving-simulator-study (n=43, within-subject design), four scenarios were used to generate or deteriorate situation awareness (manipulation checked). Recurrent neural networks were trained with the resulting head-movements. Inference statistics were used to extract the discriminating feature, ensuring explainability. A very accurate classification was achieved and the mean side rotation-rate was identified as the most differentiating factor. Aware occupants behave more like drivers. Therefore, head-movements can be used to classify situation awareness in experimental settings but also in real driving.
ASJC Scopus Subject Areas
- Social Sciences (all)
- Human Factors and Ergonomics
Cite
In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Vol. 63, No. 1, 20.11.2019, pp. 2078-2082.
Publication: Contribution to journal › Conference article in journal › Research › Peer-review
TY - JOUR
T1 - Occupant Monitoring in Automated Vehicles
T2 - Human Factors and Ergonomics Society Annual Meeting 2019
AU - Schewe, Frederik
AU - Cheng, Hao
AU - Hafner, Alexander
AU - Sester, Monika
AU - Vollrath, Mark
N1 - Publisher Copyright: © 2019 by Human Factors and Ergonomics Society.
PY - 2019/11/20
Y1 - 2019/11/20
N2 - We tested whether head-movements under automated driving can be used to classify a vehicle occupant as either situation-aware or unaware. While manually cornering, an active driver’s head tilt correlates with the road angle which serves as a visual reference, whereas an inactive passenger’s head follows the g-forces. Transferred to partial/conditional automation, the question arises whether aware occupant’s head-movements are comparable to drivers and if this can be used for classification. In a driving-simulator-study (n=43, within-subject design), four scenarios were used to generate or deteriorate situation awareness (manipulation checked). Recurrent neural networks were trained with the resulting head-movements. Inference statistics were used to extract the discriminating feature, ensuring explainability. A very accurate classification was achieved and the mean side rotation-rate was identified as the most differentiating factor. Aware occupants behave more like drivers. Therefore, head-movements can be used to classify situation awareness in experimental settings but also in real driving.
AB - We tested whether head-movements under automated driving can be used to classify a vehicle occupant as either situation-aware or unaware. While manually cornering, an active driver’s head tilt correlates with the road angle which serves as a visual reference, whereas an inactive passenger’s head follows the g-forces. Transferred to partial/conditional automation, the question arises whether aware occupant’s head-movements are comparable to drivers and if this can be used for classification. In a driving-simulator-study (n=43, within-subject design), four scenarios were used to generate or deteriorate situation awareness (manipulation checked). Recurrent neural networks were trained with the resulting head-movements. Inference statistics were used to extract the discriminating feature, ensuring explainability. A very accurate classification was achieved and the mean side rotation-rate was identified as the most differentiating factor. Aware occupants behave more like drivers. Therefore, head-movements can be used to classify situation awareness in experimental settings but also in real driving.
UR - http://www.scopus.com/inward/record.url?scp=85099878690&partnerID=8YFLogxK
U2 - 10.1177/1071181319631048
DO - 10.1177/1071181319631048
M3 - Conference article
VL - 63
SP - 2078
EP - 2082
JO - Proceedings of the Human Factors and Ergonomics Society Annual Meeting
JF - Proceedings of the Human Factors and Ergonomics Society Annual Meeting
SN - 1541-9312
IS - 1
Y2 - 28 October 2019 through 1 November 2019
ER -
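The abstract describes training recurrent neural networks on head-movement signals to classify occupants as situation-aware or unaware. The paper does not include code, so the following is only a minimal illustrative sketch of that kind of pipeline: the architecture, feature channels, dimensions, and weights here are assumptions, not the authors' model.

```python
import numpy as np

# Illustrative sketch (assumption: not the authors' published model).
# A single plain recurrent (Elman) layer scores a sequence of head-movement
# features, e.g. rotation rates per frame, and a sigmoid read-out of the
# final hidden state gives P(occupant is situation-aware).

rng = np.random.default_rng(0)

def rnn_classify(seq, Wxh, Whh, Why, bh, by):
    """Run a recurrent layer over `seq` (T x n_features) and return a
    sigmoid probability computed from the final hidden state."""
    h = np.zeros(Whh.shape[0])
    for x in seq:
        h = np.tanh(Wxh @ x + Whh @ h + bh)  # recurrent state update
    logit = Why @ h + by                     # linear read-out
    return 1.0 / (1.0 + np.exp(-logit))      # P(aware)

# Assumed shapes: 3 head-movement channels, 16 hidden units.
n_in, n_hid = 3, 16
Wxh = rng.normal(scale=0.1, size=(n_hid, n_in))
Whh = rng.normal(scale=0.1, size=(n_hid, n_hid))
Why = rng.normal(scale=0.1, size=n_hid)
bh, by = np.zeros(n_hid), 0.0

# 100 frames of synthetic rotation-rate data stand in for recorded signals.
seq = rng.normal(size=(100, n_in))
p_aware = rnn_classify(seq, Wxh, Whh, Why, bh, by)
print(round(float(p_aware), 3))
```

In practice the weights would be trained on labeled simulator data (aware vs. unaware scenarios), and the paper additionally uses inferential statistics to identify the mean side rotation-rate as the discriminating feature.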