
Occupant Monitoring in Automated Vehicles: Classification of Situation Awareness Based on Head Movements While Cornering

Publication: Contribution to journal › Conference article › Research › Peer reviewed

Authors

  • Frederik Schewe
  • Hao Cheng
  • Alexander Hafner
  • Monika Sester
  • Mark Vollrath

External organisations

  • Technische Universität Braunschweig

Details

Original language: English
Pages (from-to): 2078-2082
Number of pages: 5
Journal: Proceedings of the Human Factors and Ergonomics Society Annual Meeting
Volume: 63
Issue number: 1
Publication status: Published - 20 Nov 2019
Event: Human Factors and Ergonomics Society Annual Meeting 2019 - Seattle, United States
Duration: 28 Oct 2019 - 1 Nov 2019

Abstract

We tested whether head movements under automated driving can be used to classify a vehicle occupant as either situation-aware or unaware. While cornering manually, an active driver's head tilt correlates with the road angle, which serves as a visual reference, whereas an inactive passenger's head follows the g-forces. Transferred to partial/conditional automation, the question arises whether aware occupants' head movements are comparable to drivers' and whether this can be used for classification. In a driving-simulator study (n=43, within-subject design), four scenarios were used to generate or deteriorate situation awareness (manipulation checked). Recurrent neural networks were trained on the resulting head movements. Inferential statistics were used to extract the discriminating feature, ensuring explainability. A very accurate classification was achieved, and the mean side rotation rate was identified as the most differentiating factor. Aware occupants behave more like drivers. Therefore, head movements can be used to classify situation awareness not only in experimental settings but also in real driving.
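The classification idea in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the paper trains recurrent neural networks on full head-movement sequences, whereas the sketch below uses only the single feature the authors identified as most discriminating (the mean side rotation rate), with a hypothetical threshold, direction of effect, and data.

```python
# Illustrative sketch only: the paper uses recurrent neural networks on
# head-movement time series; here we classify directly on the mean side
# rotation rate, the feature the authors found most discriminating.
# The threshold value and the sample traces are hypothetical.

def mean_side_rotation_rate(trace):
    """Mean absolute side rotation rate (deg/s) over a head-movement trace."""
    return sum(abs(r) for r in trace) / len(trace)

def classify_awareness(trace, threshold=2.0):
    """Label an occupant 'aware' or 'unaware' by comparing the mean side
    rotation rate against a hypothetical threshold: an aware occupant
    actively rotates the head (driver-like), while an unaware occupant's
    head passively follows the g-forces."""
    return "aware" if mean_side_rotation_rate(trace) > threshold else "unaware"

# Hypothetical side rotation-rate traces in deg/s:
aware_trace = [3.1, -2.8, 4.0, -3.5, 2.9]      # active, driver-like head motion
unaware_trace = [0.4, -0.3, 0.6, -0.5, 0.2]    # passive head motion

print(classify_awareness(aware_trace))    # aware
print(classify_awareness(unaware_trace))  # unaware
```

In the study itself, the sequence model does the classification and the inferential statistics serve only to explain *why* it works; this sketch inverts that relationship for readability.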


Cite

Occupant Monitoring in Automated Vehicles: Classification of Situation Awareness Based on Head Movements While Cornering. / Schewe, Frederik; Cheng, Hao; Hafner, Alexander et al.
In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Vol. 63, No. 1, 20.11.2019, pp. 2078-2082.


Schewe, F, Cheng, H, Hafner, A, Sester, M & Vollrath, M 2019, 'Occupant Monitoring in Automated Vehicles: Classification of Situation Awareness Based on Head Movements While Cornering', Proceedings of the Human Factors and Ergonomics Society Annual Meeting, vol. 63, no. 1, pp. 2078-2082. https://doi.org/10.1177/1071181319631048
Schewe, F., Cheng, H., Hafner, A., Sester, M., & Vollrath, M. (2019). Occupant Monitoring in Automated Vehicles: Classification of Situation Awareness Based on Head Movements While Cornering. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 63(1), 2078-2082. https://doi.org/10.1177/1071181319631048
Schewe F, Cheng H, Hafner A, Sester M, Vollrath M. Occupant Monitoring in Automated Vehicles: Classification of Situation Awareness Based on Head Movements While Cornering. Proceedings of the Human Factors and Ergonomics Society Annual Meeting. 2019 Nov 20;63(1):2078-2082. doi: 10.1177/1071181319631048
Schewe, Frederik ; Cheng, Hao ; Hafner, Alexander et al. / Occupant Monitoring in Automated Vehicles: Classification of Situation Awareness Based on Head Movements While Cornering. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting. 2019 ; Vol. 63, No. 1. pp. 2078-2082.
BibTeX
@article{c42cb5e916e24931a5670455bfdd6896,
title = "Occupant Monitoring in Automated Vehicles: Classification of Situation Awareness Based on Head Movements While Cornering",
abstract = "We tested whether head-movements under automated driving can be used to classify a vehicle occupant as either situation-aware or unaware. While manually cornering, an active driver{\textquoteright}s head tilt correlates with the road angle which serves as a visual reference, whereas an inactive passenger{\textquoteright}s head follows the g-forces. Transferred to partial/conditional automation, the question arises whether aware occupant{\textquoteright}s head-movements are comparable to drivers and if this can be used for classification. In a driving-simulator-study (n=43, within-subject design), four scenarios were used to generate or deteriorate situation awareness (manipulation checked). Recurrent neural networks were trained with the resulting head-movements. Inference statistics were used to extract the discriminating feature, ensuring explainability. A very accurate classification was achieved and the mean side rotation-rate was identified as the most differentiating factor. Aware occupants behave more like drivers. Therefore, head-movements can be used to classify situation awareness in experimental settings but also in real driving.",
author = "Frederik Schewe and Hao Cheng and Alexander Hafner and Monika Sester and Mark Vollrath",
note = "Publisher Copyright: {\textcopyright} 2019 by Human Factors and Ergonomics Society.; Human Factors and Ergonomics Society Annual Meeting 2019 ; Conference date: 28-10-2019 Through 01-11-2019",
year = "2019",
month = nov,
day = "20",
doi = "10.1177/1071181319631048",
language = "English",
volume = "63",
pages = "2078--2082",
number = "1",

}

RIS

TY - JOUR
T1 - Occupant Monitoring in Automated Vehicles: Classification of Situation Awareness Based on Head Movements While Cornering
T2 - Human Factors and Ergonomics Society Annual Meeting 2019
AU - Schewe, Frederik
AU - Cheng, Hao
AU - Hafner, Alexander
AU - Sester, Monika
AU - Vollrath, Mark
N1 - Publisher Copyright: © 2019 by Human Factors and Ergonomics Society.
PY - 2019/11/20
Y1 - 2019/11/20
N2 - We tested whether head-movements under automated driving can be used to classify a vehicle occupant as either situation-aware or unaware. While manually cornering, an active driver’s head tilt correlates with the road angle which serves as a visual reference, whereas an inactive passenger’s head follows the g-forces. Transferred to partial/conditional automation, the question arises whether aware occupant’s head-movements are comparable to drivers and if this can be used for classification. In a driving-simulator-study (n=43, within-subject design), four scenarios were used to generate or deteriorate situation awareness (manipulation checked). Recurrent neural networks were trained with the resulting head-movements. Inference statistics were used to extract the discriminating feature, ensuring explainability. A very accurate classification was achieved and the mean side rotation-rate was identified as the most differentiating factor. Aware occupants behave more like drivers. Therefore, head-movements can be used to classify situation awareness in experimental settings but also in real driving.
AB - We tested whether head-movements under automated driving can be used to classify a vehicle occupant as either situation-aware or unaware. While manually cornering, an active driver’s head tilt correlates with the road angle which serves as a visual reference, whereas an inactive passenger’s head follows the g-forces. Transferred to partial/conditional automation, the question arises whether aware occupant’s head-movements are comparable to drivers and if this can be used for classification. In a driving-simulator-study (n=43, within-subject design), four scenarios were used to generate or deteriorate situation awareness (manipulation checked). Recurrent neural networks were trained with the resulting head-movements. Inference statistics were used to extract the discriminating feature, ensuring explainability. A very accurate classification was achieved and the mean side rotation-rate was identified as the most differentiating factor. Aware occupants behave more like drivers. Therefore, head-movements can be used to classify situation awareness in experimental settings but also in real driving.
UR - http://www.scopus.com/inward/record.url?scp=85099878690&partnerID=8YFLogxK
U2 - 10.1177/1071181319631048
DO - 10.1177/1071181319631048
M3 - Conference article
VL - 63
SP - 2078
EP - 2082
JO - Proceedings of the Human Factors and Ergonomics Society Annual Meeting
JF - Proceedings of the Human Factors and Ergonomics Society Annual Meeting
SN - 1541-9312
IS - 1
Y2 - 28 October 2019 through 1 November 2019
ER -
