
Occupant Monitoring in Automated Vehicles: Classification of Situation Awareness Based on Head Movements While Cornering

Research output: Contribution to journal › Conference article › Research › peer review

Authors

  • Frederik Schewe
  • Hao Cheng
  • Alexander Hafner
  • Monika Sester
  • Mark Vollrath

External Research Organisations

  • Technische Universität Braunschweig

Details

Original language: English
Pages (from-to): 2078-2082
Number of pages: 5
Journal: Proceedings of the Human Factors and Ergonomics Society Annual Meeting
Volume: 63
Issue number: 1
Publication status: Published - 20 Nov 2019
Event: Human Factors and Ergonomics Society Annual Meeting 2019 - Seattle, United States
Duration: 28 Oct 2019 - 1 Nov 2019

Abstract

We tested whether head movements under automated driving can be used to classify a vehicle occupant as either situation-aware or unaware. While manually cornering, an active driver's head tilt correlates with the road angle, which serves as a visual reference, whereas an inactive passenger's head follows the g-forces. Transferred to partial/conditional automation, the question arises whether aware occupants' head movements are comparable to those of drivers and whether this can be used for classification. In a driving-simulator study (n=43, within-subject design), four scenarios were used to generate or deteriorate situation awareness (manipulation checked). Recurrent neural networks were trained on the resulting head movements. Inference statistics were used to extract the discriminating feature, ensuring explainability. A very accurate classification was achieved, and the mean side rotation rate was identified as the most differentiating factor. Aware occupants behave more like drivers. Therefore, head movements can be used to classify situation awareness not only in experimental settings but also in real driving.
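The classification idea in the abstract can be illustrated with a toy sketch. Note this is not the study's model or data: the study trained recurrent neural networks on head-movement time series, and inference statistics identified the mean side (lateral) rotation rate as the most discriminating feature. Below, a simple threshold on that single feature stands in for the trained network, and all traces, rates, and the threshold value are made up for illustration.

```python
import random

random.seed(0)

def mean_side_rotation_rate(trace):
    """Mean absolute lateral head rotation rate (deg/s) over a cornering segment."""
    return sum(abs(r) for r in trace) / len(trace)

# Simulated yaw-rate traces (hypothetical numbers): aware occupants rotate
# the head toward the curve like active drivers, producing larger lateral
# rotation rates, while unaware occupants' heads mostly follow the g-forces.
aware_traces = [[random.gauss(8.0, 2.0) for _ in range(100)] for _ in range(20)]
unaware_traces = [[random.gauss(2.0, 1.0) for _ in range(100)] for _ in range(20)]

THRESHOLD = 5.0  # hypothetical decision boundary (deg/s)

def classify(trace):
    """Label a head-movement trace by thresholding its mean side rotation rate."""
    return "aware" if mean_side_rotation_rate(trace) > THRESHOLD else "unaware"

preds_aware = [classify(t) for t in aware_traces]
preds_unaware = [classify(t) for t in unaware_traces]
```

In the study itself the decision boundary was learned by recurrent networks from full time series; the threshold here only mirrors the reported finding that the two groups separate on this one feature.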


Cite this

Occupant Monitoring in Automated Vehicles: Classification of Situation Awareness Based on Head Movements While Cornering. / Schewe, Frederik; Cheng, Hao; Hafner, Alexander et al.
In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Vol. 63, No. 1, 20.11.2019, p. 2078-2082.


Schewe, F, Cheng, H, Hafner, A, Sester, M & Vollrath, M 2019, 'Occupant Monitoring in Automated Vehicles: Classification of Situation Awareness Based on Head Movements While Cornering', Proceedings of the Human Factors and Ergonomics Society Annual Meeting, vol. 63, no. 1, pp. 2078-2082. https://doi.org/10.1177/1071181319631048
Schewe, F., Cheng, H., Hafner, A., Sester, M., & Vollrath, M. (2019). Occupant Monitoring in Automated Vehicles: Classification of Situation Awareness Based on Head Movements While Cornering. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 63(1), 2078-2082. https://doi.org/10.1177/1071181319631048
Schewe F, Cheng H, Hafner A, Sester M, Vollrath M. Occupant Monitoring in Automated Vehicles: Classification of Situation Awareness Based on Head Movements While Cornering. Proceedings of the Human Factors and Ergonomics Society Annual Meeting. 2019 Nov 20;63(1):2078-2082. doi: 10.1177/1071181319631048
Schewe, Frederik ; Cheng, Hao ; Hafner, Alexander et al. / Occupant Monitoring in Automated Vehicles : Classification of Situation Awareness Based on Head Movements While Cornering. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting. 2019 ; Vol. 63, No. 1. pp. 2078-2082.
@article{c42cb5e916e24931a5670455bfdd6896,
title = "Occupant Monitoring in Automated Vehicles: Classification of Situation Awareness Based on Head Movements While Cornering",
abstract = "We tested whether head-movements under automated driving can be used to classify a vehicle occupant as either situation-aware or unaware. While manually cornering, an active driver{\textquoteright}s head tilt correlates with the road angle which serves as a visual reference, whereas an inactive passenger{\textquoteright}s head follows the g-forces. Transferred to partial/conditional automation, the question arises whether aware occupant{\textquoteright}s head-movements are comparable to drivers and if this can be used for classification. In a driving-simulator-study (n=43, within-subject design), four scenarios were used to generate or deteriorate situation awareness (manipulation checked). Recurrent neural networks were trained with the resulting head-movements. Inference statistics were used to extract the discriminating feature, ensuring explainability. A very accurate classification was achieved and the mean side rotation-rate was identified as the most differentiating factor. Aware occupants behave more like drivers. Therefore, head-movements can be used to classify situation awareness in experimental settings but also in real driving.",
author = "Frederik Schewe and Hao Cheng and Alexander Hafner and Monika Sester and Mark Vollrath",
note = "Publisher Copyright: {\textcopyright} 2019 by Human Factors and Ergonomics Society.; Human Factors and Ergonomics Society Annual Meeting 2019 ; Conference date: 28-10-2019 Through 01-11-2019",
year = "2019",
month = nov,
day = "20",
doi = "10.1177/1071181319631048",
language = "English",
volume = "63",
pages = "2078--2082",
number = "1",
journal = "Proceedings of the Human Factors and Ergonomics Society Annual Meeting",
}


TY  - JOUR
T1  - Occupant Monitoring in Automated Vehicles: Classification of Situation Awareness Based on Head Movements While Cornering
T2  - Human Factors and Ergonomics Society Annual Meeting 2019
AU  - Schewe, Frederik
AU  - Cheng, Hao
AU  - Hafner, Alexander
AU  - Sester, Monika
AU  - Vollrath, Mark
N1  - Publisher Copyright: © 2019 by Human Factors and Ergonomics Society.
PY  - 2019/11/20
Y1  - 2019/11/20
N2  - We tested whether head-movements under automated driving can be used to classify a vehicle occupant as either situation-aware or unaware. While manually cornering, an active driver’s head tilt correlates with the road angle which serves as a visual reference, whereas an inactive passenger’s head follows the g-forces. Transferred to partial/conditional automation, the question arises whether aware occupant’s head-movements are comparable to drivers and if this can be used for classification. In a driving-simulator-study (n=43, within-subject design), four scenarios were used to generate or deteriorate situation awareness (manipulation checked). Recurrent neural networks were trained with the resulting head-movements. Inference statistics were used to extract the discriminating feature, ensuring explainability. A very accurate classification was achieved and the mean side rotation-rate was identified as the most differentiating factor. Aware occupants behave more like drivers. Therefore, head-movements can be used to classify situation awareness in experimental settings but also in real driving.
AB  - We tested whether head-movements under automated driving can be used to classify a vehicle occupant as either situation-aware or unaware. While manually cornering, an active driver’s head tilt correlates with the road angle which serves as a visual reference, whereas an inactive passenger’s head follows the g-forces. Transferred to partial/conditional automation, the question arises whether aware occupant’s head-movements are comparable to drivers and if this can be used for classification. In a driving-simulator-study (n=43, within-subject design), four scenarios were used to generate or deteriorate situation awareness (manipulation checked). Recurrent neural networks were trained with the resulting head-movements. Inference statistics were used to extract the discriminating feature, ensuring explainability. A very accurate classification was achieved and the mean side rotation-rate was identified as the most differentiating factor. Aware occupants behave more like drivers. Therefore, head-movements can be used to classify situation awareness in experimental settings but also in real driving.
UR  - http://www.scopus.com/inward/record.url?scp=85099878690&partnerID=8YFLogxK
U2  - 10.1177/1071181319631048
DO  - 10.1177/1071181319631048
M3  - Conference article
VL  - 63
SP  - 2078
EP  - 2082
JO  - Proceedings of the Human Factors and Ergonomics Society Annual Meeting
JF  - Proceedings of the Human Factors and Ergonomics Society Annual Meeting
SN  - 1541-9312
IS  - 1
Y2  - 28 October 2019 through 1 November 2019
ER  -
