Details
Original language | English |
---|---|
Title of host publication | CHI '17 |
Subtitle of host publication | Proceedings of the 2017 ACM SIGCHI Conference on Human Factors in Computing Systems |
Publisher | Association for Computing Machinery (ACM) |
Pages | 6133-6146 |
Number of pages | 14 |
ISBN (electronic) | 9781450346559 |
Publication status | Published - 2 May 2017 |
Event | 2017 ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2017 - Denver, United States |
Duration | 6 May 2017 → 11 May 2017 |
Abstract
The human body reveals emotional and bodily states through measurable signals, such as body language and electroencephalography. However, such manifestations are difficult to communicate to others remotely. We propose EmotionActuator, a proof-of-concept system to investigate the transmission of emotional states in which the recipient performs emotional gestures to understand and interpret the state of the sender. We call this kind of communication embodied emotional feedback, and present a prototype implementation. To realize our concept we chose four emotional states: amused, sad, angry, and neutral. We designed EmotionActuator through a series of studies to assess emotion classification via EEG and to create an EMS gesture set by comparing composed gestures from the literature to sign-language gestures. In a final study with the end-to-end prototype, interviews revealed that participants like the implicit sharing of emotions and find the embodied output immersive, but want control over which emotions are shared and with whom. This work contributes a proof-of-concept system and a set of design recommendations for designing embodied emotional feedback systems.
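The abstract implies a sender-to-recipient pipeline: classify the sender's emotional state from EEG, then actuate a matching gesture on the recipient via EMS. As a purely illustrative sketch (not the authors' implementation), the snippet below shows what a generic four-class EEG emotion classifier along these lines might look like, assuming band-power features, a hypothetical 14-channel headset, synthetic stand-in data, and a scikit-learn SVM; all names and parameters are placeholders.

```python
# Illustrative sketch only: a generic four-class EEG emotion classifier using
# band-power features and an SVM. Synthetic data stands in for real recordings;
# this is NOT the EmotionActuator implementation described in the paper.
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

STATES = ["amused", "sad", "angry", "neutral"]  # the four states from the paper
FS = 128                                        # assumed sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # common EEG bands

def band_power_features(epoch):
    """Mean spectral power per band and channel for one EEG epoch (channels x samples)."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=1))
    return np.concatenate(feats)

# Synthetic stand-in data: 200 epochs, 14 channels, 2 seconds each.
rng = np.random.default_rng(0)
epochs = rng.normal(size=(200, 14, 2 * FS))
labels = rng.integers(0, len(STATES), size=200)

X = np.array([band_power_features(e) for e in epochs])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.25, random_state=0)

clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print("accuracy on synthetic data:", accuracy_score(y_te, clf.predict(X_te)))
```

In a real embodied-feedback system the predicted label would then be sent to the recipient's side and mapped to an EMS gesture; that actuation step is hardware-specific and is omitted here.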
Keywords
- Affect display, Affective computing, EEG, Emotion, Emotion sharing, EMS
ASJC Scopus subject areas
- Computer Science (all)
- Software
- Human-Computer Interaction
- Computer Graphics and Computer-Aided Design
Cite this
CHI '17: Proceedings of the 2017 ACM SIGCHI Conference on Human Factors in Computing Systems. Association for Computing Machinery (ACM), 2017. p. 6133-6146.
Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review
TY - GEN
T1 - Emotion actuator
T2 - 2017 ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2017
AU - Hassib, Mariam
AU - Pfeiffer, Max
AU - Schneegass, Stefan
AU - Rohs, Michael
AU - Alt, Florian
N1 - Funding Information: This work was partly conducted within the Amplify project, which received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement no. 683008). Publisher Copyright: © 2017 ACM. Copyright: Copyright 2018 Elsevier B.V., All rights reserved.
PY - 2017/5/2
Y1 - 2017/5/2
N2 - The human body reveals emotional and bodily states through measurable signals, such as body language and electroencephalography. However, such manifestations are difficult to communicate to others remotely. We propose EmotionActuator, a proof-of-concept system to investigate the transmission of emotional states in which the recipient performs emotional gestures to understand and interpret the state of the sender. We call this kind of communication embodied emotional feedback, and present a prototype implementation. To realize our concept we chose four emotional states: amused, sad, angry, and neutral. We designed EmotionActuator through a series of studies to assess emotion classification via EEG and to create an EMS gesture set by comparing composed gestures from the literature to sign-language gestures. In a final study with the end-to-end prototype, interviews revealed that participants like the implicit sharing of emotions and find the embodied output immersive, but want control over which emotions are shared and with whom. This work contributes a proof-of-concept system and a set of design recommendations for designing embodied emotional feedback systems.
AB - The human body reveals emotional and bodily states through measurable signals, such as body language and electroencephalography. However, such manifestations are difficult to communicate to others remotely. We propose EmotionActuator, a proof-of-concept system to investigate the transmission of emotional states in which the recipient performs emotional gestures to understand and interpret the state of the sender. We call this kind of communication embodied emotional feedback, and present a prototype implementation. To realize our concept we chose four emotional states: amused, sad, angry, and neutral. We designed EmotionActuator through a series of studies to assess emotion classification via EEG and to create an EMS gesture set by comparing composed gestures from the literature to sign-language gestures. In a final study with the end-to-end prototype, interviews revealed that participants like the implicit sharing of emotions and find the embodied output immersive, but want control over which emotions are shared and with whom. This work contributes a proof-of-concept system and a set of design recommendations for designing embodied emotional feedback systems.
KW - Affect display
KW - Affective computing
KW - EEG
KW - Emotion
KW - Emotion sharing
KW - EMS
UR - http://www.scopus.com/inward/record.url?scp=85030326875&partnerID=8YFLogxK
U2 - 10.1145/3025453.3025953
DO - 10.1145/3025453.3025953
M3 - Conference contribution
AN - SCOPUS:85030326875
SP - 6133
EP - 6146
BT - CHI '17
PB - Association for Computing Machinery (ACM)
Y2 - 6 May 2017 through 11 May 2017
ER -