Details
| Original language | English |
|---|---|
| Title of host publication | CHI '17 |
| Subtitle of host publication | Proceedings of the 2017 ACM SIGCHI Conference on Human Factors in Computing Systems |
| Publisher | Association for Computing Machinery (ACM) |
| Pages | 6133-6146 |
| Number of pages | 14 |
| ISBN (electronic) | 9781450346559 |
| Publication status | Published - 2 May 2017 |
| Event | 2017 ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2017 - Denver, United States. Duration: 6 May 2017 → 11 May 2017 |
Abstract
The human body reveals emotional and bodily states through measurable signals, such as body language and electroencephalography. However, such manifestations are difficult to communicate to others remotely. We propose EmotionActuator, a proof-of-concept system to investigate the transmission of emotional states in which the recipient performs emotional gestures to understand and interpret the state of the sender. We call this kind of communication embodied emotional feedback, and present a prototype implementation. To realize our concept we chose four emotional states: amused, sad, angry, and neutral. We designed EmotionActuator through a series of studies to assess emotional classification via EEG, and create an EMS gesture set by comparing composed gestures from the literature to sign-language gestures. In a final study with the end-to-end prototype, interviews revealed that participants like implicit sharing of emotions and find the embodied output to be immersive, but want to have control over which emotions are shared and with whom. This work contributes a proof-of-concept system and a set of design recommendations for designing embodied emotional feedback systems.
ASJC Scopus subject areas
- Computer Science (all)
- Software
- Computer Science (all)
- Human-Computer Interaction
- Computer Science (all)
- Computer Graphics and Computer-Aided Design
Cite this
- Standard
- Harvard
- APA
- Vancouver
- BibTeX
- RIS
CHI '17: Proceedings of the 2017 ACM SIGCHI Conference on Human Factors in Computing Systems. Association for Computing Machinery (ACM), 2017. pp. 6133-6146.
Publication: Chapter in book/report/conference proceeding › Conference contribution › Research › Peer-reviewed
TY - GEN
T1 - Emotion actuator
T2 - 2017 ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2017
AU - Hassib, Mariam
AU - Pfeiffer, Max
AU - Schneegass, Stefan
AU - Rohs, Michael
AU - Alt, Florian
N1 - Funding Information: This work was partly conducted within the Amplify project, which received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement no. 683008). Publisher Copyright: © 2017 ACM. Copyright: Copyright 2018 Elsevier B.V., All rights reserved.
PY - 2017/5/2
Y1 - 2017/5/2
N2 - The human body reveals emotional and bodily states through measurable signals, such as body language and electroencephalography. However, such manifestations are difficult to communicate to others remotely. We propose EmotionActuator, a proof-of-concept system to investigate the transmission of emotional states in which the recipient performs emotional gestures to understand and interpret the state of the sender. We call this kind of communication embodied emotional feedback, and present a prototype implementation. To realize our concept we chose four emotional states: amused, sad, angry, and neutral. We designed EmotionActuator through a series of studies to assess emotional classification via EEG, and create an EMS gesture set by comparing composed gestures from the literature to sign-language gestures. In a final study with the end-to-end prototype, interviews revealed that participants like implicit sharing of emotions and find the embodied output to be immersive, but want to have control over which emotions are shared and with whom. This work contributes a proof-of-concept system and a set of design recommendations for designing embodied emotional feedback systems.
AB - The human body reveals emotional and bodily states through measurable signals, such as body language and electroencephalography. However, such manifestations are difficult to communicate to others remotely. We propose EmotionActuator, a proof-of-concept system to investigate the transmission of emotional states in which the recipient performs emotional gestures to understand and interpret the state of the sender. We call this kind of communication embodied emotional feedback, and present a prototype implementation. To realize our concept we chose four emotional states: amused, sad, angry, and neutral. We designed EmotionActuator through a series of studies to assess emotional classification via EEG, and create an EMS gesture set by comparing composed gestures from the literature to sign-language gestures. In a final study with the end-to-end prototype, interviews revealed that participants like implicit sharing of emotions and find the embodied output to be immersive, but want to have control over which emotions are shared and with whom. This work contributes a proof-of-concept system and a set of design recommendations for designing embodied emotional feedback systems.
KW - Affect display
KW - Affective computing
KW - EEG
KW - Emotion
KW - Emotion sharing
KW - EMS
UR - http://www.scopus.com/inward/record.url?scp=85030326875&partnerID=8YFLogxK
U2 - 10.1145/3025453.3025953
DO - 10.1145/3025453.3025953
M3 - Conference contribution
AN - SCOPUS:85030326875
SP - 6133
EP - 6146
BT - CHI '17
PB - Association for Computing Machinery (ACM)
Y2 - 6 May 2017 through 11 May 2017
ER -