Details
| Original language | English |
| --- | --- |
| Title of host publication | MobileHCI '10 |
| Subtitle of host publication | Proceedings of the 12th international conference on Human computer interaction with mobile devices and services |
| Pages | 239-248 |
| Number of pages | 10 |
| Publication status | Published - 7 Sept 2010 |
| Externally published | Yes |
| Event | 12th International Conference on Human-Computer Interaction with Mobile Devices and Services, MobileHCI 2010 - Lisbon, Portugal, 7 Sept 2010 → 10 Sept 2010 |
Abstract
Gestures can offer an intuitive way to interact with a computer. In this paper, we investigate whether gesturing with a mobile phone can help to perform complex tasks involving two devices. We present results from a user study in which we asked participants to spontaneously produce gestures with their phone to trigger a set of different activities. We investigated three conditions (device configurations): phone-to-phone, phone-to-tabletop, and phone-to-public-display. We report on the kinds of gestures we observed as well as on feedback from the participants, and provide an initial assessment of which sensors might facilitate gesture recognition in a phone. The results suggest that phone gestures have the potential to be easily understood by end users and that certain device configurations and activities may be well suited for gesture control.
Keywords
- device pairing, gesture, large display, mobile phone, multi-device interaction, tabletop, user-defined gesture
ASJC Scopus subject areas
- Computer Science (all)
- Software
- Human-Computer Interaction
- Computer Vision and Pattern Recognition
- Computer Networks and Communications
Cite this
Kray, C, Nesbitt, D, Dawson, J & Rohs, M 2010, User-Defined Gestures for Connecting Mobile Phones, Public Displays, and Tabletops. in MobileHCI '10: Proceedings of the 12th international conference on Human computer interaction with mobile devices and services. pp. 239-248. https://doi.org/10.1145/1851600.1851640
Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review
TY - GEN
T1 - User-Defined Gestures for Connecting Mobile Phones, Public Displays, and Tabletops
AU - Kray, Christian
AU - Nesbitt, Daniel
AU - Dawson, John
AU - Rohs, Michael
N1 - Copyright: Copyright 2010 Elsevier B.V., All rights reserved.
PY - 2010/9/7
Y1 - 2010/9/7
N2 - Gestures can offer an intuitive way to interact with a computer. In this paper, we investigate the question whether gesturing with a mobile phone can help to perform complex tasks involving two devices. We present results from a user study, where we asked participants to spontaneously produce gestures with their phone to trigger a set of different activities. We investigated three conditions (device configurations): phone-to-phone, phone-to-tabletop, and phone to public display. We report on the kinds of gestures we observed as well as on feedback from the participants, and provide an initial assessment of which sensors might facilitate gesture recognition in a phone. The results suggest that phone gestures have the potential to be easily understood by end users and that certain device configurations and activities may be well suited for gesture control.
AB - Gestures can offer an intuitive way to interact with a computer. In this paper, we investigate the question whether gesturing with a mobile phone can help to perform complex tasks involving two devices. We present results from a user study, where we asked participants to spontaneously produce gestures with their phone to trigger a set of different activities. We investigated three conditions (device configurations): phone-to-phone, phone-to-tabletop, and phone to public display. We report on the kinds of gestures we observed as well as on feedback from the participants, and provide an initial assessment of which sensors might facilitate gesture recognition in a phone. The results suggest that phone gestures have the potential to be easily understood by end users and that certain device configurations and activities may be well suited for gesture control.
KW - device pairing
KW - gesture
KW - large display
KW - mobile phone
KW - multi-device interaction
KW - tabletop
KW - user-defined gesture
UR - http://www.scopus.com/inward/record.url?scp=78249289948&partnerID=8YFLogxK
U2 - 10.1145/1851600.1851640
DO - 10.1145/1851600.1851640
M3 - Conference contribution
AN - SCOPUS:78249289948
SN - 9781605588353
SP - 239
EP - 248
BT - MobileHCI '10
T2 - 12th International Conference on Human-Computer Interaction with Mobile Devices and Services, Mobile HCI2010
Y2 - 7 September 2010 through 10 September 2010
ER -