User-Defined Gestures for Connecting Mobile Phones, Public Displays, and Tabletops

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

  • Christian Kray
  • Daniel Nesbitt
  • John Dawson
  • Michael Rohs

External Research Organisations

  • Newcastle University
  • Technische Universität Berlin

Details

Original language: English
Title of host publication: MobileHCI '10
Subtitle of host publication: Proceedings of the 12th international conference on Human computer interaction with mobile devices and services
Pages: 239-248
Number of pages: 10
Publication status: Published - 7 Sept 2010
Externally published: Yes
Event: 12th International Conference on Human-Computer Interaction with Mobile Devices and Services, MobileHCI 2010 - Lisbon, Portugal
Duration: 7 Sept 2010 - 10 Sept 2010

Abstract

Gestures can offer an intuitive way to interact with a computer. In this paper, we investigate the question whether gesturing with a mobile phone can help to perform complex tasks involving two devices. We present results from a user study, where we asked participants to spontaneously produce gestures with their phone to trigger a set of different activities. We investigated three conditions (device configurations): phone-to-phone, phone-to-tabletop, and phone to public display. We report on the kinds of gestures we observed as well as on feedback from the participants, and provide an initial assessment of which sensors might facilitate gesture recognition in a phone. The results suggest that phone gestures have the potential to be easily understood by end users and that certain device configurations and activities may be well suited for gesture control.

Keywords

    device pairing, gesture, large display, mobile phone, multi-device interaction, tabletop, user-defined gesture


Cite this

User-Defined Gestures for Connecting Mobile Phones, Public Displays, and Tabletops. / Kray, Christian; Nesbitt, Daniel; Dawson, John et al.
MobileHCI '10: Proceedings of the 12th international conference on Human computer interaction with mobile devices and services. 2010. p. 239-248.


Kray, C, Nesbitt, D, Dawson, J & Rohs, M 2010, User-Defined Gestures for Connecting Mobile Phones, Public Displays, and Tabletops. in MobileHCI '10: Proceedings of the 12th international conference on Human computer interaction with mobile devices and services. pp. 239-248, 12th International Conference on Human-Computer Interaction with Mobile Devices and Services, MobileHCI 2010, Lisbon, Portugal, 7 Sept 2010. https://doi.org/10.1145/1851600.1851640
Kray, C., Nesbitt, D., Dawson, J., & Rohs, M. (2010). User-Defined Gestures for Connecting Mobile Phones, Public Displays, and Tabletops. In MobileHCI '10: Proceedings of the 12th international conference on Human computer interaction with mobile devices and services (pp. 239-248). https://doi.org/10.1145/1851600.1851640
Kray C, Nesbitt D, Dawson J, Rohs M. User-Defined Gestures for Connecting Mobile Phones, Public Displays, and Tabletops. In MobileHCI '10: Proceedings of the 12th international conference on Human computer interaction with mobile devices and services. 2010. p. 239-248 doi: 10.1145/1851600.1851640
Kray, Christian ; Nesbitt, Daniel ; Dawson, John et al. / User-Defined Gestures for Connecting Mobile Phones, Public Displays, and Tabletops. MobileHCI '10: Proceedings of the 12th international conference on Human computer interaction with mobile devices and services. 2010. pp. 239-248
BibTeX
@inproceedings{5731a86cc7434a4cabad4ae5f0ee170f,
title = "User-Defined Gestures for Connecting Mobile Phones, Public Displays, and Tabletops",
abstract = "Gestures can offer an intuitive way to interact with a computer. In this paper, we investigate the question whether gesturing with a mobile phone can help to perform complex tasks involving two devices. We present results from a user study, where we asked participants to spontaneously produce gestures with their phone to trigger a set of different activities. We investigated three conditions (device configurations): phone-to-phone, phone-to-tabletop, and phone to public display. We report on the kinds of gestures we observed as well as on feedback from the participants, and provide an initial assessment of which sensors might facilitate gesture recognition in a phone. The results suggest that phone gestures have the potential to be easily understood by end users and that certain device configurations and activities may be well suited for gesture control.",
keywords = "device pairing, gesture, large display, mobile phone, multi-device interaction, tabletop, user-defined gesture",
author = "Christian Kray and Daniel Nesbitt and John Dawson and Michael Rohs",
note = "Copyright: Copyright 2010 Elsevier B.V., All rights reserved.; 12th International Conference on Human-Computer Interaction with Mobile Devices and Services, MobileHCI 2010 ; Conference date: 07-09-2010 Through 10-09-2010",
year = "2010",
month = sep,
day = "7",
doi = "10.1145/1851600.1851640",
language = "English",
isbn = "9781605588353",
pages = "239--248",
booktitle = "MobileHCI '10",
}

RIS

TY - GEN

T1 - User-Defined Gestures for Connecting Mobile Phones, Public Displays, and Tabletops

AU - Kray, Christian

AU - Nesbitt, Daniel

AU - Dawson, John

AU - Rohs, Michael

N1 - Copyright: Copyright 2010 Elsevier B.V., All rights reserved.

PY - 2010/9/7

Y1 - 2010/9/7

N2 - Gestures can offer an intuitive way to interact with a computer. In this paper, we investigate the question whether gesturing with a mobile phone can help to perform complex tasks involving two devices. We present results from a user study, where we asked participants to spontaneously produce gestures with their phone to trigger a set of different activities. We investigated three conditions (device configurations): phone-to-phone, phone-to-tabletop, and phone to public display. We report on the kinds of gestures we observed as well as on feedback from the participants, and provide an initial assessment of which sensors might facilitate gesture recognition in a phone. The results suggest that phone gestures have the potential to be easily understood by end users and that certain device configurations and activities may be well suited for gesture control.

AB - Gestures can offer an intuitive way to interact with a computer. In this paper, we investigate the question whether gesturing with a mobile phone can help to perform complex tasks involving two devices. We present results from a user study, where we asked participants to spontaneously produce gestures with their phone to trigger a set of different activities. We investigated three conditions (device configurations): phone-to-phone, phone-to-tabletop, and phone to public display. We report on the kinds of gestures we observed as well as on feedback from the participants, and provide an initial assessment of which sensors might facilitate gesture recognition in a phone. The results suggest that phone gestures have the potential to be easily understood by end users and that certain device configurations and activities may be well suited for gesture control.

KW - device pairing

KW - gesture

KW - large display

KW - mobile phone

KW - multi-device interaction

KW - tabletop

KW - user-defined gesture

UR - http://www.scopus.com/inward/record.url?scp=78249289948&partnerID=8YFLogxK

U2 - 10.1145/1851600.1851640

DO - 10.1145/1851600.1851640

M3 - Conference contribution

AN - SCOPUS:78249289948

SN - 9781605588353

SP - 239

EP - 248

BT - MobileHCI '10

T2 - 12th International Conference on Human-Computer Interaction with Mobile Devices and Services, MobileHCI 2010

Y2 - 7 September 2010 through 10 September 2010

ER -
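Records like the BibTeX entry above are easy to consume programmatically. The following is a minimal, stdlib-only sketch of pulling single fields out of an entry laid out as `key = "value",` (the layout used in the record above); it is illustrative, not a general BibTeX parser, and the abridged entry string and `field` helper are introduced here for the example.

```python
import re

# Abridged copy of the BibTeX record above, kept in the simple
# `key = "value",` layout the extraction below assumes.
bibtex = '''
@inproceedings{5731a86cc7434a4cabad4ae5f0ee170f,
title = "User-Defined Gestures for Connecting Mobile Phones, Public Displays, and Tabletops",
doi = "10.1145/1851600.1851640",
isbn = "9781605588353",
pages = "239--248",
booktitle = "MobileHCI '10",
}
'''

def field(name: str, entry: str) -> str:
    """Return the quoted value of a BibTeX field, or '' if it is absent.

    Note: this regex handles only flat, double-quoted values; it does not
    handle brace-delimited values, nesting, or escapes.
    """
    match = re.search(rf'{name}\s*=\s*"([^"]*)"', entry)
    return match.group(1) if match else ""

print(field("doi", bibtex))    # 10.1145/1851600.1851640
print(field("pages", bibtex))  # 239--248
```

For anything beyond quick one-off extraction (brace-delimited values, `@string` macros, cross-references), a dedicated BibTeX parser is the safer choice.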