Using Hands and Feet to Navigate and Manipulate Spatial Data

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

  • Johannes Schöning
  • Michael Rohs
  • Florian Daiber
  • Antonio Krüger

External Research Organisations

  • University of Münster
  • Technische Universität Berlin

Details

Original language: English
Title of host publication: CHI EA '09
Subtitle of host publication: CHI '09 Extended Abstracts on Human Factors in Computing Systems
Pages: 4663-4668
Number of pages: 6
Publication status: Published - 4 Apr 2009
Externally published: Yes
Event: 27th International Conference Extended Abstracts on Human Factors in Computing Systems, CHI 2009 - Boston, MA, United States
Duration: 4 Apr 2009 - 9 Apr 2009

Abstract

We demonstrate how multi-touch hand gestures in combination with foot gestures can be used to perform navigation tasks in interactive systems. The geospatial domain is an interesting example to show the advantages of combining both modalities, because the complex user interfaces of common Geographic Information Systems (GIS) require a high degree of expertise from their users. Recent developments in interactive surfaces that enable the construction of low-cost multi-touch displays, together with relatively cheap sensor technology to detect foot gestures, allow a deep exploration of these input modalities for GIS users with medium or low expertise. In this paper, we provide a categorization of multi-touch hand and foot gestures for the interaction with spatial data on a large-scale interactive wall. In addition, we show with an initial evaluation how these gestures can improve the overall interaction with spatial information.
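To make the combined interaction concrete, the following is a minimal, hypothetical sketch (not taken from the paper) of how recognized hand and foot gestures might be fused into map-navigation actions. The gesture names, the MapView fields, and the apply_gestures function are illustrative assumptions, not the authors' implementation.

# Illustrative sketch only: hand gestures drive discrete pan/zoom manipulation,
# while a foot posture adds a continuous zoom adjustment, mirroring the
# division of labour between the two modalities described in the abstract.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class HandGesture(Enum):
    ONE_FINGER_DRAG = auto()    # pan the map
    TWO_FINGER_PINCH = auto()   # zoom by a pinch factor


class FootGesture(Enum):
    NEUTRAL = auto()
    LEAN_FORWARD = auto()       # slowly zoom in
    LEAN_BACK = auto()          # slowly zoom out


@dataclass
class MapView:
    lon: float = 0.0    # centre longitude
    lat: float = 0.0    # centre latitude
    zoom: float = 10.0  # zoom level


def apply_gestures(view: MapView,
                   hand: Optional[HandGesture],
                   foot: FootGesture,
                   dx: float = 0.0, dy: float = 0.0,
                   pinch_scale: float = 1.0) -> MapView:
    """Fuse one hand gesture and one foot gesture into a single map update."""
    if hand is HandGesture.ONE_FINGER_DRAG:
        view.lon += dx
        view.lat += dy
    elif hand is HandGesture.TWO_FINGER_PINCH:
        view.zoom *= pinch_scale
    if foot is FootGesture.LEAN_FORWARD:
        view.zoom += 0.1
    elif foot is FootGesture.LEAN_BACK:
        view.zoom -= 0.1
    return view


if __name__ == "__main__":
    v = MapView()
    v = apply_gestures(v, HandGesture.ONE_FINGER_DRAG,
                       FootGesture.LEAN_FORWARD, dx=0.5, dy=-0.2)
    print(v)  # MapView(lon=0.5, lat=-0.2, zoom=10.1)

In this sketch the hands perform precise manipulation on the touch surface while the feet contribute coarse, continuous navigation, which is one plausible way to realize the hand/foot combination the abstract argues for.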

Keywords

    Foot interaction, Geographic information system (GIS), Large screens, Multi-touch, Spatial data, Tangible interfaces

Cite this

Using Hands and Feet to Navigate and Manipulate Spatial Data. / Schöning, Johannes; Rohs, Michael; Daiber, Florian et al.
CHI EA '09: CHI '09 Extended Abstracts on Human Factors in Computing Systems. 2009. p. 4663-4668.

Schöning, J, Rohs, M, Daiber, F & Krüger, A 2009, Using Hands and Feet to Navigate and Manipulate Spatial Data. in CHI EA '09: CHI '09 Extended Abstracts on Human Factors in Computing Systems. pp. 4663-4668, 27th International Conference Extended Abstracts on Human Factors in Computing Systems, CHI 2009, Boston, MA, United States, 4 Apr 2009. https://doi.org/10.1145/1520340.1520717
Schöning, J., Rohs, M., Daiber, F., & Krüger, A. (2009). Using Hands and Feet to Navigate and Manipulate Spatial Data. In CHI EA '09: CHI '09 Extended Abstracts on Human Factors in Computing Systems (pp. 4663-4668). https://doi.org/10.1145/1520340.1520717
Schöning J, Rohs M, Daiber F, Krüger A. Using Hands and Feet to Navigate and Manipulate Spatial Data. In CHI EA '09: CHI '09 Extended Abstracts on Human Factors in Computing Systems. 2009. p. 4663-4668 doi: 10.1145/1520340.1520717
Schöning, Johannes ; Rohs, Michael ; Daiber, Florian et al. / Using Hands and Feet to Navigate and Manipulate Spatial Data. CHI EA '09: CHI '09 Extended Abstracts on Human Factors in Computing Systems. 2009. pp. 4663-4668
BibTeX
@inproceedings{f3f3d9bbd6524e7c8e1d3f5f9f11db8a,
title = "Using Hands and Feet to Navigate and Manipulate Spatial Data",
abstract = "We demonstrate how multi-touch hand gestures in combination with foot gestures can be used to perform navigation tasks in interactive systems. The geospatial domain is an interesting example to show the advantages of combining both modalities, because the complex user interfaces of common Geographic Information Systems (GIS) require a high degree of expertise from their users. Recent developments in interactive surfaces that enable the construction of low-cost multi-touch displays, together with relatively cheap sensor technology to detect foot gestures, allow a deep exploration of these input modalities for GIS users with medium or low expertise. In this paper, we provide a categorization of multi-touch hand and foot gestures for the interaction with spatial data on a large-scale interactive wall. In addition, we show with an initial evaluation how these gestures can improve the overall interaction with spatial information.",
keywords = "Foot interaction, Geographic information system (GIS), Large screens, Multi-touch, Spatial data, Tangible interfaces",
author = "Johannes Sch{\"o}ning and Michael Rohs and Florian Daiber and Antonio Kr{\"u}ger",
note = "Copyright: Copyright 2009 Elsevier B.V., All rights reserved.; 27th International Conference Extended Abstracts on Human Factors in Computing Systems, CHI 2009 ; Conference date: 04-04-2009 Through 09-04-2009",
year = "2009",
month = apr,
day = "4",
doi = "10.1145/1520340.1520717",
language = "English",
isbn = "9781605582474",
pages = "4663--4668",
booktitle = "CHI EA '09",

}

RIS

TY - GEN
T1 - Using Hands and Feet to Navigate and Manipulate Spatial Data
AU - Schöning, Johannes
AU - Rohs, Michael
AU - Daiber, Florian
AU - Krüger, Antonio
N1 - Copyright: Copyright 2009 Elsevier B.V., All rights reserved.
PY - 2009/4/4
Y1 - 2009/4/4
N2 - We demonstrate how multi-touch hand gestures in combination with foot gestures can be used to perform navigation tasks in interactive systems. The geospatial domain is an interesting example to show the advantages of combining both modalities, because the complex user interfaces of common Geographic Information Systems (GIS) require a high degree of expertise from their users. Recent developments in interactive surfaces that enable the construction of low-cost multi-touch displays, together with relatively cheap sensor technology to detect foot gestures, allow a deep exploration of these input modalities for GIS users with medium or low expertise. In this paper, we provide a categorization of multi-touch hand and foot gestures for the interaction with spatial data on a large-scale interactive wall. In addition, we show with an initial evaluation how these gestures can improve the overall interaction with spatial information.
AB - We demonstrate how multi-touch hand gestures in combination with foot gestures can be used to perform navigation tasks in interactive systems. The geospatial domain is an interesting example to show the advantages of combining both modalities, because the complex user interfaces of common Geographic Information Systems (GIS) require a high degree of expertise from their users. Recent developments in interactive surfaces that enable the construction of low-cost multi-touch displays, together with relatively cheap sensor technology to detect foot gestures, allow a deep exploration of these input modalities for GIS users with medium or low expertise. In this paper, we provide a categorization of multi-touch hand and foot gestures for the interaction with spatial data on a large-scale interactive wall. In addition, we show with an initial evaluation how these gestures can improve the overall interaction with spatial information.
KW - Foot interaction
KW - Geographic information system (GIS)
KW - Large screens
KW - Multi-touch
KW - Spatial data
KW - Tangible interfaces
UR - http://www.scopus.com/inward/record.url?scp=70349289940&partnerID=8YFLogxK
U2 - 10.1145/1520340.1520717
DO - 10.1145/1520340.1520717
M3 - Conference contribution
AN - SCOPUS:70349289940
SN - 9781605582474
SP - 4663
EP - 4668
BT - CHI EA '09
T2 - 27th International Conference Extended Abstracts on Human Factors in Computing Systems, CHI 2009
Y2 - 4 April 2009 through 9 April 2009
ER -