Using Hands and Feet to Navigate and Manipulate Spatial Data

Publication: Contribution to book/report/anthology/conference proceedings › Conference paper › Research › Peer-reviewed

Authors

  • Johannes Schöning
  • Michael Rohs
  • Florian Daiber
  • Antonio Krüger

External organizations

  • Westfälische Wilhelms-Universität Münster (WWU)
  • Technische Universität Berlin

Details

Original language: English
Title of host publication: CHI EA '09
Subtitle: CHI '09 Extended Abstracts on Human Factors in Computing Systems
Pages: 4663-4668
Number of pages: 6
Publication status: Published - 4 Apr 2009
Published externally: Yes
Event: 27th International Conference Extended Abstracts on Human Factors in Computing Systems, CHI 2009 - Boston, MA, United States
Duration: 4 Apr 2009 - 9 Apr 2009

Abstract

We demonstrate how multi-touch hand gestures in combination with foot gestures can be used to perform navigation tasks in interactive systems. The geospatial domain is an interesting example to show the advantages of the combination of both modalities, because the complex user interfaces of common Geographic Information Systems (GIS) require a high degree of expertise from their users. Recent developments in interactive surfaces that enable the construction of low-cost multi-touch displays, together with relatively cheap sensor technology to detect foot gestures, allow the deep exploration of these input modalities for GIS users with medium or low expertise. In this paper, we provide a categorization of multi-touch hand and foot gestures for the interaction with spatial data on a large-scale interactive wall. In addition, we show with an initial evaluation how these gestures can improve the overall interaction with spatial information.
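The abstract describes a division of labour in which hand gestures manipulate the map directly while foot gestures drive continuous navigation. The following minimal Python sketch is purely illustrative and is not taken from the paper: the event format, the MapView class, and the mapping of hand drags to panning and foot leans to zooming are all assumptions made for this example.

from dataclasses import dataclass

@dataclass
class MapView:
    center_x: float = 0.0   # map centre in world coordinates
    center_y: float = 0.0
    zoom: float = 1.0       # zoom factor

    def pan(self, dx: float, dy: float) -> None:
        # Hand gesture (assumed): a drag on the multi-touch surface moves the map.
        self.center_x -= dx / self.zoom
        self.center_y -= dy / self.zoom

    def zoom_by(self, factor: float) -> None:
        # Foot gesture (assumed): leaning forward/backward changes the zoom level.
        self.zoom = max(0.1, min(20.0, self.zoom * factor))

def handle_event(view: MapView, event: dict) -> None:
    # Dispatch events from the two hypothetical input channels to the map view.
    if event["modality"] == "hand" and event["type"] == "drag":
        view.pan(event["dx"], event["dy"])
    elif event["modality"] == "foot" and event["type"] == "lean":
        view.zoom_by(1.1 if event["direction"] == "forward" else 0.9)

if __name__ == "__main__":
    view = MapView()
    handle_event(view, {"modality": "hand", "type": "drag", "dx": 40.0, "dy": -25.0})
    handle_event(view, {"modality": "foot", "type": "lean", "direction": "forward"})
    print(view)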

ASJC Scopus subject areas

Cite

Using Hands and Feet to Navigate and Manipulate Spatial Data. / Schöning, Johannes; Rohs, Michael; Daiber, Florian et al.
CHI EA '09: CHI '09 Extended Abstracts on Human Factors in Computing Systems. 2009. pp. 4663-4668.


Schöning, J, Rohs, M, Daiber, F & Krüger, A 2009, Using Hands and Feet to Navigate and Manipulate Spatial Data. in CHI EA '09: CHI '09 Extended Abstracts on Human Factors in Computing Systems. pp. 4663-4668, 27th International Conference Extended Abstracts on Human Factors in Computing Systems, CHI 2009, Boston, MA, USA, 4 Apr. 2009. https://doi.org/10.1145/1520340.1520717
Schöning, J., Rohs, M., Daiber, F., & Krüger, A. (2009). Using Hands and Feet to Navigate and Manipulate Spatial Data. In CHI EA '09: CHI '09 Extended Abstracts on Human Factors in Computing Systems (pp. 4663-4668). https://doi.org/10.1145/1520340.1520717
Schöning J, Rohs M, Daiber F, Krüger A. Using Hands and Feet to Navigate and Manipulate Spatial Data. In: CHI EA '09: CHI '09 Extended Abstracts on Human Factors in Computing Systems. 2009. pp. 4663-4668. doi: 10.1145/1520340.1520717
Schöning, Johannes ; Rohs, Michael ; Daiber, Florian et al. / Using Hands and Feet to Navigate and Manipulate Spatial Data. CHI EA '09: CHI '09 Extended Abstracts on Human Factors in Computing Systems. 2009. pp. 4663-4668
BibTeX
@inproceedings{f3f3d9bbd6524e7c8e1d3f5f9f11db8a,
title = "Using Hands and Feet to Navigate and Manipulate Spatial Data",
abstract = "We demonstrate how multi-touch hand gestures in combination with foot gestures can be used to perform navigation tasks in interactive systems. The geospatial domain is an interesting example to show the advantages of the combination of both modalities because the complex user interfaces of common Geographic Information System (GIS) requires a high degree of expertise from its users. Recent developments in interactive surfaces that enable the construction of low cost multi-touch displays and relatively cheap sensor technology to detect foot gestures allow the deep exploration of these input modalities for GIS users with medium or low expertise. In this paper, we provide a categorization of multi-touch hand and foot gestures for the interaction with spatial data on a large-scale interactive wall. In addition we show with an initial evaluation how these gestures can improve the overall interaction with spatial information.",
keywords = "Foot interaction, Geographic information system (GIS), Large screens, Multi-touch, Spatial data, Tangible interfaces",
author = "Johannes Sch{\"o}ning and Michael Rohs and Florian Daiber and Antonio Kr{\"u}ger",
note = "Copyright: Copyright 2009 Elsevier B.V., All rights reserved.; 27th International Conference Extended Abstracts on Human Factors in Computing Systems, CHI 2009 ; Conference date: 04-04-2009 Through 09-04-2009",
year = "2009",
month = apr,
day = "4",
doi = "10.1145/1520340.1520717",
language = "English",
isbn = "9781605582474",
pages = "4663--4668",
booktitle = "CHI EA '09",

}

RIS

TY - GEN

T1 - Using Hands and Feet to Navigate and Manipulate Spatial Data

AU - Schöning, Johannes

AU - Rohs, Michael

AU - Daiber, Florian

AU - Krüger, Antonio

N1 - Copyright: Copyright 2009 Elsevier B.V., All rights reserved.

PY - 2009/4/4

Y1 - 2009/4/4

N2 - We demonstrate how multi-touch hand gestures in combination with foot gestures can be used to perform navigation tasks in interactive systems. The geospatial domain is an interesting example to show the advantages of the combination of both modalities, because the complex user interfaces of common Geographic Information Systems (GIS) require a high degree of expertise from their users. Recent developments in interactive surfaces that enable the construction of low-cost multi-touch displays, together with relatively cheap sensor technology to detect foot gestures, allow the deep exploration of these input modalities for GIS users with medium or low expertise. In this paper, we provide a categorization of multi-touch hand and foot gestures for the interaction with spatial data on a large-scale interactive wall. In addition, we show with an initial evaluation how these gestures can improve the overall interaction with spatial information.

AB - We demonstrate how multi-touch hand gestures in combination with foot gestures can be used to perform navigation tasks in interactive systems. The geospatial domain is an interesting example to show the advantages of the combination of both modalities, because the complex user interfaces of common Geographic Information Systems (GIS) require a high degree of expertise from their users. Recent developments in interactive surfaces that enable the construction of low-cost multi-touch displays, together with relatively cheap sensor technology to detect foot gestures, allow the deep exploration of these input modalities for GIS users with medium or low expertise. In this paper, we provide a categorization of multi-touch hand and foot gestures for the interaction with spatial data on a large-scale interactive wall. In addition, we show with an initial evaluation how these gestures can improve the overall interaction with spatial information.

KW - Foot interaction

KW - Geographic information system (GIS)

KW - Large screens

KW - Multi-touch

KW - Spatial data

KW - Tangible interfaces

UR - http://www.scopus.com/inward/record.url?scp=70349289940&partnerID=8YFLogxK

U2 - 10.1145/1520340.1520717

DO - 10.1145/1520340.1520717

M3 - Conference contribution

AN - SCOPUS:70349289940

SN - 9781605582474

SP - 4663

EP - 4668

BT - CHI EA '09

T2 - 27th International Conference Extended Abstracts on Human Factors in Computing Systems, CHI 2009

Y2 - 4 April 2009 through 9 April 2009

ER -