Details
| Original language | English |
| --- | --- |
| Title of host publication | TEI '13 |
| Subtitle | Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction |
| Pages | 185-192 |
| Number of pages | 8 |
| Publication status | Published - 10 Feb 2013 |
| Event | 7th ACM International Conference on Tangible, Embedded and Embodied Interaction, TEI 2013 - Barcelona, Spain. Duration: 10 Feb 2013 → 13 Feb 2013 |
Abstract
We present a wearable interface that consists of motion sensors. As the interface can be worn on the user's finger (as a ring) or fixed to it (with nail polish), the device controlled by finger gestures can be any generic object, provided it has an interface for receiving the sensor's signal. We implemented four gestures: tap, release, swipe, and pitch, all of which can be executed with a finger of the hand holding the device. In a user study we tested gesture appropriateness for the index finger at the back of a hand-held tablet that offered three different form factors on its rear: flat, convex, and concave (undercut). Gesture performance was equally good for all three shapes; however, pitch performed better than swipe on all surfaces. The proposed interface is a step towards the idea of ubiquitous computing and the vision of seamless interaction with grasped objects. As an initial application scenario we implemented a camera control that allows the brightness of a common SLR device to be configured using our tested gestures.
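The abstract names four gestures (tap, release, swipe, pitch) detected from a finger-worn motion sensor. As a purely illustrative sketch of how such a gesture vocabulary might be mapped from sensor readings, the following Python snippet classifies pairs of consecutive samples with naive threshold rules. All field names and threshold values here are hypothetical assumptions, not the paper's actual recognition method.

```python
# Illustrative sketch (NOT the paper's implementation): naive threshold
# rules mapping consecutive motion-sensor samples to the four gestures
# named in the abstract. Field names and thresholds are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Sample:
    accel_z: float      # acceleration normal to the surface (g)
    accel_x: float      # acceleration along the surface (g)
    gyro_pitch: float   # angular velocity of the fingertip (deg/s)
    touching: bool      # contact state inferred from the sensor

def classify(prev: Sample, curr: Sample) -> Optional[str]:
    """Map two consecutive samples to a gesture label, or None."""
    if not prev.touching and curr.touching and curr.accel_z > 1.5:
        return "tap"       # sharp impact normal to the surface
    if prev.touching and not curr.touching:
        return "release"   # contact lost
    if curr.touching and abs(curr.accel_x) > 0.8:
        return "swipe"     # lateral motion while in contact
    if curr.touching and abs(curr.gyro_pitch) > 30.0:
        return "pitch"     # fingertip rotation while in contact
    return None

# Tiny usage example with synthetic readings:
idle = Sample(accel_z=0.0, accel_x=0.0, gyro_pitch=0.0, touching=False)
impact = Sample(accel_z=2.0, accel_x=0.1, gyro_pitch=0.0, touching=True)
print(classify(idle, impact))  # tap
```

A real recognizer would of course operate on windows of samples with filtering or a trained classifier; the point here is only the gesture set and its contact/motion distinctions.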
ASJC Scopus subject areas
- Computer Science (all)
- Human-Computer Interaction
Cite
TEI '13: Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction. 2013. S. 185-192.
Publication: Contribution to book/report/conference proceedings › Conference paper › Research › Peer-reviewed
TY - GEN
T1 - Tickle
T2 - 7th ACM International Conference on Tangible, Embedded and Embodied Interaction, TEI 2013
AU - Wolf, Katrin
AU - Schleicher, Robert
AU - Kratz, Sven
AU - Rohs, Michael
N1 - Copyright: Copyright 2013 Elsevier B.V., All rights reserved.
PY - 2013/2/10
Y1 - 2013/2/10
N2 - We present a wearable interface that consists of motion sensors. As the interface can be worn on the user's fingers (as a ring) or fixed to it (with nail polish), the device controlled by finger gestures can be any generic object, provided they have an interface for receiving the sensor's signal. We implemented four gestures: tap, release, swipe, and pitch, all of which can be executed with a finger of the hand holding the device. In a user study we tested gesture appropriateness for the index finger at the back of a hand-held tablet that offered three different form factors on its rear: flat, convex, and concave (undercut). For all three shapes, the gesture performance was equally good, however pitch performed better on all surfaces than swipe. The proposed interface is an example towards the idea of ubiquitous computing and the vision of seamless interactions with grasped objects. As an initial application scenario we implemented a camera control that allows the brightness to be configured using our tested gestures on a common SLR device.
AB - We present a wearable interface that consists of motion sensors. As the interface can be worn on the user's fingers (as a ring) or fixed to it (with nail polish), the device controlled by finger gestures can be any generic object, provided they have an interface for receiving the sensor's signal. We implemented four gestures: tap, release, swipe, and pitch, all of which can be executed with a finger of the hand holding the device. In a user study we tested gesture appropriateness for the index finger at the back of a hand-held tablet that offered three different form factors on its rear: flat, convex, and concave (undercut). For all three shapes, the gesture performance was equally good, however pitch performed better on all surfaces than swipe. The proposed interface is an example towards the idea of ubiquitous computing and the vision of seamless interactions with grasped objects. As an initial application scenario we implemented a camera control that allows the brightness to be configured using our tested gestures on a common SLR device.
KW - Back-of-device
KW - Gesture
KW - Grasp
KW - Pitch
KW - Rear
KW - Swipe
UR - http://www.scopus.com/inward/record.url?scp=84876874957&partnerID=8YFLogxK
U2 - 10.1145/2460625.2460654
DO - 10.1145/2460625.2460654
M3 - Conference contribution
AN - SCOPUS:84876874957
SN - 9781450318983
SP - 185
EP - 192
BT - TEI '13
Y2 - 10 February 2013 through 13 February 2013
ER -