Details
Original language | English
---|---
Pages (from-to) | 404
Journal | Frontiers in psychology
Volume | 9
Early online date | 29 March 2018
Publication status | Published - March 2018
Abstract
When two individuals interact in a collaborative task, such as carrying a sofa or a table, spatiotemporal coordination of their individual motor behavior usually emerges. In many cases, interpersonal coordination can arise without verbal communication, based on observation of the partner's movements and/or the object's movements. In this study, we investigate how social coupling between two individuals can emerge in a collaborative task under different modes of perceptual information. A visual reference condition was compared with three conditions providing new types of additional auditory feedback in real time: effect-based auditory feedback, performance-based auditory feedback, and combined effect/performance-based auditory feedback. We developed a new paradigm in which the actions of both participants continuously result in a seamlessly merged effect on an object simulated by a tablet computer application. Participants had to temporally synchronize their movements with a 90° phase difference and precisely adjust their finger dynamics to keep the object (a ball) rotating accurately on a given circular trajectory on the tablet. The results demonstrate that interpersonal coordination in a joint task can be altered in various ways by different kinds of additional auditory information.
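The 90° phase requirement is what yields the circular merged effect: if each partner sinusoidally drives one of two orthogonal axes and one partner lags the other by a quarter cycle, their combined output traces a circle. The sketch below illustrates this relation under assumed parameters (the tablet application's actual control mapping, movement frequency, and units are not given in the abstract):

```python
import numpy as np

# Hypothetical illustration, not the paper's actual control model:
# each partner drives one orthogonal axis with a sinusoidal finger
# movement; a 90° phase difference between them yields circular motion.
omega = 2 * np.pi * 0.5            # rotation frequency, 0.5 Hz (assumed)
radius = 1.0                       # target trajectory radius (assumed units)
t = np.linspace(0.0, 4.0, 400)     # 4 s of simulated movement

partner_a = radius * np.cos(omega * t)              # partner A drives the x-axis
partner_b = radius * np.cos(omega * t - np.pi / 2)  # partner B lags A by 90° (y-axis)

# cos(wt - pi/2) equals sin(wt), so (x, y) traces x^2 + y^2 = radius^2.
x, y = partner_a, partner_b
deviation = np.abs(np.hypot(x, y) - radius)
print(f"max deviation from circular trajectory: {deviation.max():.2e}")
```

Any departure from the 90° phase relation or from matched amplitudes makes `deviation` grow, which corresponds to the kind of trajectory error the participants had to minimize.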
Cite
Hwang, T. H., Schmitz, G., Klemmt, K., Brinkop, L., Ghai, S., Stoica, M., Maye, A., Blume, H., & Effenberg, A. O.: Effect- and Performance-Based Auditory Feedback on Interpersonal Coordination. In: Frontiers in psychology, Vol. 9, 03.2018, p. 404.
Publication: Contribution to journal › Article › Research › Peer review
TY - JOUR
T1 - Effect- and Performance-Based Auditory Feedback on Interpersonal Coordination
AU - Hwang, Tong Hun
AU - Schmitz, Gerd
AU - Klemmt, Kevin
AU - Brinkop, Lukas
AU - Ghai, Shashank
AU - Stoica, Mircea
AU - Maye, Alexander
AU - Blume, Holger
AU - Effenberg, Alfred O.
N1 - FUNDING: The authors acknowledge support by European Commission HORIZON2020-FETPROACT-2014 No. 641321. ACKNOWLEDGMENTS: The publication of this article was funded by the Open Access fund of Leibniz Universität Hannover.
PY - 2018/3
Y1 - 2018/3
AB - When two individuals interact in a collaborative task, such as carrying a sofa or a table, spatiotemporal coordination of their individual motor behavior usually emerges. In many cases, interpersonal coordination can arise without verbal communication, based on observation of the partner's movements and/or the object's movements. In this study, we investigate how social coupling between two individuals can emerge in a collaborative task under different modes of perceptual information. A visual reference condition was compared with three conditions providing new types of additional auditory feedback in real time: effect-based auditory feedback, performance-based auditory feedback, and combined effect/performance-based auditory feedback. We developed a new paradigm in which the actions of both participants continuously result in a seamlessly merged effect on an object simulated by a tablet computer application. Participants had to temporally synchronize their movements with a 90° phase difference and precisely adjust their finger dynamics to keep the object (a ball) rotating accurately on a given circular trajectory on the tablet. The results demonstrate that interpersonal coordination in a joint task can be altered in various ways by different kinds of additional auditory information.
KW - Auditory feedback
KW - Collaborative task
KW - Interpersonal coordination
KW - Movement sonification
KW - Sensorimotor contingencies theory
U2 - 10.3389/fpsyg.2018.00404
DO - 10.3389/fpsyg.2018.00404
M3 - Article
C2 - 29651263
AN - SCOPUS:85044835746
VL - 9
SP - 404
JO - Frontiers in psychology
JF - Frontiers in psychology
SN - 1664-1078
ER -