Details
| Original language | English |
|---|---|
| Article number | 429 |
| Pages (from-to) | 429-443 |
| Number of pages | 15 |
| Journal | Applied Sciences |
| Volume | 10 |
| Issue number | 2 |
| Publication status | Published - 7 Jan 2020 |
Abstract
Reaching movements are usually initiated by visual events and controlled visually and kinesthetically. Recently, studies have focused on the possible benefit of auditory information for localization tasks and for movement control. This explorative study aimed to investigate whether reaching space can be coded purely by auditory information. Therefore, the precision of reaching movements to merely acoustically coded target positions was analyzed. We studied the efficacy of acoustically effect-based and of additional acoustically performance-based instruction and feedback, as well as the role of visual movement control. Twenty-four participants executed reaching movements to merely acoustically presented, invisible target positions in three mutually perpendicular planes in front of them. Effector-endpoint trajectories were tracked using inertial sensors. Kinematic data regarding the three spatial dimensions and the movement velocity were sonified. Thus, acoustic instruction and real-time feedback on the movement trajectories and the target position of the hand were provided. The participants were able to align their reaching movements to the merely acoustically instructed targets. Reaching space can be coded merely acoustically; additional visual movement control does not enhance reaching performance. On the basis of these results, a remarkable benefit of kinematic movement acoustics for the neuromotor rehabilitation of everyday motor skills can be assumed.
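The abstract describes sonifying four kinematic parameters (three spatial dimensions plus movement velocity) into real-time auditory feedback. As a purely illustrative sketch of how such a mapping can work (the function `sonify_sample`, the choice of sound parameters, and all value ranges below are assumptions for illustration, not the authors' actual implementation):

```python
# Hypothetical kinematic sonification mapping: each tracked sample of the
# hand (normalized position x, y, z and normalized velocity v, all in [0, 1])
# is mapped to four independent sound parameters. A listener can then track
# the effector endpoint through sound alone, as in the study's acoustic
# instruction/feedback conditions.

def sonify_sample(x, y, z, v,
                  pitch_range=(220.0, 880.0),    # Hz; assumed two-octave range
                  loudness_range=(0.2, 1.0)):    # assumed amplitude range
    """Map one kinematic sample to sound parameters.

    x: lateral position  -> stereo pan in [-1, 1]
    y: vertical position -> pitch in pitch_range (Hz)
    z: depth             -> spectral brightness in [0, 1]
    v: velocity          -> loudness in loudness_range
    """
    p_lo, p_hi = pitch_range
    a_lo, a_hi = loudness_range
    return {
        "pitch_hz": p_lo + (p_hi - p_lo) * y,   # higher hand -> higher pitch
        "pan": 2.0 * x - 1.0,                   # left/right position -> pan
        "brightness": z,                        # nearer/farther -> timbre
        "loudness": a_lo + (a_hi - a_lo) * v,   # faster movement -> louder
    }

# A stationary hand at the center of the workspace:
params = sonify_sample(x=0.5, y=0.5, z=0.3, v=0.0)
```

In such a scheme the target position can be presented as a static sound (its pitch/pan/brightness "address"), and the moving hand as a continuously updated sound, so that matching the two sounds corresponds to reaching the target.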
Keywords
- Audition and movement, Kinematic movement sonification, Knowledge of performance, Knowledge of results, Motor control, Neuromotor rehabilitation, Proprioception and movement, Reaching, Sensory-motor integration, Visual-to-auditory substitution
ASJC Scopus subject areas
- Materials Science(all)
- General Materials Science
- Physics and Astronomy(all)
- Instrumentation
- Engineering(all)
- General Engineering
- Chemical Engineering(all)
- Process Chemistry and Technology
- Computer Science(all)
- Computer Science Applications
- Chemical Engineering(all)
- Fluid Flow and Transfer Processes
Cite this
In: Applied Sciences, Vol. 10, No. 2, 429, 07.01.2020, p. 429-443.
Research output: Contribution to journal › Article › Research › peer review
TY - JOUR
T1 - Auditory Coding of Reaching Space
AU - Fehse, Ursula
AU - Schmitz, Gerd
AU - Hartwig, Daniela
AU - Ghai, Shashank
AU - Brock, Heike
AU - Effenberg, Alfred O.
N1 - Funding information: This research was funded by the European Regional Development Fund (ERDF), project number W2-80118660.
PY - 2020/1/7
Y1 - 2020/1/7
N2 - Reaching movements are usually initiated by visual events and controlled visually and kinesthetically. Recently, studies have focused on the possible benefit of auditory information for localization tasks and for movement control. This explorative study aimed to investigate whether reaching space can be coded purely by auditory information. Therefore, the precision of reaching movements to merely acoustically coded target positions was analyzed. We studied the efficacy of acoustically effect-based and of additional acoustically performance-based instruction and feedback, as well as the role of visual movement control. Twenty-four participants executed reaching movements to merely acoustically presented, invisible target positions in three mutually perpendicular planes in front of them. Effector-endpoint trajectories were tracked using inertial sensors. Kinematic data regarding the three spatial dimensions and the movement velocity were sonified. Thus, acoustic instruction and real-time feedback on the movement trajectories and the target position of the hand were provided. The participants were able to align their reaching movements to the merely acoustically instructed targets. Reaching space can be coded merely acoustically; additional visual movement control does not enhance reaching performance. On the basis of these results, a remarkable benefit of kinematic movement acoustics for the neuromotor rehabilitation of everyday motor skills can be assumed.
AB - Reaching movements are usually initiated by visual events and controlled visually and kinesthetically. Recently, studies have focused on the possible benefit of auditory information for localization tasks and for movement control. This explorative study aimed to investigate whether reaching space can be coded purely by auditory information. Therefore, the precision of reaching movements to merely acoustically coded target positions was analyzed. We studied the efficacy of acoustically effect-based and of additional acoustically performance-based instruction and feedback, as well as the role of visual movement control. Twenty-four participants executed reaching movements to merely acoustically presented, invisible target positions in three mutually perpendicular planes in front of them. Effector-endpoint trajectories were tracked using inertial sensors. Kinematic data regarding the three spatial dimensions and the movement velocity were sonified. Thus, acoustic instruction and real-time feedback on the movement trajectories and the target position of the hand were provided. The participants were able to align their reaching movements to the merely acoustically instructed targets. Reaching space can be coded merely acoustically; additional visual movement control does not enhance reaching performance. On the basis of these results, a remarkable benefit of kinematic movement acoustics for the neuromotor rehabilitation of everyday motor skills can be assumed.
KW - Audition and movement
KW - Kinematic movement sonification
KW - Knowledge of performance
KW - Knowledge of results
KW - Motor control
KW - Neuromotor rehabilitation
KW - Proprioception and movement
KW - Reaching
KW - Sensory-motor integration
KW - Visual-to-auditory substitution
UR - http://www.scopus.com/inward/record.url?scp=85079761986&partnerID=8YFLogxK
U2 - 10.3390/app10020429
DO - 10.3390/app10020429
M3 - Article
VL - 10
SP - 429
EP - 443
JO - Applied Sciences
JF - Applied Sciences
SN - 2076-3417
IS - 2
M1 - 429
ER -