Details
| Original language | English |
| --- | --- |
| Pages (from-to) | 94-104 |
| Number of pages | 11 |
| Journal | Brain research |
| Volume | 1252 |
| Publication status | Published - 3 Feb 2009 |
| Externally published | Yes |
Abstract
Audio-visual integration in the human brain influences perception and precision of motor tasks. We tested audio-visual integration during height estimation when presenting video clips of counter movement jumps (CMJ), using sparse sampling fMRI at 3T. Employing the technique of "sonification", we created artificial auditory-visual motion events by transforming the ground reaction force of the CMJs into the auditory domain, modulating frequency and amplitude of the standard pitch "A" (440 Hz). We combined these "sonificated" movements with either concordant or discordant visual movement displays. We hypothesized that processing of concordant audio-visual stimuli would enhance neural activity in audio-visual integration areas. Therefore, four conditions were compared: 1. unimodal visual, 2. unimodal auditory, 3. auditory + visual concordant, and 4. auditory + visual discordant. The unimodal conditions, when compared against each other, resulted in expected activation maxima in primary visual and auditory cortex, respectively. Enhanced activation was found in area V5/MT bilaterally for the concordant multimodal condition, as compared to both unimodal conditions. This effect was specific for the concordant bimodal condition, as evidenced by a direct comparison between concordant and discordant bimodal conditions. Using "sonification", we provide evidence that area V5/MT is modulated by concordant auditory input, despite the artificial nature of the stimuli, which argues for a role of this region in multimodal motion integration, beyond the pure visual domain. This may explain previous behavioral evidence of facilitatory effects exerted by auditory motion stimuli on the perception of visual motion, and may provide the basis for future applications in motor learning and rehabilitation.
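The sonification procedure described above, mapping a jump's ground reaction force onto frequency and amplitude modulation of a 440 Hz carrier tone, can be sketched as follows. This is a minimal illustrative sketch, not the authors' actual stimulus pipeline: the function name `sonify_force`, the 100 Hz force-plate sampling rate, and the modulation ranges are assumptions for demonstration.

```python
import numpy as np

def sonify_force(force, sr=44100, base_freq=440.0, duration=None, freq_span=220.0):
    """Map a ground reaction force trace onto frequency and amplitude
    modulation of a 440 Hz carrier tone (illustrative "sonification").

    force     : 1-D array of force samples (e.g. from a force plate)
    sr        : audio sampling rate in Hz
    base_freq : carrier pitch, standard "A" = 440 Hz
    duration  : output duration in seconds (defaults to an assumed
                100 Hz force-plate sampling rate)
    freq_span : how far (in Hz) the pitch rises at maximum force
    """
    force = np.asarray(force, dtype=float)
    if duration is None:
        duration = len(force) / 100.0  # assumption: 100 Hz force sampling
    n = int(sr * duration)
    t = np.linspace(0.0, duration, n, endpoint=False)
    # Resample the force trace to audio rate, then normalise to [0, 1].
    src_t = np.linspace(0.0, duration, len(force), endpoint=False)
    f = np.interp(t, src_t, force)
    span = f.max() - f.min()
    f = (f - f.min()) / (span or 1.0)
    inst_freq = base_freq + freq_span * f   # pitch follows force
    amplitude = 0.2 + 0.8 * f               # loudness follows force
    # Integrate instantaneous frequency to get the oscillator phase.
    phase = 2.0 * np.pi * np.cumsum(inst_freq) / sr
    return amplitude * np.sin(phase)
```

A discordant stimulus, in this framing, would simply pair the audio generated from one jump's force trace with the video of a different jump.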
Keywords
- Area V5/MT, Audio-visual, fMRI, Multisensory integration, Sonification, Sparse sampling
ASJC Scopus subject areas
- Neuroscience(all)
- Biochemistry, Genetics and Molecular Biology(all)
- Molecular Biology
- Medicine(all)
- Clinical Neurology
- Developmental Biology
In: Brain research, Vol. 1252, 03.02.2009, p. 94-104.
Research output: Contribution to journal › Article › Research › peer review
TY - JOUR
T1 - Multimodal motion processing in area V5/MT
T2 - Evidence from an artificial class of audio-visual events
AU - Scheef, Lukas
AU - Boecker, Henning
AU - Daamen, Marcel
AU - Fehse, Ursula
AU - Landsberg, Martin W.
AU - Granath, Dirk Oliver
AU - Mechling, Heinz
AU - Effenberg, Alfred O.
PY - 2009/2/3
Y1 - 2009/2/3
N2 - Audio-visual integration in the human brain influences perception and precision of motor tasks. We tested audio-visual integration during height estimation when presenting video clips of counter movement jumps (CMJ), using sparse sampling fMRI at 3T. Employing the technique of "sonification", we created artificial auditory-visual motion events by transforming the ground reaction force of the CMJs into the auditory domain, modulating frequency and amplitude of the standard pitch "A" (440 Hz). We combined these "sonificated" movements with either concordant or discordant visual movement displays. We hypothesized that processing of concordant audio-visual stimuli would enhance neural activity in audio-visual integration areas. Therefore, four conditions were compared: 1. unimodal visual, 2. unimodal auditory, 3. auditory + visual concordant, and 4. auditory + visual discordant. The unimodal conditions, when compared against each other, resulted in expected activation maxima in primary visual and auditory cortex, respectively. Enhanced activation was found in area V5/MT bilaterally for the concordant multimodal condition, as compared to both unimodal conditions. This effect was specific for the concordant bimodal condition, as evidenced by a direct comparison between concordant and discordant bimodal conditions. Using "sonification", we provide evidence that area V5/MT is modulated by concordant auditory input, despite the artificial nature of the stimuli, which argues for a role of this region in multimodal motion integration, beyond the pure visual domain. This may explain previous behavioral evidence of facilitatory effects exerted by auditory motion stimuli on the perception of visual motion, and may provide the basis for future applications in motor learning and rehabilitation.
KW - Area V5/MT
KW - Audio-visual
KW - fMRI
KW - Multisensory integration
KW - Sonification
KW - Sparse sampling
UR - http://www.scopus.com/inward/record.url?scp=58349097927&partnerID=8YFLogxK
U2 - 10.1016/j.brainres.2008.10.067
DO - 10.1016/j.brainres.2008.10.067
M3 - Article
C2 - 19083992
AN - SCOPUS:58349097927
VL - 1252
SP - 94
EP - 104
JO - Brain research
JF - Brain research
SN - 0006-8993
ER -