Multimodal motion processing in area V5/MT: Evidence from an artificial class of audio-visual events

Research output: Contribution to journal › Article › Research › peer review

Authors

  • Lukas Scheef
  • Henning Boecker
  • Marcel Daamen
  • Ursula Fehse
  • Martin W. Landsberg
  • Dirk Oliver Granath
  • Heinz Mechling
  • Alfred O. Effenberg

External Research Organisations

  • University of Bonn

Details

Original language: English
Pages (from-to): 94-104
Number of pages: 11
Journal: Brain research
Volume: 1252
Publication status: Published - 3 Feb 2009
Externally published: Yes

Abstract

Audio-visual integration in the human brain influences perception and precision of motor tasks. We tested audio-visual integration during height estimation when presenting video clips of counter movement jumps (CMJ), using sparse sampling fMRI at 3T. Employing the technique of "sonification", we created artificial auditory-visual motion events by transforming the ground reaction force of the CMJs into the auditory domain, modulating frequency and amplitude of the standard pitch "A" (440 Hz). We combined these "sonified" movements with either concordant or discordant visual movement displays. We hypothesized that processing of concordant audio-visual stimuli would enhance neural activity in audio-visual integration areas. Therefore, four conditions were compared: 1. unimodal visual, 2. unimodal auditory, 3. auditory + visual concordant, and 4. auditory + visual discordant. The unimodal conditions, when compared against each other, resulted in the expected activation maxima in primary visual and auditory cortex, respectively. Enhanced activation was found in area V5/MT bilaterally for the concordant multimodal, as compared to both unimodal, conditions. This effect was specific to the concordant bimodal condition, as evidenced by a direct comparison between the concordant and discordant bimodal conditions. Using "sonification", we provide evidence that area V5/MT is modulated by concordant auditory input despite the artificial nature of the stimuli, which argues for a role of this region in multimodal motion integration beyond the purely visual domain. This may explain previous behavioral evidence of facilitatory effects exerted by auditory motion stimuli on the perception of visual motion, and may provide the basis for future applications in motor learning and rehabilitation.
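The sonification described above maps a ground-reaction-force trace onto the frequency and amplitude of a 440 Hz tone. A minimal sketch of such a mapping is shown below; the mapping range (`freq_span`), per-sample duration, and audio rate are illustrative assumptions, not the authors' actual parameters.

```python
import numpy as np

def sonify_grf(force, sample_rate=44100, duration_per_sample=0.01,
               base_freq=440.0, freq_span=220.0, eps=1e-12):
    """Map a ground-reaction-force series to an audio signal whose pitch
    and amplitude follow the force. A sketch only: the paper's exact
    mapping parameters are not given in the abstract."""
    force = np.asarray(force, dtype=float)
    # Normalize force to [0, 1] so one envelope drives both modulations.
    env_src = (force - force.min()) / (force.max() - force.min() + eps)
    # Upsample the force trace to audio rate (linear interpolation).
    n_audio = int(len(env_src) * duration_per_sample * sample_rate)
    t = np.linspace(0.0, 1.0, n_audio)
    env = np.interp(t, np.linspace(0.0, 1.0, len(env_src)), env_src)
    # Instantaneous frequency: 440 Hz shifted up/down by the force envelope.
    inst_freq = base_freq + freq_span * (env - 0.5)
    # Integrate frequency to phase, then synthesize an amplitude-modulated tone.
    phase = 2.0 * np.pi * np.cumsum(inst_freq) / sample_rate
    return env * np.sin(phase)
```

For example, `sonify_grf([0.0, 0.3, 1.0, 0.6, 0.1])` yields a short tone that rises in pitch and loudness toward the force peak and falls off afterwards.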

Keywords

    Area V5/MT, Audio-visual, fMRI, Multisensory integration, Sonification, Sparse sampling

Cite this

Multimodal motion processing in area V5/MT: Evidence from an artificial class of audio-visual events. / Scheef, Lukas; Boecker, Henning; Daamen, Marcel et al.
In: Brain research, Vol. 1252, 03.02.2009, pp. 94-104.


Scheef, L, Boecker, H, Daamen, M, Fehse, U, Landsberg, MW, Granath, DO, Mechling, H & Effenberg, AO 2009, 'Multimodal motion processing in area V5/MT: Evidence from an artificial class of audio-visual events', Brain research, vol. 1252, pp. 94-104. https://doi.org/10.1016/j.brainres.2008.10.067
Scheef, L., Boecker, H., Daamen, M., Fehse, U., Landsberg, M. W., Granath, D. O., Mechling, H., & Effenberg, A. O. (2009). Multimodal motion processing in area V5/MT: Evidence from an artificial class of audio-visual events. Brain research, 1252, 94-104. https://doi.org/10.1016/j.brainres.2008.10.067
Scheef L, Boecker H, Daamen M, Fehse U, Landsberg MW, Granath DO et al. Multimodal motion processing in area V5/MT: Evidence from an artificial class of audio-visual events. Brain research. 2009 Feb 3;1252:94-104. doi: 10.1016/j.brainres.2008.10.067
Scheef, Lukas; Boecker, Henning; Daamen, Marcel et al. / Multimodal motion processing in area V5/MT: Evidence from an artificial class of audio-visual events. In: Brain research. 2009; Vol. 1252. pp. 94-104.
@article{03fed45e594a47058d811adefd5c65cb,
title = "Multimodal motion processing in area V5/MT: Evidence from an artificial class of audio-visual events",
abstract = "Audio-visual integration in the human brain influences perception and precision of motor tasks. We tested audio-visual integration during height estimation when presenting video clips of counter movement jumps (CMJ), using sparse sampling fMRI at 3T. Employing the technique of {"}sonification{"}, we created artificial auditory-visual motion events by transforming the ground reaction force of the CMJs into the auditory domain, modulating frequency and amplitude of the standard pitch {"}A{"} (440 Hz). We combined these {"}sonified{"} movements with either concordant or discordant visual movement displays. We hypothesized that processing of concordant audio-visual stimuli would enhance neural activity in audio-visual integration areas. Therefore, four conditions were compared: 1. unimodal visual, 2. unimodal auditory, 3. auditory + visual concordant, and 4. auditory + visual discordant. The unimodal conditions, when compared against each other, resulted in the expected activation maxima in primary visual and auditory cortex, respectively. Enhanced activation was found in area V5/MT bilaterally for the concordant multimodal, as compared to both unimodal, conditions. This effect was specific to the concordant bimodal condition, as evidenced by a direct comparison between the concordant and discordant bimodal conditions. Using {"}sonification{"}, we provide evidence that area V5/MT is modulated by concordant auditory input despite the artificial nature of the stimuli, which argues for a role of this region in multimodal motion integration beyond the purely visual domain. This may explain previous behavioral evidence of facilitatory effects exerted by auditory motion stimuli on the perception of visual motion, and may provide the basis for future applications in motor learning and rehabilitation.",
keywords = "Area V5/MT, Audio-visual, fMRI, Multisensory integration, Sonification, Sparse sampling",
author = "Lukas Scheef and Henning Boecker and Marcel Daamen and Ursula Fehse and Landsberg, {Martin W.} and Granath, {Dirk Oliver} and Heinz Mechling and Effenberg, {Alfred O.}",
year = "2009",
month = feb,
day = "3",
doi = "10.1016/j.brainres.2008.10.067",
language = "English",
volume = "1252",
pages = "94--104",
journal = "Brain research",
issn = "0006-8993",
publisher = "Elsevier",

}


TY - JOUR

T1 - Multimodal motion processing in area V5/MT

T2 - Evidence from an artificial class of audio-visual events

AU - Scheef, Lukas

AU - Boecker, Henning

AU - Daamen, Marcel

AU - Fehse, Ursula

AU - Landsberg, Martin W.

AU - Granath, Dirk Oliver

AU - Mechling, Heinz

AU - Effenberg, Alfred O.

PY - 2009/2/3

Y1 - 2009/2/3

N2 - Audio-visual integration in the human brain influences perception and precision of motor tasks. We tested audio-visual integration during height estimation when presenting video clips of counter movement jumps (CMJ), using sparse sampling fMRI at 3T. Employing the technique of "sonification", we created artificial auditory-visual motion events by transforming the ground reaction force of the CMJs into the auditory domain, modulating frequency and amplitude of the standard pitch "A" (440 Hz). We combined these "sonified" movements with either concordant or discordant visual movement displays. We hypothesized that processing of concordant audio-visual stimuli would enhance neural activity in audio-visual integration areas. Therefore, four conditions were compared: 1. unimodal visual, 2. unimodal auditory, 3. auditory + visual concordant, and 4. auditory + visual discordant. The unimodal conditions, when compared against each other, resulted in the expected activation maxima in primary visual and auditory cortex, respectively. Enhanced activation was found in area V5/MT bilaterally for the concordant multimodal, as compared to both unimodal, conditions. This effect was specific to the concordant bimodal condition, as evidenced by a direct comparison between the concordant and discordant bimodal conditions. Using "sonification", we provide evidence that area V5/MT is modulated by concordant auditory input despite the artificial nature of the stimuli, which argues for a role of this region in multimodal motion integration beyond the purely visual domain. This may explain previous behavioral evidence of facilitatory effects exerted by auditory motion stimuli on the perception of visual motion, and may provide the basis for future applications in motor learning and rehabilitation.

AB - Audio-visual integration in the human brain influences perception and precision of motor tasks. We tested audio-visual integration during height estimation when presenting video clips of counter movement jumps (CMJ), using sparse sampling fMRI at 3T. Employing the technique of "sonification", we created artificial auditory-visual motion events by transforming the ground reaction force of the CMJs into the auditory domain, modulating frequency and amplitude of the standard pitch "A" (440 Hz). We combined these "sonified" movements with either concordant or discordant visual movement displays. We hypothesized that processing of concordant audio-visual stimuli would enhance neural activity in audio-visual integration areas. Therefore, four conditions were compared: 1. unimodal visual, 2. unimodal auditory, 3. auditory + visual concordant, and 4. auditory + visual discordant. The unimodal conditions, when compared against each other, resulted in the expected activation maxima in primary visual and auditory cortex, respectively. Enhanced activation was found in area V5/MT bilaterally for the concordant multimodal, as compared to both unimodal, conditions. This effect was specific to the concordant bimodal condition, as evidenced by a direct comparison between the concordant and discordant bimodal conditions. Using "sonification", we provide evidence that area V5/MT is modulated by concordant auditory input despite the artificial nature of the stimuli, which argues for a role of this region in multimodal motion integration beyond the purely visual domain. This may explain previous behavioral evidence of facilitatory effects exerted by auditory motion stimuli on the perception of visual motion, and may provide the basis for future applications in motor learning and rehabilitation.

KW - Area V5/MT

KW - Audio-visual

KW - fMRI

KW - Multisensory integration

KW - Sonification

KW - Sparse sampling

UR - http://www.scopus.com/inward/record.url?scp=58349097927&partnerID=8YFLogxK

U2 - 10.1016/j.brainres.2008.10.067

DO - 10.1016/j.brainres.2008.10.067

M3 - Article

C2 - 19083992

AN - SCOPUS:58349097927

VL - 1252

SP - 94

EP - 104

JO - Brain research

JF - Brain research

SN - 0006-8993

ER -
