The goal of the Computational Cognitive Neuroimaging Group is to unravel the neural mechanisms and computational operations that enable the human brain to acquire, represent and retrieve knowledge about its multisensory environment.
Integration of multiple information streams is critical for an organism to interact effectively with its dynamic environment. In everyday life, we are exposed to a constant influx of sensory signals that provide uncertain information about the state of the external environment. We integrate intrinsic prior knowledge with that gained from past experiences to constrain the interpretation of incoming signals. Information integration is pervasive across all stages of perception and action. During perception, information abstracted from the sensory inputs needs to be integrated within and across the senses to form a coherent and reliable percept of the environment to guide actions. Within the cortical hierarchy, multisensory perception emerges in an interactive process with top-down prior information constraining the interpretation of the incoming sensory signals.
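At the computational level, this kind of reliability-weighted integration is commonly formalized as maximum-likelihood fusion of Gaussian cues: each signal is weighted by its relative reliability (inverse variance), and the fused estimate is less variable than either cue alone. The following minimal sketch illustrates that normative model; the function and variable names are illustrative, not part of any published codebase.

```python
import numpy as np

def fuse_cues(mu_a, sigma_a, mu_v, sigma_v):
    """Reliability-weighted fusion of two noisy Gaussian cues
    (maximum-likelihood estimation under forced fusion).

    mu_a, sigma_a : mean and standard deviation of the auditory estimate
    mu_v, sigma_v : mean and standard deviation of the visual estimate
    Returns the fused mean and its (reduced) standard deviation.
    """
    # Each cue is weighted by its relative reliability (inverse variance).
    w_a = sigma_v**2 / (sigma_a**2 + sigma_v**2)
    mu_fused = w_a * mu_a + (1.0 - w_a) * mu_v
    # The fused variance is smaller than either unisensory variance.
    var_fused = (sigma_a**2 * sigma_v**2) / (sigma_a**2 + sigma_v**2)
    return mu_fused, np.sqrt(var_fused)

# Example: an unreliable auditory cue (sigma = 2) is pulled toward
# a reliable visual cue (sigma = 1).
mu, sigma = fuse_cues(10.0, 2.0, 0.0, 1.0)  # mu = 2.0, sigma ≈ 0.894
```

Note that this forced-fusion model always integrates; how the brain decides when to integrate versus segregate is exactly the causal-inference question raised below.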
Current research questions:
Where and how are different types of sensory features combined at multiple levels of the cortical hierarchy? Are information integration processes governed by distinct computational principles across the cortical hierarchy?
How does the brain arbitrate between information integration and segregation? How does it implement Bayesian Causal Inference at the neural systems level?
How is the relative timing (e.g. synchrony) of multiple inputs coded across sensory modalities?
Is information integration automatic or dependent on attention, awareness and vigilance?
Which factors determine inter-trial and inter-subject variability in multi-sensory integration?
How does multisensory integration adapt to the statistical structure of a dynamic environment at multiple timescales, ranging from seconds to years?
How does ageing affect information integration and segregation? Can multisensory integration help to compensate for age-related changes?
To address these questions, we employ a multidisciplinary approach that combines the complementary strengths of psychophysics, functional imaging (fMRI, M/EEG), perturbation approaches such as concurrent TMS-fMRI, and neuropsychological studies in patients. Further, we characterize the response properties and information content of individual regions (e.g. using advanced multivariate pattern analyses) and establish the functional and effective connectivity between regions. To gain a more informed perspective on the underlying computational and neural mechanisms, we combine functional imaging with models of Bayesian inference and learning.
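The Bayesian Causal Inference model mentioned above infers whether two signals arose from a common cause and weights the fused and segregated estimates by the posterior probability of each causal structure. A minimal sketch of that normative model follows, assuming Gaussian likelihoods, a zero-mean Gaussian spatial prior, and model averaging as the decision strategy; all function and variable names are illustrative, not from any published implementation.

```python
import numpy as np

def bci_estimate(x_a, x_v, sigma_a, sigma_v, sigma_p, p_common):
    """Bayesian Causal Inference for one auditory (x_a) and one visual (x_v)
    signal, with sensory noise sigma_a / sigma_v, a zero-mean Gaussian
    spatial prior with std sigma_p, and prior probability p_common that
    the two signals share a common cause.
    Returns the model-averaged auditory estimate and the posterior
    probability of a common cause.
    """
    var_a, var_v, var_p = sigma_a**2, sigma_v**2, sigma_p**2

    # Likelihood of the signals under a common cause (C = 1):
    # both signals arise from one source drawn from the spatial prior.
    var_c = var_a * var_v + var_a * var_p + var_v * var_p
    like_c1 = np.exp(-0.5 * ((x_a - x_v)**2 * var_p
                             + x_a**2 * var_v
                             + x_v**2 * var_a) / var_c) / (2 * np.pi * np.sqrt(var_c))

    # Likelihood under independent causes (C = 2): each signal has its own source.
    like_c2 = (np.exp(-0.5 * x_a**2 / (var_a + var_p)) / np.sqrt(2 * np.pi * (var_a + var_p))
               * np.exp(-0.5 * x_v**2 / (var_v + var_p)) / np.sqrt(2 * np.pi * (var_v + var_p)))

    # Posterior probability of a common cause (Bayes' rule).
    post_c1 = p_common * like_c1 / (p_common * like_c1 + (1 - p_common) * like_c2)

    # Posterior-mean estimates under each causal structure.
    s_fused = (x_a / var_a + x_v / var_v) / (1 / var_a + 1 / var_v + 1 / var_p)
    s_a_alone = (x_a / var_a) / (1 / var_a + 1 / var_p)

    # Model averaging: weight the fused and segregated estimates.
    return post_c1 * s_fused + (1 - post_c1) * s_a_alone, post_c1
```

Intuitively, when the two signals are close, the posterior probability of a common cause rises and the estimate approaches full fusion; when they are far apart, it falls and the estimate reverts to the unisensory one, capturing the transition between integration and segregation.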
Rohe T, Noppeney U (in press) Cortical hierarchies perform Bayesian causal inference in multisensory perception. PLOS Biology.
Lee HL, Noppeney U (2014) Temporal prediction errors in auditory and visual cortices. Current Biology. 24(8):R309-10.
von Saldern S, Noppeney U (2013) Sensory and striatal areas integrate auditory and visual signals into behavioural benefits during motion discrimination. Journal of Neuroscience. 33(20):8841-9.
Leitão J, Thielscher A, Werner S, Pohmann R, Noppeney U (2013) Effects of TMS to parietal cortex on visual and auditory processing at the primary cortical level. Cerebral Cortex. 23(4):873-84.
Conrad V, Vitello MP, Noppeney U (2012) Interactions between apparent motion rivalry in vision and touch. Psychological Science. 23(8):940-8.
Giani AS, Ortiz EB, Belardinelli P, Kleiner M, Preissl H, Noppeney U (2012) Using steady-state responses in MEG to study information integration within and across auditory and visual senses. Neuroimage. 60(2):1478-89.
Lee HL, Noppeney U (2011) Long-term music training tunes how the brain temporally binds signals from multiple senses. PNAS. 108(51):E1441-50.
Lee HL, Noppeney U (2011) Physical and perceptual factors shape the neural mechanisms that integrate audiovisual signals in speech comprehension. Journal of Neuroscience. 31(31):11338-50.
Maier JX, Lucas M, Noppeney U (2011) Audio-visual synchrony detection in speech processing. Journal of Experimental Psychology: Human Perception and Performance. 37(1):245-56.
Lewis RK, Noppeney U (2010) Audiovisual synchrony improves motion discrimination via enhanced connectivity between early visual and auditory areas. Journal of Neuroscience. 30(37):12329-39.
Noppeney U, Ostwald D, Werner S (2010) Perceptual decisions formed by accumulation of audiovisual evidence in prefrontal cortex. Journal of Neuroscience. 30(21):7437-46.
Werner S, Noppeney U (2010) Distinct functional contributions of primary sensory and association areas to audiovisual integration in object categorization. Journal of Neuroscience. 30(7):2662-75.
Werner S, Noppeney U (2010) Superadditive responses in superior temporal sulcus predict audiovisual benefits in object categorization. Cerebral Cortex. 20(8):1829-42.
Sadaghiani S, Maier JX, Noppeney U (2009) Naturalistic, metaphoric and linguistic audio-visual interactions. Journal of Neuroscience. 29(20):6490-9.
Noppeney U, Josephs O, Hocking J, Price CJ, Friston KJ (2008) The effect of prior visual information on recognition of sounds and speech. Cerebral Cortex. 18(3):598-609.
Noppeney U, Patterson K, Tyler LK, Moss H, Stamatakis EA, Bright P, Mummery C, Price CJ (2007) Temporal lobe lesions and semantic impairment: A comparison of herpes simplex virus encephalitis and semantic dementia. Brain. 130(Pt 4):1138-47.
Noppeney U, Friston KJ, Ashburner J, Frackowiak R, Price CJ (2005) Early visual deprivation induces structural plasticity in gray and white matter. Current Biology. 15(13):R488-90.