The goal of the Computational Cognitive Neuroimaging Group is to better understand the neural systems that allow us to acquire, represent and retrieve knowledge about our multisensory environment. We address this central research question from three complementary perspectives: (1) Multisensory Integration, (2) Language and (3) Concept Learning.
Our current research focuses primarily on how the human brain integrates information from multiple senses with prior knowledge to form a coherent and more reliable percept of its environment. Within the cortical hierarchy, multisensory perception emerges through an interactive process in which top-down prior information constrains the interpretation of incoming sensory signals.
Our approach is therefore to characterize the response properties of individual regions and to establish the functional and effective connectivity between them. We combine the complementary strengths of psychophysics, functional imaging (fMRI, M/EEG), perturbation approaches such as concurrent TMS-fMRI, and neuropsychological studies in patients. To gain a more informed perspective on the underlying computational and neural mechanisms, we combine functional imaging with models of Bayesian inference and learning (see the sketch below).
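To make the Bayesian framing concrete, the following is a minimal sketch of reliability-weighted cue fusion under Gaussian assumptions, the textbook form of the inference models referred to above. Function names and parameter values are illustrative only and are not taken from any of the studies listed below.

```python
import numpy as np

# Illustrative sketch: maximum-a-posteriori estimate of a stimulus feature
# (e.g. spatial location) from auditory and visual cues, assuming Gaussian
# likelihoods and an optional Gaussian prior. All names are hypothetical.

def fuse_gaussian_cues(means, sigmas, prior_mean=0.0, prior_sigma=np.inf):
    """Combine Gaussian cue likelihoods with a Gaussian prior.

    Each cue contributes in proportion to its precision (1/variance),
    so more reliable cues dominate the fused estimate.
    """
    precisions = [1.0 / s**2 for s in sigmas]
    means = list(means)
    if np.isfinite(prior_sigma):
        # An infinitely wide prior is flat and drops out of the estimate.
        means.append(prior_mean)
        precisions.append(1.0 / prior_sigma**2)
    total_precision = sum(precisions)
    post_mean = sum(m * p for m, p in zip(means, precisions)) / total_precision
    post_sigma = (1.0 / total_precision) ** 0.5
    return post_mean, post_sigma

# Example: a reliable visual cue at 2 deg and a noisy auditory cue at 8 deg.
mean, sigma = fuse_gaussian_cues(means=[2.0, 8.0], sigmas=[1.0, 4.0])
print(f"fused estimate: {mean:.2f} deg (sd {sigma:.2f})")
```

The key property is that the fused estimate is pulled toward the more reliable cue (here, vision) while its variance is lower than that of either cue alone, which is the behavioral signature of near-optimal multisensory integration.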
Conrad V, Vitello MP, Noppeney U (in press) Interactions between apparent motion rivalry in vision and touch. Psychological Science.
Lee HL, Noppeney U (2011) Long-term music training tunes how the brain temporally binds signals from multiple senses. Proceedings of the National Academy of Sciences. 108(51):E1441-50.
Giani AS, Ortiz EB, Belardinelli P, Kleiner M, Preissl H, Noppeney U (in press) Using steady-state responses in MEG to study information integration within and across auditory and visual senses. NeuroImage.
Lee HL, Noppeney U (2011) Physical and perceptual factors shape the neural mechanisms that integrate audiovisual signals in speech comprehension. Journal of Neuroscience. 31(31):11338-50.
Maier JX, Lucas M, Noppeney U (2011) Audio-visual synchrony detection in speech processing. Journal of Experimental Psychology: Human Perception and Performance. 37(1):245-56.
Lewis RK, Noppeney U (2010) Audiovisual synchrony improves motion discrimination via enhanced connectivity between early visual and auditory areas. Journal of Neuroscience. 30(37):12329-39.
Noppeney U, Ostwald D, Werner S (2010) Perceptual decisions formed by accumulation of audiovisual evidence in prefrontal cortex. Journal of Neuroscience. 30(21):7437-46.
Werner S, Noppeney U (2010) Distinct functional contributions of primary sensory and association areas to audiovisual integration in object categorization. Journal of Neuroscience. 30(7):2662-75.
Werner S, Noppeney U (2010) Superadditive responses in superior temporal sulcus predict audiovisual benefits in object categorization. Cerebral Cortex. 20(8):1829-42.
Sadaghiani S, Maier JX, Noppeney U (2009) Naturalistic, metaphoric and linguistic audio-visual interactions. Journal of Neuroscience. 29(20):6490-9.
Noppeney U, Josephs O, Hocking J, Price CJ, Friston KJ (2008) The effect of prior visual information on recognition of sounds and speech. Cerebral Cortex. 18(3):598-609.
Noppeney U, Patterson K, Tyler LK, Moss H, Stamatakis EA, Bright P, Mummery C, Price CJ (2007) Temporal lobe lesions and semantic impairment: A comparison of herpes simplex virus encephalitis and semantic dementia. Brain. 130(Pt 4):1138-47.
Noppeney U, Friston KJ, Ashburner J, Frackowiak R, Price CJ (2005) Early visual deprivation induces structural plasticity in gray and white matter. Current Biology. 15(13):R488-90.