Professor Uta Noppeney PhD

Chair in Computational Neuroscience

School of Psychology

Contact details

School of Psychology
University of Birmingham
Edgbaston
Birmingham
B15 2TT
UK

About

Professor Noppeney investigates how the human brain integrates information across the senses to interact effectively with our complex, dynamic environment. To characterize the underlying neural and computational mechanisms, the Computational Cognitive Neuroimaging Group combines psychophysics, functional imaging (fMRI and EEG/MEG) and computational modelling.

Uta Noppeney is director of the Computational Neuroscience & Cognitive Robotics Centre at the University of Birmingham.

Qualifications

  • Final Medical State Exams (Freiburg University, Germany, 1997)
  • Dr. med. (Freiburg University, Germany, 1998)
  • PhD (University College London, 2004)

Biography

Uta Noppeney studied medicine and philosophy at Freiburg University, University College London and Johns Hopkins University. Her medical doctoral thesis explored how neuroscientific concepts emerged through interactions between neuroscience, philosophy and psychology at the beginning of the 20th century. After training in neurology at the University Hospital in Aachen, she held research positions at Magdeburg University and the Wellcome Trust Centre for Neuroimaging. Her PhD investigated the neural basis of semantic and language processing using neuroimaging. In 2005, she moved to the Max Planck Institute for Biological Cybernetics in Tübingen. In 2011, she joined the Computational Neuroscience & Cognitive Robotics Centre at the University of Birmingham. Her current research focuses on multisensory perception, adaptation and learning.

Teaching

Uta Noppeney lectures on the Advanced Neuroimaging course of the MSc in Neuroimaging and also teaches on the CNCR Foundation and Mind, Brain and Models courses of the MSc in Computational Neuroscience and Cognitive Robotics.

Postgraduate supervision

Uta Noppeney is interested in supervising graduate students on topics related to multisensory integration and categorization. Knowledge of cognitive neuroscience, prior experience in functional imaging and a strong quantitative background are an advantage.

PhD opportunities

Research

Research interests

The goal of the Computational Cognitive Neuroimaging Group is to unravel the neural mechanisms and computational operations that enable the human brain to acquire, represent and retrieve knowledge about our multisensory environment.

Integration of multiple information streams is critical for an organism to interact effectively with its dynamic environment. In everyday life, we are exposed to a constant influx of sensory signals that provide uncertain information about the state of the external environment. We integrate intrinsic prior knowledge with that gained from past experiences to constrain the interpretation of incoming signals. Information integration is pervasive across all stages of perception and action. During perception, information abstracted from the sensory inputs needs to be integrated within and across the senses to form a coherent and reliable percept of the environment to guide actions. Within the cortical hierarchy, multisensory perception emerges in an interactive process with top-down prior information constraining the interpretation of the incoming sensory signals.
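A standard formalization of this idea (an illustrative textbook account, not the group's specific model) is reliability-weighted cue combination: each sensory estimate is weighted by its reliability (inverse variance), and the fused estimate is more precise than either cue alone. A minimal sketch in Python, with the function name and all numbers hypothetical:

```python
import numpy as np


def fuse(mu_a, var_a, mu_v, var_v):
    """Reliability-weighted (maximum-likelihood) fusion of two Gaussian cues.

    Each cue is weighted by its reliability (1 / variance); the fused
    estimate has lower variance than either cue alone.
    """
    w_a = (1 / var_a) / (1 / var_a + 1 / var_v)
    mu_fused = w_a * mu_a + (1 - w_a) * mu_v
    var_fused = 1 / (1 / var_a + 1 / var_v)
    return mu_fused, var_fused


# Hypothetical example: a precise visual location cue (variance 1)
# pulls the fused estimate away from a noisy auditory cue (variance 4).
mu, var = fuse(mu_a=10.0, var_a=4.0, mu_v=0.0, var_v=1.0)
# The fused estimate lies nearer the reliable visual cue, and its
# variance is smaller than that of either single cue.
```

The same weighting scheme extends to combining sensory evidence with a prior: the prior simply enters as one more inverse-variance-weighted term.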

Current research questions:

  • Where and how are different types of sensory features combined at multiple levels of the cortical hierarchy? Are information integration processes governed by distinct computational principles across the cortical hierarchy?
  • How does the brain arbitrate between information integration and segregation? How does it implement Bayesian Causal Inference at the neural systems level?
  • How is the relative timing (e.g. synchrony) of multiple inputs coded across sensory modalities?
  • Is information integration automatic or dependent on attention, awareness and vigilance?
  • Which factors determine inter-trial and inter-subject variability in multisensory integration?
  • How does multisensory integration adapt to the statistical structure of our dynamic environment at multiple timescales, ranging from seconds to years?
  • How does ageing affect information integration and segregation? Can multisensory integration help to compensate for age-related changes?
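The Bayesian Causal Inference mentioned in the second question refers to the model family introduced by Körding et al. (2007): the observer first infers whether two signals arose from a common cause, then weights the fused and segregated estimates accordingly. The sketch below is a generic illustration of that published model under its standard assumptions (Gaussian likelihoods, a zero-mean Gaussian spatial prior, model averaging), not the group's own implementation; the function name is hypothetical.

```python
import numpy as np


def bci_estimate(x_a, x_v, sigma_a, sigma_v, sigma_p, p_common):
    """Bayesian Causal Inference for an audiovisual signal pair.

    Returns the posterior probability of a common cause and the
    model-averaged auditory location estimate.
    """
    va, vv, vp = sigma_a**2, sigma_v**2, sigma_p**2

    # Likelihood of the signal pair given a common cause (C = 1)
    var1 = va * vv + va * vp + vv * vp
    like_c1 = np.exp(
        -0.5 * ((x_a - x_v) ** 2 * vp + x_a**2 * vv + x_v**2 * va) / var1
    ) / (2 * np.pi * np.sqrt(var1))

    # Likelihood given independent causes (C = 2): the two signals factorize
    var2a, var2v = va + vp, vv + vp
    like_c2 = (
        np.exp(-0.5 * x_a**2 / var2a) / np.sqrt(2 * np.pi * var2a)
        * np.exp(-0.5 * x_v**2 / var2v) / np.sqrt(2 * np.pi * var2v)
    )

    # Posterior probability of a common cause (Bayes' rule)
    post_c1 = like_c1 * p_common / (
        like_c1 * p_common + like_c2 * (1 - p_common)
    )

    # Location estimates under each causal structure
    s_fused = (x_a / va + x_v / vv) / (1 / va + 1 / vv + 1 / vp)
    s_a_segregated = (x_a / va) / (1 / va + 1 / vp)

    # Model averaging: weight each estimate by its posterior probability
    s_a = post_c1 * s_fused + (1 - post_c1) * s_a_segregated
    return post_c1, s_a
```

As expected, nearby signals yield a high posterior probability of a common cause (and hence strong integration), while widely discrepant signals are largely segregated.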

To address these questions, we employ a challenging multidisciplinary approach combining the complementary strengths of psychophysics, functional imaging (fMRI, M/EEG), perturbation approaches such as concurrent TMS-fMRI and neuropsychological studies in patients. Further, we characterize the response properties and the information content (e.g. advanced multivariate pattern analysis) of individual regions and establish the functional and effective connectivity between regions. To gain a more informed perspective on the underlying computational and neural mechanisms, we combine functional imaging with models of Bayesian inference and learning.
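To illustrate the multivariate pattern analysis mentioned above (a generic toy example on synthetic data, not the group's analysis pipeline), a decoder can test whether a region's response pattern carries information about the stimulus condition: if held-out trials are classified above chance, the pattern is informative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "voxel" patterns: 40 trials x 50 voxels, two stimulus
# conditions whose mean patterns differ by a hypothetical activation shift.
n_trials, n_voxels = 40, 50
labels = np.repeat([0, 1], n_trials // 2)
shift = rng.normal(0.0, 1.0, n_voxels)
patterns = rng.normal(0.0, 1.0, (n_trials, n_voxels)) + np.outer(labels, shift)

# Leave-one-trial-out nearest-centroid decoding: classify each held-out
# trial by its distance to the two class-mean patterns from training trials.
correct = 0
for i in range(n_trials):
    train = np.delete(np.arange(n_trials), i)
    c0 = patterns[train][labels[train] == 0].mean(axis=0)
    c1 = patterns[train][labels[train] == 1].mean(axis=0)
    pred = int(np.linalg.norm(patterns[i] - c1) < np.linalg.norm(patterns[i] - c0))
    correct += int(pred == labels[i])

accuracy = correct / n_trials  # well above the 0.5 chance level here
```

Real analyses of this kind use cross-validated classifiers on fMRI or M/EEG patterns, with permutation tests to establish the chance level.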

Other activities

Uta Noppeney serves on the editorial boards of the Journal of Neuroscience, NeuroImage and Frontiers in Integrative Neuroscience.

Publications

Lee HL, Noppeney U (in press) Temporal prediction errors in visual and auditory cortices. Current Biology.

von Saldern S, Noppeney U (2013) Sensory and striatal areas integrate auditory and visual signals into behavioural benefits during motion discrimination. Journal of Neuroscience. 33(20):8841-9.

Leitão J, Thielscher A, Werner S, Pohmann R, Noppeney U (2013) Effects of TMS to parietal cortex on visual and auditory processing at the primary cortical level. Cerebral Cortex. 23(4):873-84.

Conrad V, Vitello MP, Noppeney U (2012) Interactions between apparent motion rivalry in vision and touch. Psychological Science. 23(8):940-8.

Giani AS, Ortiz EB, Belardinelli P, Kleiner M, Preissl H, Noppeney U (2012) Using steady-state responses in MEG to study information integration within and across auditory and visual senses. NeuroImage. 60(2):1478-89.

Lee HL, Noppeney U (2011) Long-term music training tunes how the brain temporally binds signals from multiple senses. PNAS. 108(51):E1441-50.

Lee HL, Noppeney U (2011) Physical and perceptual factors shape the neural mechanisms that integrate audiovisual signals in speech comprehension. Journal of Neuroscience. 31(31):11338-50.

Maier JX, Lucas M, Noppeney U (2011) Audio-visual synchrony detection in speech processing. Journal of Experimental Psychology: Human Perception and Performance. 37(1):245-56.

Lewis RK, Noppeney U (2010) Audiovisual Synchrony Improves Motion Discrimination via Enhanced Connectivity between Early Visual and Auditory Areas. Journal of Neuroscience. 30(37):12329-39.

Noppeney U, Ostwald D, Werner S (2010) Perceptual decisions formed by accumulation of audiovisual evidence in prefrontal cortex. Journal of Neuroscience. 30(21):7437-46.

Werner S, Noppeney U (2010) Distinct functional contributions of primary sensory and association areas to audiovisual integration in object categorization. Journal of Neuroscience. 30(7):2662-75.

Werner S, Noppeney U (2010) Superadditive responses in superior temporal sulcus predict audiovisual benefits in object categorization. Cerebral Cortex. 20(8):1829-42.

Sadaghiani S, Maier JX, Noppeney U (2009) Naturalistic, metaphoric and linguistic audio-visual interactions. Journal of Neuroscience. 29(20):6490-9.

Noppeney U, Josephs O, Hocking J, Price CJ, Friston KJ (2008) The effect of prior visual information on recognition of sounds and speech. Cerebral Cortex. 18(3):598-609.

Noppeney U, Patterson K, Tyler LK, Moss H, Stamatakis EA, Bright P, Mummery C, Price CJ (2007) Temporal lobe lesions and semantic impairment: A comparison of herpes simplex virus encephalitis and semantic dementia. Brain. 130(Pt 4):1138-47.

Noppeney U, Friston KJ, Ashburner J, Frackowiak R, Price CJ (2005) Early visual deprivation induces structural plasticity in gray and white matter. Current Biology. 15 (13): R488-90.
