How do we process the multisensory world?
A recently published paper demonstrates that signals from different senses are partially integrated in parts of the brain previously assumed to process signals from only one sense, such as vision.
Imagine you are crossing a street in busy traffic: the engine noise of trucks, a motorbike speeding past, the smell of smoke and fumes, and the sight of other pedestrians. How do we integrate these signals into a coherent perception of the environment?
The research paper, co-authored by Professor Uta Noppeney, uses brain imaging and advanced analysis methods to demonstrate that signals from different senses are partially integrated already in primary sensory areas previously assumed to process signals from only one sense. Most importantly, at higher levels of the cortical hierarchy, such as the parietal cortices, audiovisual signals were integrated with weights determined by their reliability and their relevance to the task at hand. These higher-order areas can thereby form spatial maps indicating which locations are important and should be prioritised, based on signals from different senses and current task demands. These maps in turn enable us to orient ourselves effectively and interact with our multisensory world.
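The "weighting by reliability" described here follows the standard maximum-likelihood cue-combination scheme, in which each sense's estimate is weighted by its inverse variance. A minimal sketch of that scheme, with purely illustrative numbers not taken from the study:

```python
def integrate(x_vis, var_vis, x_aud, var_aud):
    """Combine a visual and an auditory location estimate.

    Each cue's weight is its reliability (inverse variance),
    normalised so the two weights sum to 1.
    """
    r_vis, r_aud = 1.0 / var_vis, 1.0 / var_aud
    w_vis = r_vis / (r_vis + r_aud)
    w_aud = r_aud / (r_vis + r_aud)
    fused = w_vis * x_vis + w_aud * x_aud
    # The fused estimate is more reliable (lower variance) than either cue alone.
    fused_var = 1.0 / (r_vis + r_aud)
    return fused, fused_var

# Illustrative example: a precise visual cue dominates a noisy auditory one,
# so the fused location lands close to the visual estimate.
loc, var = integrate(x_vis=10.0, var_vis=1.0, x_aud=4.0, var_aud=4.0)
```

Here the visual cue gets weight 0.8 and the auditory cue 0.2, so the fused location is 8.8 with variance 0.8, lower than either cue's variance.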
The results provide a novel perspective on the functional organisation of sensory processing in the human neocortex.
Distinct Computational Principles Govern Multisensory Integration in Primary Sensory and Association Cortices. Rohe T, Noppeney U. Curr Biol. 2016 Feb 3. pii: S0960-9822(15)01587-0. doi: 10.1016/j.cub.2015.12.056.