New project creating AI model to help people challenge misinformation wins nearly £1m funding

The NeuroCognitive Shield project will work with diverse community groups in Birmingham and use brain mapping techniques to develop an artificial intelligence model.


A new AI-powered model could help identify the communities at greatest risk of being targeted by misinformation.

False or misleading online stories now travel faster than official facts, eroding public trust in elections, vaccination campaigns, and civic institutions. The challenge is magnified in ‘super-diverse’ cities such as Birmingham, where residents have roots in more than 180 countries and speak well over 100 languages. Messages move through different cultural groups and social networks in different languages, so a correction that reaches one group may never reach another.

To help combat this, a new University of Birmingham project has secured more than £986,000 from UK Research and Innovation (UKRI), the national funding agency. The NeuroCognitive Shield project will build an AI model to help individuals recognise when they are at risk of uncritically accepting or rejecting new information.

The NeuroCognitive Shield project will work with diverse community groups in Birmingham and use state-of-the-art brain mapping techniques to observe, in real time, how individuals from different backgrounds respond to trustworthy and false digital content.


René Lindstädt, Professor of Government and Data Science and project lead, said: “Deepfakes, misinformation, and fake news are more prevalent than ever in our increasingly online world. As a result, countering them is one of the most pressing issues facing modern democracies today. We have already seen the impact that false information can have on healthcare, on politics and in some cases, even resulting in real-world violence.

“Most existing countermeasures to misinformation treat audiences as if they were identical, but in an age of super-diverse cities with multiple languages spoken and different communities reflected in the population, this one-size-fits-all approach does not reach all areas that it needs to.”

The data from the brain mapping exercise will be used to build an AI model that tailors a message to activate the critical-thinking regions of the recipient’s brain. This will help people recognise when they are at risk of uncritically accepting or rejecting information (a so-called ‘quick-accept’ state).

Professor Slava Jankin, Chair in Data Science and Government at the University of Birmingham and researcher on the project, added: “Our project seeks to address the opening that the diverse nature of our population provides for hostile actors sharing misinformation. Working with our communities in Birmingham to create this AI model will provide a shield for digital users, helping them to identify when they are at risk of accepting suspicious or unfounded claims and encourage them to think more critically about what they are seeing or reading. The best defence against misinformation is an informed and engaged population.”


The research team will design and test games, messages, and peer-led discussions through workshops with neighbourhood organisations, faith groups, and other local information brokers. This will harness diversity as a protective asset, rather than something that can be exploited by misinformation peddlers.

The project aims to equip democratic societies to identify misinformation confidently, sustain democratic participation, and position the UK as a leader in trustworthy AI.

As well as Professor Lindstädt and Professor Jankin, the research team will comprise multidisciplinary researchers from the University of Birmingham, including Jennifer Cook, Professor of Cognitive Neuroscience; Professor Paolo Missier, Director of the Institute for Data and AI; and Matt Bennett, Professor of Social Policy.

Notes for editors

For more information, please contact Ellie Hail, Communications Officer, University of Birmingham at e.hail@bham.ac.uk or alternatively on +44 (0)7966 311 409. You can also contact the press office on +44 (0) 121 414 2772.

About the University of Birmingham

  • The University of Birmingham is ranked amongst the world’s top 100 institutions. Its work brings people from across the world to Birmingham, including researchers, educators and more than 40,000 students from over 150 countries.
  • England’s first civic university, the University of Birmingham, is proud to be rooted in one of the most dynamic and diverse cities in the country. A member of the Russell Group and a founding member of the Universitas 21 global network of research universities, the University of Birmingham has been changing the way the world works for more than a century.
  • The University of Birmingham is committed to achieving operational net zero carbon. It is seeking to change society and the environment positively, and to use its research and education to make a major global contribution to the UN Sustainable Development Goals. Find out more at birmingham.ac.uk/sustainability.