Decoding society with AI

The University of Birmingham is building the future of computational social science - answering the real questions of society.

AI is changing how we decipher society. Politics, identity, conflict, markets and culture now move through vast streams of text, images, audio and video. Speeches, social media posts, interviews, adverts, parliamentary debates, memes and platform traces have become part of the raw material of public life.

The question is no longer whether society leaves data behind. It does, constantly. The harder question is whether we can read that data responsibly, with the depth, caution and imagination social inquiry demands.

That is where computational social science comes in. It uses AI and data science not to replace human judgement, but to sharpen it. This specialism allows researchers to detect patterns too large for any single person to read, compare public debate across languages and platforms, and test claims about power, persuasion, participation and representation at scale.

Computational social science is no longer a niche method for counting words in political speeches. It is becoming one of the core ways we understand public life: how people encounter information, how institutions communicate, how conflicts escalate, how bias enters automated systems, and how democratic debate changes when AI becomes part of the conversation.

Drs Christian Arnold and Martin Wählisch - University of Birmingham

When the University of Birmingham hosted COMPTEXT 2026, it did more than bring an international research community to campus. The conference showed why Birmingham is becoming a natural home for this field. COMPTEXT is the world’s only truly interdisciplinary conference dedicated to computational text analysis across political science, media and communication studies, psychology, computer science, business, sociology and the humanities. It is where methods meet the real questions of society.

The field is moving fast: from large language models and multimodal data to platform regulation, misinformation, sustainability and high-performance computing. COMPTEXT created space for a deep dive on AI and sustainability, including the promise of AI tools for climate action and the uncomfortable paradox that large-scale models carry real environmental costs.

That conversation brought together perspectives from across Birmingham, including the Centre for Artificial Intelligence in Government (CAIG), the Institute for Data and AI (IDAI), and the Birmingham Institute for Sustainability and Climate Action (BISCA), showing how the University connects AI, public policy, sustainability and social science in one shared research agenda.

For Birmingham, that matters.

From innovation to responsibility

Computational social science is moving from methodological innovation to public responsibility. The challenge now is not only whether our models work, but whether they help us understand society with care, transparency and consequence.

Five frontier issues stood out from the COMPTEXT programme.

  • Large language models are transforming research, but they also force harder questions about validity. Sessions examined LLM safety, bias, reproducibility, political text annotation and model disagreement. The message was clear: AI can scale analysis, but scale without validation is just faster uncertainty.
  • The field is becoming multimodal. Text alone no longer captures political communication. Researchers are now analysing images, memes, video, audio and platform-native content from TikTok and YouTube. Birmingham’s programme reflected this shift through workshops and panels on multimodal data, vision-language models and video-to-text pipelines.
  • Computational social science needs better infrastructure. Workshops on Python, high-performance computing and workflow management showed that the next generation of research depends on reproducible pipelines, shared tools and access to computing power, not just clever models.
  • Data access and platform governance are now central research problems. The programme included work on the Digital Services Act, platform terms of use, Twitter/X developer policies and researcher access to online platforms. The politics of data access shapes what society can know about itself.
  • AI’s social value must be judged against its costs. COMPTEXT put sustainability on the agenda, including a roundtable on ‘AI for Sustainability: Promise, Paradox, and Responsibility’ and research on tracking the environmental costs of computational analysis. This is exactly the kind of uncomfortable conversation universities should host.
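The first point about validation can be made concrete. Before trusting LLM annotations at scale, researchers typically compare a sample against human coding. A minimal sketch, using entirely hypothetical labels, of two standard checks (percent agreement and Cohen's kappa, which corrects for chance agreement):

```python
# Hypothetical validation of LLM annotations against human-coded labels.
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa: agreement between two annotators, corrected for chance."""
    assert len(a) == len(b)
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    # Chance agreement: probability both annotators pick the same label at random,
    # given each annotator's own label distribution.
    expected = sum(ca[k] * cb[k] for k in set(a) | set(b)) / (n * n)
    return (observed - expected) / (1 - expected)

# Toy data: sentiment labels from a human coder and from an LLM (illustrative only).
human = ["pos", "neg", "neg", "pos", "neu", "pos", "neg", "neu"]
llm   = ["pos", "neg", "pos", "pos", "neu", "pos", "neg", "neg"]

agreement = sum(h == l for h, l in zip(human, llm)) / len(human)
kappa = cohens_kappa(human, llm)
print(f"agreement={agreement:.2f}, kappa={kappa:.2f}")  # → agreement=0.75, kappa=0.61
```

Raw agreement alone can flatter a model when one label dominates the data; kappa drops toward zero in that case, which is why both are usually reported.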

The University of Birmingham’s strength is that it connects the technical frontier with the public-interest questions behind it. Computational social science at Birmingham is about methods, but also democracy, conflict, sustainability and responsibility.

That combination is what makes Birmingham distinctive. The University is not treating AI as a detached technical revolution. Through communities spanning social science, humanities, computer science, business and policy, Birmingham is building the conditions for computational research that is rigorous, open and socially grounded.

Hosting COMPTEXT was a signal. Birmingham is not simply observing the future of computational social science. It is helping shape it.

Dr Christian Arnold and Dr Martin Wählisch are members of CAIG. Both are also Fellows of IDAI, and Dr Wählisch is also affiliated with BISCA. Their work reflects an effort to connect different scientific worlds, from AI and data science to governance, sustainability and social research.