In the 2008 Pixar film WALL-E, humans float on mechanised chairs, communicating with each other only through screens. The dystopian premise is that, after years of technological convenience, lifestyles have become so sedentary that human limbs have begun to waste away.

For Professor Sylvie Delacroix, an ethicist in the Law School at the University of Birmingham, the advance of technology does threaten a kind of atrophy. But rather than arms and legs, what concerns her is our ability to question the way things are and take proactive steps towards different futures, at a time when technology encourages us to adopt habits of passive acceptance.

Professor Delacroix has become a prominent critic of an individual-based approach to privacy in technology and an advocate for bottom-up collective methods of managing data. She is among the technologists, lawyers and philosophers taking an interdisciplinary approach to the analysis of data proliferation at the University of Birmingham’s Institute for Interdisciplinary Data Science and AI.

Beyond an individual right to privacy

Every day, internet users face tiny choices, such as whether to accept a website’s cookie policy, and more often than not they accept the default settings to save time. This, Professor Delacroix argues, creates an unconscious habit of passivity, in which the illusion of choice hides a reality of disempowerment.

“The technologies we're deploying are changing us in a way that is compromising our habit-dependent, pre-reflective intelligence,” she argues.

Professor Delacroix has been developing the idea of data trusts to avoid such a dynamic. Rather than expecting individuals to make a separate choice about how their data is handled in each instance, a data trust enables groups to collectively leverage the rights they have over their data: to obtain better terms and conditions from service providers, for example, or to negotiate (and monitor) the terms of data-sharing agreements. Trusts would dynamically update their policies, and individuals could switch between them as their preferences and aspirations change, rather than making manual changes one at a time.
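One way to picture the mechanism is as a sketch in code. The following is purely illustrative: the class names, policy fields and switching logic are assumptions made for the sake of the example, not a description of any real data-trust implementation.

```python
from dataclasses import dataclass, field

@dataclass
class DataTrust:
    """A hypothetical trust pooling members' data rights under one policy."""
    name: str
    policy: dict                     # e.g. {"allow_research_use": True}
    members: set = field(default_factory=set)

    def negotiate(self, requested_terms: dict) -> bool:
        """Accept a data-sharing request only if it matches the trust's policy."""
        return all(self.policy.get(term) == value
                   for term, value in requested_terms.items())

@dataclass
class Individual:
    name: str
    trust: DataTrust | None = None

    def join(self, new_trust: DataTrust) -> None:
        """Switch trusts in one step, instead of revisiting every cookie banner."""
        if self.trust:
            self.trust.members.discard(self.name)
        new_trust.members.add(self.name)
        self.trust = new_trust

# Two trusts with different stances; one switch updates all future decisions.
strict = DataTrust("privacy-first", {"allow_research_use": False})
open_research = DataTrust("research-friendly", {"allow_research_use": True})

alice = Individual("alice")
alice.join(strict)
print(strict.negotiate({"allow_research_use": True}))         # False
alice.join(open_research)
print(open_research.negotiate({"allow_research_use": True}))  # True
```

The point of the design is that the per-request decision moves from the individual to the trust's policy, so changing one's mind means changing trusts, not renegotiating every transaction.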

This approach fundamentally challenges the individual, contract-based understanding of privacy which, until now, has been transferred wholesale from the analogue world to the digital one with no assessment of its suitability. Technology platforms explicitly ask for consent, which is granted by a click just as it would be by signing a traditional contract. But what if it is impossible for an individual to understand the contract they are signing up to?

“I think too much is assumed on the part of individuals when it comes to choices about data,” Professor Delacroix says. “We wouldn't expect people to make informed decisions about healthcare without a doctor and increasingly we need the equivalent for complex data transactions: trusted professionals intermediating between groups of people and platforms.”

The idea of data trusts opens the potential for collective agency over data. For example, the residents of a street or the members of a political community might allow their data to be pooled and made available to improve service provision within their community, but withhold it from other organisations.

It also creates the possibility of moving beyond the ‘right to know’ approach to privacy enshrined in rules such as the European Union’s General Data Protection Regulation (GDPR). The GDPR requires companies to explain what data they hold and why, but it does not expose to public oversight the algorithms they use to analyse that data, even though those algorithms might entrench biases.

While it is unrealistic for an individual to scrutinise, say, the artificial intelligence techniques of every company that sees their data, a data trust, staffed by experts and empowered by the affiliation of its members, might set out to do so. Such bottom-up movements would facilitate a shift towards what Professor Delacroix calls ‘ensemble contestability’: instead of receiving an explanation tailored to their individual query, end users would be put in a position to debate and compare the outputs of differently trained machine learning algorithms, and to argue for a specific approach. This would “expose designers of algorithms to a constant feedback loop where end users and the community around them could ask questions that have a collective, longer-term impact,” she says.

“We haven't yet woken up to the fact that our legal instruments, as they exist, are not equipped to tackle the collective dimension of data,” she adds.
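Read in code, ensemble contestability might look something like the sketch below. This, too, is purely illustrative: the two scoring functions stand in for differently trained machine learning models, and every name and weight is invented for the example.

```python
# Two "differently trained" models scoring the same cases, with their outputs
# exposed side by side so a community can compare and contest them.

def model_a(applicant: dict) -> float:
    """Weights income heavily (a stand-in for one set of training choices)."""
    return 0.7 * applicant["income"] / 100_000 + 0.3 * applicant["years_employed"] / 40

def model_b(applicant: dict) -> float:
    """Weights employment history heavily (a different set of training choices)."""
    return 0.3 * applicant["income"] / 100_000 + 0.7 * applicant["years_employed"] / 40

def contest(applicants: list[dict]) -> None:
    """Print each model's score per applicant, flagging where they disagree."""
    for a in applicants:
        score_a, score_b = model_a(a), model_b(a)
        flag = "  <-- models disagree" if (score_a >= 0.5) != (score_b >= 0.5) else ""
        print(f"{a['name']}: model_a={score_a:.2f}, model_b={score_b:.2f}{flag}")

contest([
    {"name": "high income, short tenure", "income": 90_000, "years_employed": 1},
    {"name": "modest income, long tenure", "income": 30_000, "years_employed": 30},
])
```

Exposing both scores, rather than a single opaque decision, is what opens the outputs to the kind of collective questioning Professor Delacroix describes.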

‘Digital enchantment’

Professor Karen Yeung, a member of both Birmingham’s Law School and the Institute for Interdisciplinary Data Science and AI, shares Professor Delacroix’s concern about the adverse implications of ever-more sophisticated data-driven technologies, including the impacts of pervasive personal data-harvesting.

“Getting people to care about their data is definitely part of the piece, but these questions will always get de-prioritised given the more urgent and immediate needs that people have,” she says.

Professor Yeung sees a need for governments to intervene more robustly to control how technology companies and other organisations collect and use data. Currently, she says, governments tend to overestimate the benefits of technological innovation while overlooking its unintended adverse effects.

She coined the term ‘digital enchantment’ for the belief, shared by industry, policy-makers and elected officials alike, that technology will effectively solve problems and that side effects, such as privacy losses, will be either trivial or readily manageable.

“Technology is seen as cool and sexy, so there is a pressure for organisations to be seen to be embracing the latest cutting-edge thing,” Professor Yeung says. “There is a kind of FOMO [a fear of missing out] dynamic, which makes it very hard for policy-makers and governmental actors to be appropriately sceptical.”

Professor Yeung’s research has scrutinised the claims of technology companies to solve specific social problems. For example, she has analysed the growing use of facial recognition technology (FRT) by London’s Metropolitan Police and found limited evidence that it actually delivers the significant benefits claimed by its advocates.

“FRT is marketed as a tool through which we will find missing children and catch known terrorists, but there’s no evidence that happens,” she says. “Governments need to really tone down the promise of new technologies instead of selling them to the public on the basis that they will fix all our problems.”

Interdisciplinary and independent

Thinking through the implications of technology for public and private life requires the kind of rigorous, interdisciplinary thinking that a collaborative organisation like the Institute for Interdisciplinary Data Science and AI was set up to foster.

“Too few philosophers actually bother to make the link between theory and practice and think about the design implications of what they say for everyday life,” says Professor Delacroix, who has recently published work in technology journals, as well as the law and governance journals that are the staple of her discipline.

Holding industry to account also requires a degree of financial independence from the large tech companies that increasingly fund research. So far, the Institute and its academics have been able to attract funding from philanthropic organisations such as the Mozilla Foundation, the McGovern Foundation and the Wellcome Trust.

“Academia is one of the few places where there is still an opportunity, if you are properly and independently funded, to speak truth to power, and I think that's absolutely vital,” says Professor Yeung.

Research carried out by members of the Institute is now beginning to directly impact policy and lived experience in the UK. Both Professor Yeung and Professor Delacroix testified as expert witnesses to an inquiry by the UK’s upper legislative chamber, the House of Lords, into police use of technology, for example, while the data trusts initiative has attracted interest and direct engagement from senior levels of government in the UK and abroad.
