Kat, a researcher at the Centre for the Study of Global Ethics, discusses her work on a three-year EU-funded research project that aims to examine the impact of surveillance technologies on society.
My name’s Kat Hadjimatheou and I’m a researcher here at the Centre for the Study of Global Ethics. I’m working on a three-year research project funded by the European Commission, and its aim is to examine the impact on society of surveillance technologies that are used to fight serious crime. My part of the research will be looking at the ethical impact of such technologies. So for example, the possible invasion of privacy that such technologies present, and the possibility for what we call dual use of technologies – they might be designed for one context but end up being used in another. A technology might be designed for use in the military, and it ends up being used by police against organised crime.
Other kinds of issues that we examine include the ethics of exporting technologies. So the ethical issues that are raised when EU-funded research produces technologies that can be used by the police, by the military, by border guards patrolling borders. They can be used well, or they can be used in ways that violate people’s human rights. What happens when EU companies sell these kinds of technologies, which are heavily regulated in the EU, to, for example, Asian countries or countries in the Middle East? There are ethical issues that arise in relation to that.
Now the project has another, non-research element which is also quite interesting, and that is an advisory service – an ethical advisory service – and I’ll be working on that; we’re setting it up now. This will be an advisory service where we’ll have video-conferencing, telephone calls, and face-to-face meetings with people who produce technologies – so researchers, and business people who make surveillance technologies and then sell them. They will come to us, we hope, for ethical advice on how they can better design such technologies: what kinds of people, for example, they should be sharing their software with, what kinds of countries they want to be selling to, and how they might be able to adapt their technologies to better protect people’s privacy and minimise the risk of human rights violations.