Track, trace and contain – but don't keep our data: Ethical and legal worries of the NHSX App

On Thursday 7 May the UK Government began the roll out of its Covid-19 Contact Tracing App on the Isle of Wight. The stated aim of the App is to “minimise the spread of Covid-19 and move towards safely reducing lockdown measures.”

To make the App work, you simply download it and enter your postcode. The App operates in the background and identifies other App users in the near vicinity. It ‘remembers’ and logs each contact for a 28-day period. If you develop symptoms you enter the details into the App and these are recorded (and stored) in a central NHS England database. These details are analysed and, if COVID-19 is suspected, you will be required to self-isolate for 14 days and told who to contact to obtain a COVID-19 swab test. Anyone who has been in contact with you and is deemed to be “High Risk” will be notified via the App and also asked to self-isolate for a 14-day period. If you test positive, as well as continuing to self-isolate, you will be contacted by a team from Public Health England to identify other individuals who may have been infected. If the test is negative, those who have already been asked to self-isolate will be told they no longer need to.
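The logic described above can be sketched as a toy model. The class and function names below are ours, not NHSX’s, and the real App exchanges anonymised identifiers over Bluetooth and stores symptom reports centrally; this sketch only illustrates the 28-day contact window and the 14-day isolation rule:

```python
from datetime import date, timedelta

CONTACT_RETENTION_DAYS = 28  # contacts are 'remembered' for 28 days
ISOLATION_DAYS = 14          # self-isolation period after a suspected case

class ContactLog:
    """Toy model of the App's local contact log (illustrative only)."""

    def __init__(self):
        self._contacts = []  # list of (user_id, date_seen) pairs

    def record(self, other_user, seen_on):
        self._contacts.append((other_user, seen_on))

    def recent_contacts(self, today):
        """Return contacts seen within the 28-day retention window."""
        cutoff = today - timedelta(days=CONTACT_RETENTION_DAYS)
        return [user for user, seen in self._contacts if seen >= cutoff]

def notify_high_risk(log, today):
    """When symptoms are reported, everyone in the retention window is
    asked to self-isolate until 14 days from today (a negative test
    would release them early)."""
    isolate_until = today + timedelta(days=ISOLATION_DAYS)
    return {user: isolate_until for user in log.recent_contacts(today)}
```

A contact logged more than 28 days ago falls outside the window and is never notified; everyone else is given the same 14-day isolation end date.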

Any worries we have are supposedly allayed by reassurances about the ethical principles being followed. The creation of the App has been supported by an Ethics Advisory Board, which is providing “conditional support” based “on the information we have available to us at this point in time”. The Board has set out six key principles, which the government has this month accepted:

  1. Value (“sufficient value back to the citizen and society to justify its introduction and any adverse consequences to individuals”)
  2. Impact (it “will be an effective tool”)
  3. Security and Privacy (“Data sharing and storage should be secure. The data collected should be minimised and protected so users’ privacy is preserved.”)
  4. Accountability (“clear democratic accountability, particularly with regards to introducing new functionality, data collection or use cases”)
  5. Transparency (“details of what data is gathered and why, as well as the app’s code and underlying algorithms”)
  6. Control (“users should be able to see what data is held about them so they can understand how it is impacting on decisions”).

However, these principles are general and ambiguous, and the Government’s response to the Ethics Advisory Board to date provides little reassurance. There are major ethical and legal concerns, particularly around privacy, data use, and the fact that ‘volunteering’ to use the App is being treated almost as ‘consent’, even though there is no formal informed consent. At least three of the principles are intended to reassure us on these points: Security and Privacy, Transparency and Control. But while we are encouraged to sign up to the App to contain the virus, it appears that other uses are envisaged, if not already in train. It looks likely that the data will be held and repurposed for future research. For example, under the ‘Security and Privacy’ principle the government commits that:

  • “Agreement will be sought from people who are willing to donate their data for research
  • The data that is donated for research will only be made available to those who have been approved by the NHS”

Likewise, Matthew Gould (the Chief Executive of NHSX), in evidence to the Joint Committee on Human Rights Inquiry, said that the information could “be retained for research in the public interest or for use by the NHS for planning and delivering services”. As the Joint Committee noted, this opens the door to subsequent use not only by Government but also by the private sector.

That this is data collection going beyond virus containment is further implied by the use of the word ‘donate’. If you donate, you give up control. Donating your data is not like giving money – it is potentially identifying (especially given the postcode requirement). What will the subsequent use of the data be? That is a matter of personal control. Those using the App currently do not have clear information as to precisely what it will be used for, or indeed what harm might accrue in the future from its use. It can be argued that we are being induced to donate ‘to help the NHS’. Which of us doesn’t want to do that? But if what we are really doing is giving our personal data for unknown future research, by unknown researchers, and without very clear additional safeguards and ethical mechanisms, this is unethical.

If there are already plans to use the collected data for purposes other than virus tracing, then it is not enough to assure us that ‘it’s voluntary’. Voluntarily signing up to an App to track and trace the virus is not volunteering to have one’s data used for unknown future research. Researchers seeking to do ethical research usually rely on informed consent, and for consent to be valid it has to be informed. According to the Declaration of Helsinki, which governs ethical research, for consent to be informed the participant must be “informed of the aims, methods, sources of funding, any possible conflicts of interest, institutional affiliations of the researcher, the anticipated benefits and potential risks of the study and the discomfort it may entail, post-study provisions and any other relevant aspects of the study.” It is puzzling, even startling, that the word ‘consent’ does not appear once in the government’s response. The omission might be deliberate – clearly the standards for valid consent cannot be met – but even so, it is remarkable not to mention consent when research on data is discussed.

The Ethics Advisory Board does mention consent in its “Public Trust Matrix for use in considering ethical issues regarding the Contact Tracing App”. It says: “If individual subjects do not give explicit consent, what mechanisms are in place to ensure broader societal consent? How will consent be designed so that it's understandable? Current consent practice focuses on individual consent. What is being planned with the app is not only consent to potentially collect information about one’s own location, but one’s proximity to others too. After the initial emergency response, this will need collective consent mechanisms and a critical approach to how the design and content design of these consent moments are put together.”

Talk of “collective consent mechanisms” “after the initial emergency response” seems to suggest a movement from individual consent to some kind of broad societal consent. This is a very different model, and significant public debate would be needed to be sure that we, the public, are content with it. Consent is currently the cornerstone of clinical practice and research. It is intended to respect individual autonomy, dignity and human rights, even if it does not always achieve this in practice. Covid-19, as a public health emergency, might legitimise all kinds of actions which we wouldn’t normally accept, including limiting freedom of movement (requiring self-isolation) and some monitoring and reduction of privacy (the App). Exceptional times need exceptional measures. We all agree. But is ‘future research’ exceptional? If not, why assume that individual consent can effectively be bypassed? Why should “collective consent mechanisms” be used after the initial emergency at all? Should we not be asked for informed consent at the time, when we can know what our data is going to be used for?

More worrying, we are being presented with this as the only way, and we are told that the public health benefits outweigh any privacy concerns. But the truth is that to track, trace and contain the virus we do not need to collect any central data. In contrast, a number of European countries have plumped for a de-centralised App. These countries are not holding any central data; the information is stored only on the mobile phone of the user. To do the job we need, there is no need for a centralised app.

The Joint Committee has called for legislation to regulate the use of the NHS App and a formal human rights assessment, and we support this. However, such legislation also needs to engage with the whole question of consent. The precise basis on which such a potentially extensive and indeed commercially valuable dataset will be collected and used, in the short and the long term, needs to be made very clear from the outset, and the subsequent use itself needs to be carefully monitored.

Track, trace and contain. But if the government wants to hold our data, we need more clarity and more reassurance that our fundamental human rights will be respected and that any future research will be ethical. We are a long way short of being able to ensure that this is the case.

Jean McHale, Professor of Health Care Law, Birmingham Law School and Heather Widdows, John Ferguson Professor of Global Ethics, Department of Philosophy.