The use of contact-tracing apps on smartphones is a well-established part of the way many other countries are trying to control the spread of Covid-19 infection and make public spaces safe again. In China, for example, every public space, from shops and factories to parks and even taxis, must display a unique QR code (a type of matrix barcode), which members of the public must scan into their smartphones.

Their ID and infection status are then checked and either approved or rejected, while their location is also tracked and recorded. If they have been exposed to the virus, their potentially infected status is recorded on their device. They must then quarantine for 14 days, and only once that has been logged on the system will their status change from red (positive or at risk) or amber (uncertain as to risk) to green (clear).
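The red/amber/green rule amounts to a simple state check. The sketch below (in Python) is purely a hypothetical illustration of the logic as described above, not the actual Chinese system; all names, and the way quarantine days are counted, are assumptions.

    from enum import Enum
    from typing import Optional

    QUARANTINE_DAYS = 14  # quarantine period described above

    class Status(Enum):
        RED = "positive or at risk"
        AMBER = "risk uncertain"
        GREEN = "clear"

    def health_code_status(tested_positive: bool,
                           exposed: Optional[bool],  # None: exposure unknown
                           days_quarantined: int) -> Status:
        """Hypothetical traffic-light rule: red while positive or exposed,
        green only once the 14-day quarantine has been logged."""
        if tested_positive:
            return Status.RED
        if exposed is None:
            return Status.AMBER  # uncertain as to risk
        if exposed:
            return (Status.GREEN if days_quarantined >= QUARANTINE_DAYS
                    else Status.RED)
        return Status.GREEN

    # Example: an exposed user clears only after the full quarantine is logged.
    assert health_code_status(False, True, 10) is Status.RED
    assert health_code_status(False, True, 14) is Status.GREEN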

A similar system is in operation in Moscow and other parts of Russia, where you cannot even ride a bike around the city without your own digital permit code. But while China and Russia might be considered natural candidates for surveillance states, they are by no means the only ones to impose intrusive contact-tracing regimes. Elsewhere, however, the level of compulsion may be less authoritarian.

In South Korea, where credit card activity and mobile phone data are also used, contact-tracing apps achieved widespread acceptance only on the basis of assurances that personal details would not be shared or disclosed. Those assurances were needed after alerts were sent out identifying gay clubs and nightlife hotspots as posing higher risks of transmission, which provoked a homophobic backlash and a drop in voluntary testing.

In Singapore, where the government launched its contact-tracing app TraceTogether in March, take-up has not been sufficient to enable the system to work as well as it should. A new digital check-in system called SafeEntry has therefore been introduced to manage public locations such as schools, shops and hotels, where individuals are likely to be in close contact for prolonged periods.

European countries have generally been more conscious of the privacy implications, which is not surprising given the region’s more stringent data protection ethos. This has affected, in particular, their decision whether to adopt a ‘centralised’ or ‘decentralised’ approach to data gathering.

Centralised or decentralised?

There is a helpful explanation of the difference between the centralised and decentralised architectures on the UK Human Rights Blog (What are the data privacy considerations of Contact Tracing Apps? by Rafe Jennings):

“On a decentralised architecture, after a positive diagnosis, one’s personal identifier is uploaded to a server which then broadcasts the identifier to all other phones running the app. One’s proximity contacts are recorded on one’s phone; if there is a match between a proximity contact and an identifier received by a phone, the user is alerted to the possibility that they may have contracted coronavirus. The central server therefore does not contain information regarding who may have contracted coronavirus from the matches. On a centralised model, one’s proximity contacts are uploaded to a central server, where the matches are made and then sent to the relevant phones.”
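In code terms, the decentralised match might look something like the sketch below (in Python). This is a hypothetical illustration of the logic in the quoted passage, not the code of any actual app; the identifier format and function names are assumptions.

    from typing import Iterable, Set

    def check_exposure_on_device(local_contacts: Set[str],
                                 infected_ids: Iterable[str]) -> bool:
        """Decentralised model: the server only broadcasts the identifiers of
        users who reported a positive diagnosis; the matching happens on the
        handset, so the server never learns who was near whom."""
        return any(identifier in local_contacts for identifier in infected_ids)

    # Usage: each phone keeps its own log of identifiers seen over Bluetooth.
    seen_nearby = {"id-4f2a", "id-9c1b"}    # recorded locally on the handset
    broadcast = ["id-0d7e", "id-9c1b"]      # published after positive diagnoses
    if check_exposure_on_device(seen_nearby, broadcast):
        print("Alert: you may have been exposed")  # shown only to this user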

The advantage of the centralised approach is that it enables the government to build up a complete picture of the spread of the virus nationwide, but the downside is that it opens a back door into a potentially very intrusive surveillance regime, which could easily be abused with oppressive consequences for personal freedom of movement and association.
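By contrast, the centralised model requires the server to hold everyone's proximity contacts in order to do the matching, which is precisely the who-met-whom picture that raises the surveillance worry. A minimal sketch, again with all names assumed for illustration:

    from collections import defaultdict
    from typing import Dict, Set

    class CentralServer:
        """Centralised model: phones upload their proximity contacts and the
        server does the matching, accumulating a contact graph as it goes."""

        def __init__(self) -> None:
            self.contact_graph: Dict[str, Set[str]] = defaultdict(set)

        def upload_contacts(self, user_id: str, contacts: Set[str]) -> None:
            # This stored graph of social interactions is the privacy concern.
            self.contact_graph[user_id] |= contacts

        def users_to_alert(self, infected_id: str) -> Set[str]:
            """Everyone recorded as having been near the infected user."""
            at_risk = set(self.contact_graph.get(infected_id, set()))
            at_risk |= {uid for uid, seen in self.contact_graph.items()
                        if infected_id in seen}
            return at_risk

    # Usage: the server, not the handset, decides who gets an exposure alert,
    # which is also what lets it build the nationwide picture.
    server = CentralServer()
    server.upload_contacts("id-4f2a", {"id-9c1b", "id-0d7e"})
    print(server.users_to_alert("id-9c1b"))  # {'id-4f2a'}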

The NHS Covid-19 app developed in the UK by NHSX (the body responsible for setting NHS data usage policy and best practice) uses the centralised model, despite the concerns raised about privacy and performance. That initial version was rolled out for trialling on the Isle of Wight earlier this month, with Health Secretary Matt Hancock declaring grandiosely that ‘Where the Isle of Wight leads, Britain follows’.

According to a recently published survey by law firm Norton Rose Fulbright, France, Poland and Italy also back a centralised system for their contact-tracing apps. But other European nations including Switzerland, Austria and Estonia, which are using a solution developed by a Swiss team, as well as Canada, Hong Kong and South Africa, all favour the decentralised approach.

Although Germany initially adopted a centralised approach, it has now switched to the decentralised model. It appears that this change was influenced more by technical considerations than privacy concerns: the decentralised protocol is favoured by Apple and Google, and designed to work much better on their devices. The centralised version would not work well on Apple devices because users would need to leave their devices unlocked and the app running in the foreground for Bluetooth exchanges to happen, which would seriously drain the battery.

GDPR and ECHR

The Joint Committee on Human Rights, in its report to Parliament on Digital Contact Tracing, points out that while the government has a duty to protect life under Article 2 of the Human Rights Convention, ‘any such app will have an impact on the right to private and family life, protected under Article 8 of the ECHR.’ Moreover, the collection of health data via such an app, even if partially anonymised, would also engage the General Data Protection Regulation, as the information law blog Hawktalk explained in COVID-19: how the GDPR applies to trace and track.

The Open Society Foundation has published a detailed legal opinion on this and other tech responses to Covid-19, drafted by Matthew Ryder QC and Edward Craven of Matrix Chambers, along with Ravi Naik, solicitor and legal director of AWO, a new data rights agency, and Gayatri Sarathy of Blackstone Chambers. They conclude that both centralised and decentralised app systems

“would engage the right to respect for private life under Article 8 of the European Convention on Human Rights (‘ECHR’). Any interference with Article 8 would have to be in accordance with the law, proportionate and necessary. We consider that the decentralised systems … are likely to be in accordance with the law, proportionate and necessary. In contrast, a centralised system would result in a significantly greater interference with users’ privacy and require greater justification.”

As Paul Bernal, lecturer in IT law at the University of East Anglia, points out: ‘the contact tracing app hits at three of the most important parts of our privacy: our health, our location, and our social interactions.’ (See his blog post Contact tracing, privacy, magical thinking – and trust!) Though it may help prevent the spread of infection, there is huge potential for misuse:

‘The Stasi’s files were full of details of who had met whom and when, and for how long – this is precisely the kind of data that a contact tracing system has the potential to gather.’

He adds that ‘There appears to have been no Data Protection Impact Assessment done in advance of the pilot – which is almost certainly in breach of the GDPR.’

Trust

We have seen in other countries how, in the absence of authoritarian compulsion, any substantial public health benefit of contact-tracing apps depends on their uptake among a willing population. Both Singapore and South Korea have battled with a lack of public support. It’s a problem the NHS has faced before, says Bernal: its ‘care.data’ scheme ‘collapsed for similar reasons – it wanted to suck up data from GP practices into a great big central database, but didn’t get either the legal or the practical consent from enough people to make it work.’

The report by Ryder and others is also clear on this: ‘Government consultation and cooperation with a wide range of experts will be important to address concerns and ensure that any system has public trust.’ Relevant expertise includes not just technical capability but also law and human rights.

The debate over centralised versus decentralised architecture may already have harmed public trust. The fact that NHSX is now talking about developing a second version of its app, using the decentralised system, either in parallel with its earlier centralised one or to replace it (in what developers call the ‘fail fast and fix’ approach), may be all very well from an agile development perspective, but in PR terms it smacks of yet more government dithering and U-turning.

Moreover, any suggestion that data may be manipulated for political purposes, in the same way as the numbers of deaths and tests appear to have been routinely manipulated, will scupper the chances of people engaging with the app. A more structural problem is that any such app is not a complete solution: it has to work as part of a regime of effective and widespread testing; otherwise it depends on most of its information coming from people self-diagnosing and self-reporting, which is open to abuse as well as human error.

In Ireland a group of civil society organisations, scientists and academics have written an open letter, published by the Irish Council for Civil Liberties, urging the Health Service Executive (HSE) and the Department of Health, when developing their own app, to embrace transparency and promote trust, to design for privacy and data protection, to limit the purpose to contact tracing (and not location tracking), and to address all such concerns before any launch.

Similar concerns have been expressed by bodies over here. The Joint Committee on Human Rights in its report (above) observed that

‘The implications of such an app are so widespread, significant, and, as yet, subject to limited public examination, that they should be subject to the in-depth scrutiny of Parliament at the earliest opportunity … The implementation and oversight of this app must, in our view, be urgently placed on a legislative footing …’

The Biometrics Commissioner, in a recent statement arguing that such oversight should be put on a similar basis to the use of biometrics for police work, noted that a group of university lawyers (led by Professor Lilian Edwards at the University of Newcastle) has produced a suggested Coronavirus (Safeguards) Bill that might meet the case.

Given the privacy and data protection concerns, the need for careful scrutiny is clear. But it’s not just about compliance: bad judgment can be as harmful as bad faith. The app will not work unless there is enough public trust in its safety and efficacy for people to use it, and that in turn depends on its having a technical architecture that actually works, and on supporting factors such as an effective testing regime.

Paul Magrath is head of product development at the Incorporated Council of Law Reporting for England and Wales (ICLR) and a member of the Transparency Project.