Data: The key role in fighting against the Coronavirus pandemic (Opportunities and risks of the contact tracing Apps)

By Jutta Sonja Oberlin

Recently, developers from all over the world including Google and Apple1 have been working on pseudonymous contact tracing apps.

These so-called Corona Apps could play a vital role in the fight against the virus, but they also raise serious privacy and data protection concerns. While these apps are supposed to help limit the further spread of COVID-19, they might also expose sensitive data belonging to the affected data subjects. This may include health data2 or, in some cases, even the location data of everyone using the app. Some apps collect real-time data on the actual location and movements of their users to warn people if they have been in contact with or near an infected person. This also helps the government3 to understand the spread of the virus, to design appropriate measures, and to take action accordingly.

To avoid non-compliance or regulatory confusion, on April 8th the European Commission adopted recommendations to support Coronavirus containment measures through mobile data and apps. These recommendations set out key principles concerning data security and EU fundamental rights, such as privacy and data protection.4

In general, the processing of special categories of data, such as health data, is prohibited unless a special provision (Art. 9 (2) a-j GDPR) applies. In the case of the fight against COVID-19, the legal ground for data processing is the public interest in the area of public health. In this case, Union or member state law is supposed to provide suitable and specific measures to safeguard the rights and freedoms of the data subject.5 On April 17th 2020, the European Commission released its communication regarding guidance for apps supporting the fight against the COVID-19 pandemic in relation to data protection.6

Risk Assessments under the GDPR

When developing such an app, it is important to understand the obligations prescribed by law and to be aware that the GDPR is one of the strongest privacy laws in the world – the fight against COVID-19 did not change that at all!

The legal requirements must not be ignored, and data protection should be implemented from the very beginning of the app's development. Privacy by design is the most crucial element when it comes to evaluating risk and finding the appropriate technical and organizational measures7 to comply with the GDPR.

While evaluating the risk, it is important to work closely with data protection authorities and to coordinate constantly with public health authorities.8

Thinking through the processing activity of a contact tracing app, the following measures seem important when it comes to GDPR compliance and to ensuring a trustworthy and accountable use of the app:

  • The app should only be installed voluntarily

  • Transparency should be guaranteed (in all stages: infected / not infected)9

  • The data subject should remain in control of his/her personal data

  • No tracking of the data subject’s location – no use of GPS

  • Data should only be stored on the data subject’s device

  • Data should not be stored in a centralized database

  • In case data needs to be shared with the public health authorities, only transfer it once the infection has been diagnosed by a doctor10, to avoid false information; also be transparent about the transfer of data to third parties such as public health authorities

  • The app should be designed in such a manner that the national health authorities are the controllers11 (always be aware of the possibility of a joint controllership)

  • Delete data when it is no longer necessary for the purpose for which it was collected12. This means a sunset clause should be included to prevent further processing activities post COVID-19

  • The controller must ensure the rights of the data subject13

  • The controller must take appropriate technical and organizational measures, such as the storage of data in an encrypted form using state-of-the-art cryptographic techniques, logged access to the data when it is stored on a central server, or the possibility to activate Bluetooth without activating other location services14

  • Ensure the accuracy of the data with appropriate technologies, allowing a very precise assessment of the contact15

The usage of location data

It is important to understand that the functionality of the contact tracing app does not require the use of location data. This means that the developer does not need to follow the movements of the individual to be able to trace the interactions of a person with others.

Using location data would be a total game changer, not only with regard to the main principles such as data minimization16, fairness17 and purpose limitation18, but also when it comes to the major privacy and data protection implications. The Commission advises against using location data in this context.19

Data minimization in the context of contact tracing apps also means that the exact time and location of the user should never be stored, not even on their own device. Since the app works in real time, the technology can trace and warn the user at the time of contact, which should be enough to fulfill the purpose of the app.

Decentralized Privacy-Preserving Proximity Tracing

DP-3T (Decentralized Privacy-Preserving Proximity Tracing), a decentralized, Bluetooth Low Energy-based contact tracing app developed by ETH Zurich and EPFL Lausanne, went live on May 13th 2020 in a testing phase20. The app employs new Bluetooth technology which connects smartphones with each other anonymously and only stores personal data on the device of the data subject. The aim of the app is, first, to provide a technological foundation to help slow down the spread of COVID-19 and, second, to minimize privacy and security risks for the data subjects and to ensure the highest level of data protection.21
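The decentralized idea can be illustrated with a simplified sketch. The following is my own illustration of the general approach (rotating, unlinkable broadcast identifiers derived from a device-held secret), not the actual DP-3T specification; the key names, seed, and number of identifiers per day are assumptions:

```python
import hashlib
import hmac

# Simplified sketch of a decentralized proximity-tracing scheme:
# each day the phone ratchets its secret key forward by hashing it,
# then derives short-lived broadcast identifiers from that daily key.
# Only the holder of the daily key can link the ephemeral IDs together;
# observers see only seemingly random 16-byte values.

def next_day_key(day_key: bytes) -> bytes:
    """Ratchet the daily secret forward: SK_t = SHA256(SK_{t-1})."""
    return hashlib.sha256(day_key).digest()

def ephemeral_ids(day_key: bytes, per_day: int = 96) -> list[bytes]:
    """Derive the day's rotating broadcast identifiers (16 bytes each)."""
    return [
        hmac.new(day_key, f"ephid-{i}".encode(), hashlib.sha256).digest()[:16]
        for i in range(per_day)
    ]

sk0 = hashlib.sha256(b"initial-device-secret").digest()  # hypothetical seed
sk1 = next_day_key(sk0)
ids_day1 = ephemeral_ids(sk1)  # broadcast these over Bluetooth, one at a time
```

Because the secret never leaves the device, a diagnosed user can publish only the relevant daily keys, letting other phones check locally whether any stored ephemeral ID matches – which is what keeps the matching decentralized.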

In general, it’s really important to put safeguards in place against misuse. “There’s no guarantee that the system won’t be hacked. What’s more, there will be some people trying to do just that, because we know that health-related data is worth its weight in gold.”22

This means, for example, hackers trying to cross-check information to break the “anonymization”. The main security and privacy risks are:

  • Increased attack surface of the data subject’s device via app

  • The communication between the app and the backend servers could be hacked

  • Possible attack on Bluetooth (e.g. CVE-2020-0022)23

  • De-anonymization of the data subject via Bluetooth

  • False information24: The 21-year-old wireless technology Bluetooth was originally not designed to determine the exact distance between two devices25 and thus the potential exposure to a COVID-19 patient. One device connects with another using electromagnetic waves, sending short, repeated radio messages announcing its existence so that devices can pair. Under ideal conditions, this signal is reliably received; in practice, however, the radio waves may be absorbed or reflected by environmental objects such as walls, cars or trees, which affects the signal strength26 and, in the end, makes the data inaccurate. A device two meters away could thus appear to be 20 meters away.27 Google and Apple use RSSI28 for their contact tracing app to determine the distance between people, which could result in individual false alerts.
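Why RSSI-based distance estimates are unreliable can be shown with the log-distance path-loss model commonly used for such estimates. This is an illustrative sketch only; the reference RSSI at one meter and the path-loss exponents below are assumptions, not the parameters Google and Apple actually use:

```python
# Log-distance path-loss model: RSSI(d) = RSSI(1m) - 10 * n * log10(d),
# where n is an environment-dependent path-loss exponent
# (roughly 2 in free space, higher indoors due to walls and furniture).

def estimate_distance(rssi: float, rssi_at_1m: float = -60.0, n: float = 2.0) -> float:
    """Invert the path-loss model to estimate the distance in meters."""
    return 10 ** ((rssi_at_1m - rssi) / (10 * n))

# The same measured RSSI yields very different distances depending on the
# assumed environment -- one reason contact alerts can be false:
free_space = estimate_distance(-75.0, n=2.0)   # open air: ~5.6 m
indoors    = estimate_distance(-75.0, n=3.5)   # obstructed: ~2.7 m
```

A reading of -75 dBm could thus mean either a safe distance or a relevant contact, depending purely on the surroundings, which the receiving phone cannot observe.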

A good example of a possible breach: when a user declares themselves infected, this information is not only stored on the user’s device but also sent to a centralized backend server, meaning the server communicates with the app to update its database. This database is indispensable for the main purpose of the app: facilitating digital contact tracing of infected users. Hackers might try to read the communication between the app and the backend server.

Anonymization vs. pseudonymization

Speaking of “anonymized” data in this context is not the correct terminology under the GDPR.

Anonymization and pseudonymization sound similar, but they are two distinct techniques that allow the controller/processor to de-identify data. The key question is whether the data can be re-identified or not.29 Recital 26 of the GDPR defines anonymized data as “data rendered anonymous in such a way that the data subject is not or no longer identifiable”. The bar for data anonymization is set very high.30 It all depends on the possibility of tracing personal data back to the data subject.

If the apps used a “true” anonymization technique, the developers would not need to worry about the GDPR, since the GDPR is not applicable to anonymized data. But it seems that it is in fact possible to trace the data back to the user – developers worry about hackers being able to do exactly that, even if the data is stored in a decentralized manner. Due to this possibility, it might be better to speak of encrypted data so as not to confuse the users about the type of de-identification. As a matter of transparency, users should also be informed about the possibility of their personal information being traced back to them.
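The distinction can be made concrete with a small sketch. This is my own illustration (the key and user identifier are hypothetical), not a technique from any of the apps discussed: a keyed hash pseudonymizes an identifier, but whoever holds the key can always regenerate – and thus re-link – the token, which is exactly why the GDPR still applies to pseudonymous data:

```python
import hashlib
import hmac

# Hypothetical key held by the controller -- anyone holding it can
# re-link tokens to users, so this is pseudonymization, NOT anonymization.
SECRET_KEY = b"held-by-the-controller"

def pseudonymize(user_id: str) -> str:
    """Keyed hash: a stable token that the key holder can re-derive."""
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()[:16]

token = pseudonymize("user-42")
# Re-deriving the same token from the raw ID proves the data is
# re-identifiable by the key holder -- hence pseudonymous, not anonymous.
assert pseudonymize("user-42") == token
```

True anonymization, by contrast, would require that no party – including the controller – can reverse the transformation, a bar that contact tracing apps with notification duties cannot meet.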

Also, COVID-19 is a notifiable disease, which means that if you have declared yourself infected via the app, public health authorities must be notified.31 This makes it clear why this app cannot be anonymous.


It’s about time for the legislator to step in. The safeguards against misuse should be written down in a law to establish a proper legal basis: “The drafting of a law will make it possible to have a public debate on the role of digital technology in finding solutions to concrete problems.”32 This, of course, will not end the concerns about such apps, but providing a proper legal framework is key to working towards compliance and to ensuring the rights and freedoms of the data subjects.

Jutta Sonja Oberlin is a Manager at PwC Zurich in the area of Regulatory Compliance and Cyber Security, IAPP Young Privacy Professional Lead Switzerland, and studied law in Switzerland.


2 Art. 9 GDPR.

3 E.g. Singapore deployed and open-sourced a contact tracing app based on Bluetooth technology.


5 Art. 9 (2) i GDPR.


7 Art. 32 GDPR.

8 E.g. DP-3T is launched by ETH, EPFL and the Swiss Federal Office of Public Health.

9 Art. 12 and 13 GDPR.

10 Difficult for some countries since tests are rare and only available for people at risk, e.g. in the UK.

11 See also recital 45 of the GDPR.

12 Art. 17 (1) a GDPR.

13 Chapter III of the GDPR.

16 Art. 5 (1) c GDPR.

17 Art. 5 (1) a GDPR.

18 Art. 5 (1) b GDPR.



22 Solange Ghernaouti, professor at the University of Lausanne.


24 Art. 5 (1) d GDPR: Personal data shall be accurate and, where necessary, kept up to date…

25 Marek Bialoglowy, Introduction to Bluetooth: “…universal short-range, low-power wireless connectivity as a way of eliminating cables between mobile phones and computers, headsets and other devices.”

26 Suresh Kumar, Swarun, Ph.D. Thesis, Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2016.

27 Suresh Kumar, Swarun, Ph.D. Thesis, Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2016.

28 Received Signal Strength Indication.


32 Solange Ghernaouti, professor at the University of Lausanne.
