Communications of the ACM

Privacy

Digital Contact Tracing May Protect Privacy, But It Is Unlikely to Stop the Pandemic


[Image: smartwatch displays COVID-19 update. Credit: Kevin R. Yin / Shutterstock]

It is difficult to imagine a timelier topic for this inaugural Communications Privacy column than the privacy issues associated with COVID-19 apps. Against the backdrop of protests around the world opposing racism and police killings of Black people, we have a newly found understanding of the need for protection from surveillance, while also feeling the urgency of shutting down the spread of a deadly virus. While many computer scientists are looking to technology for privacy-protective ways to track COVID-19 exposure, privacy-enhancing technologies (PETs) may prove ineffective without more widely available COVID-19 tests, human-centered design, and complementary laws and policies.

As the COVID-19 pandemic spread in spring 2020, researchers and public health officials pursued digital contact tracing and exposure notification tools to assist human contact tracers. Initial efforts to build these tools focused on utility but were quickly met with questions about privacy. Although there is compelling public interest in sharing data to reduce virus spread, concerns arose that this data might be used for other purposes. Indeed, as protest marches became commonplace and police sought out instigators, rumors spread that police might be using data collected by contact-tracing apps. While I have seen no evidence that this actually occurred, the concern is legitimate and may slow app adoption. In the U.S., people of color have been disproportionately hard hit by COVID-19 but may also have the most to fear in using these apps.

Digital contact tracing and exposure notification might be ideal applications for PETs. An app that could notify users that they had been exposed to COVID-19 without leaking locations or personal information could simultaneously protect both public health and privacy. However, PETs alone may not solve this problem.

Digital Contact-Tracing Technology

Contact tracing and exposure notification apps run on mobile phones and log either the phone's location or the presence of other nearby phones. The location approach involves sending location information to a centralized server so that users who were recently co-located with a user who tested positive for COVID-19 can be identified. Alternatively, an infected user's locations could be broadcast to other users so that their apps can check for co-location. The proximity approach does not require storing location and instead logs an identifier associated with each phone the app detects as being nearby for some period of time (for example, 10 minutes). If a user tests positive, their identifier is broadcast, and apps can check to see whether the infected person's identifier is in their proximity logs. The proximity approach is more privacy protective as it determines only that two users were near each other, not all the places they were located.
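The decentralized proximity approach described above can be sketched in a few lines. This is an illustration only, not any real app's protocol: each phone records the identifiers it hears over Bluetooth, and matching against an infected user's published identifiers happens entirely on the device. The class name, method names, and the 10-minute threshold are all illustrative.

```python
CONTACT_WINDOW = 10 * 60  # seconds a device must stay nearby to count (illustrative)

class ProximityLog:
    """On-device log of nearby identifiers (hypothetical sketch)."""

    def __init__(self):
        # identifier -> (first_seen, last_seen) timestamps in seconds
        self._sightings = {}

    def record(self, identifier, timestamp):
        # Extend the sighting window for an identifier we just heard.
        first, _ = self._sightings.get(identifier, (timestamp, timestamp))
        self._sightings[identifier] = (first, timestamp)

    def exposures(self, published_ids):
        """Return published identifiers this device heard long enough
        to count as an exposure. Runs locally; no log leaves the phone."""
        hits = []
        for ident in published_ids:
            span = self._sightings.get(ident)
            if span and span[1] - span[0] >= CONTACT_WINDOW:
                hits.append(ident)
        return hits
```

Note that the server (or broadcast channel) only ever sees the identifiers of users who chose to report a positive test; everyone else's log stays on their own device.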

Research teams around the world have been working on privacy protective protocols for contact tracing and exposure notification. These include the Private Automated Contact Tracing (PACT)a group led by Massachusetts Institute of Technology researchers and the European Decentralized Privacy-Preserving Proximity Tracing (DP-3T) consortium.b

Google and Apple jointly developed a Bluetooth-based "Exposure Notification" API for Android and iOS platforms that public health agencies can incorporate into contact-tracing apps. It uses a decentralized and privacy-protective approach, which includes cryptographically generated rotating identifiers that make it more difficult (but not impossible1) to trace an identifier back to an individual.
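The actual Exposure Notification key schedule is defined in Apple and Google's published cryptography specification; the following is only a simplified sketch of the underlying idea. Short-lived identifiers are derived from a per-day key, so an observer cannot link successive broadcasts, yet publishing the day key (after a positive test) lets other devices re-derive and match the identifiers. The function names, HMAC construction, and interval count here are illustrative, not the real specification.

```python
import hmac
import hashlib
import os

def daily_key():
    # A fresh random key generated on the device each day (illustrative).
    return os.urandom(16)

def rolling_id(key, interval):
    # Derive a short identifier for one time window of the day (e.g., one
    # of 144 ten-minute intervals). Without the key, successive identifiers
    # are computationally unlinkable.
    mac = hmac.new(key, interval.to_bytes(4, "big"), hashlib.sha256)
    return mac.digest()[:16]

def matches(published_key, heard_ids, intervals_per_day=144):
    # After a positive user publishes their daily key, other devices
    # re-derive every interval's identifier and intersect with their logs.
    derived = {rolling_id(published_key, i) for i in range(intervals_per_day)}
    return derived & set(heard_ids)
```

The rotation is what makes tracing an identifier back to a person harder: a passive listener sees a new random-looking value every interval, and linkage becomes possible only for users who deliberately publish their keys.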

Exposure Notification has not yet been built into many apps, although new apps that use this API are expected to launch throughout fall 2020. Many apps seem to be using their own approaches, and privacy and security issues are common. For example, concerns have been raised about apps that may leak sensitive information through security holes. A mid-June report3 assessed mobile-phone-based contact-tracing apps from government entities around the world and found most were susceptible to being tampered with to allow attackers access to sensitive data.

Trade-Offs

Some apps use good technical approaches to limit data leakage, but in doing so, they may limit their utility. I can imagine an app informing me that sometime in the past week, or more precisely last Tuesday, I was in proximity of someone who has tested positive for COVID-19. I would have questions. How long was I near them? Were we outdoors? Were they coughing? Were either of us wearing a mask? Were we talking face-to-face or standing silently six feet apart? Or were they on the other side of a solid wall? I could imagine feeling upset and wanting to talk to a human rather than receive a notification from an app.

I would likely pay attention to my first positive notification from a contact-tracing app. I might seek out a COVID-19 test and would probably quarantine myself. But should I stay away from the other members of my household? Should they quarantine themselves too? (More questions I might want to ask a human.) After the first notification, I might have more experience and know what to do, but I would probably soon start to ignore these notifications, assuming them to be false positives. This problem is exacerbated by the lack of widely available rapid COVID-19 tests in most of the U.S. and many other countries.

There are ways to design an app to answer many of my questions, mitigating some concerns. This may require that we give up some privacy. Maybe the app should store both location information and proximity information so it can communicate where the exposure occurred. Maybe the infected person could grant permission for additional information—such as whether they wore a mask in public—to be transmitted to people receiving notifications. Maybe they would consent to sharing their name with friends who were receiving notifications. Laws and policies restricting the use of contact-tracing information might reduce privacy concerns and encourage people to allow more data to be collected and shared. In addition, people may be willing to allow an app to collect more data in certain places—I might allow an app to collect precise location information at the grocery store or park, but not at a doctor's office or protest march. However, it may be challenging to build an app that supports this sort of control without requiring users to spend a lot of time and effort setting it up.

Another concern is that people may falsely report being infected to cause mischief, to keep people home in order to shut down a school, or even to disrupt an election. This problem is being addressed by requiring public health officials, doctors, or testing labs to verify positive test reports before notifications are sent, although this may reduce convenience, privacy, or the timeliness of notification.
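One common way to gate reports on verification—sketched here under assumptions, since real systems vary—is for the health authority to hand the patient a short one-time code whose authenticity the app's server can check before accepting an infection report. All names and formats below are hypothetical.

```python
import hmac
import hashlib
import os
import secrets

# Key held only by the health authority (illustrative; real deployments
# would use proper key management and expiring, single-use codes).
AUTHORITY_KEY = os.urandom(32)

def issue_code():
    """Authority side: generate a short code and its authentication tag
    to give a patient along with a positive test result."""
    code = secrets.token_hex(4)
    tag = hmac.new(AUTHORITY_KEY, code.encode(), hashlib.sha256).hexdigest()
    return code, tag

def verify_report(code, tag):
    """Server side: accept an infection report only if the code's tag
    verifies, blocking mischievous self-reports."""
    expected = hmac.new(AUTHORITY_KEY, code.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)
```

The privacy cost mentioned above shows up here: verification necessarily puts the health authority in the loop for every report, which is exactly the kind of linkage a fully decentralized design tries to avoid.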

Adoption

A study conducted last spring in the U.S. found participants generally preferred a centralized contact-tracing approach that would share the identities and locations of infected users with public health authorities rather than a decentralized and more privacy-protective approach that did not share data. Approximately half of the participants reported a willingness to install such a centralized app, while about a quarter of the participants indicated they would be unlikely to install any contact-tracing app. Other U.S. surveys have also found that about half of smartphone users report being likely to install a contact-tracing app.6 This does not take into account the people who do not own smartphones and thus would not be able to use these apps. In the U.S., some of the demographic groups most at risk are least likely to own smartphones.

Furthermore, current adoption of these apps appears low, even in countries that introduced apps last spring. For example, in May it was reported that contact-tracing apps had been adopted by only 38% of the population in Iceland and 20% in Singapore. Yet researchers estimate adoption rates of at least 60% are necessary for these apps to be effective.5 Even in France where the StopCovid app was activated by over 1.8 million people in June, the app notified only 14 people that they may have been exposed.1
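The adoption numbers above are even harsher than they first appear: a contact can only be logged if both parties run the app, so under a simple random-mixing assumption (my simplification, not the researchers' model) the fraction of contacts covered scales roughly with the square of the adoption rate.

```python
def contact_coverage(adoption_rate):
    # Both people in an encounter must run the app for it to be logged,
    # so coverage is roughly adoption squared (assuming contacts are
    # randomly mixed across the population -- a simplification).
    return adoption_rate ** 2

for rate in (0.20, 0.38, 0.60):
    print(f"{rate:.0%} adoption -> ~{contact_coverage(rate):.0%} of contacts covered")
```

By this rough measure, Singapore's 20% adoption covers only about 4% of contacts, and Iceland's 38% covers about 14%—which helps explain why even the 60% threshold (about 36% of contacts) is described as a minimum for effectiveness.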

At Carnegie Mellon University, a team of researchers developed an anonymous contact-tracing app called NOVID that distinguishes itself from other apps by using ultrasound in addition to Bluetooth to improve accuracy and allow it to provide notification recipients with information about how close they were to an infected person. One of its features is that it tells users if they are in proximity of other NOVID users even when no infection has been reported. I live in a neighborhood near Carnegie Mellon, where there is likely more interest in NOVID than in other places. In the four months since I installed NOVID, it has detected that I have been near only one other NOVID user, despite the fact that I have had brief contact with numerous people through daily outdoor exercise and regular errands. Thus, NOVID is not yet particularly useful to me.

While public adoption has been slow, private companies and universities are starting to mandate that their employees and students use apps to trace contacts and report symptoms. Some companies are mandating the use of apps or wearable devices that immediately alert wearers when they are too close to other users. Symptom-tracking apps can provide a short daily questionnaire for users to report any COVID-related symptoms. However, all of these approaches raise concerns. Employees and students wonder where their information is sent and how it can be used. The University of Connecticut conducted focus groups and found students were unlikely to report symptoms such as headaches that occur frequently for reasons unrelated to COVID-19, for fear of being forced to quarantine and miss exams or social events.c

In contrast to apps, wastewater monitoring may be more privacy-protective (assuming samples are taken as water exits the building rather than with every flush), easier to deploy, and more effective at detecting COVID-19 at universities, providing an early warning when someone in a monitored building is infected. All the occupants of a University of Arizona dorm were tested after the virus was found in their building's wastewater in August. As a result, two asymptomatic students were discovered before they spread the virus further.7

Challenges

While efforts to use apps to help control a deadly virus and protect privacy are laudable, early efforts do not look promising, and some experts have concluded the risks of contact-tracing apps might outweigh their potential benefits.2,8 Challenges remain in developing and widely deploying apps that are highly effective and usable.

Technologists have focused on building privacy-protective decentralized apps, but a centralized approach with legal protections to limit data use might be more beneficial to public health and more understandable and acceptable to the public. The problem these apps are trying to solve is not just a technology problem, and digital technology alone is unlikely to be the solution. As with many privacy problems, solutions should involve both policy and technology.9 Laws and organizational policies must ensure sensitive information is not used for purposes unrelated to public health; digital tools must be trustworthy, understandable, and usable; and public health organizations and rapid COVID-19 testing must be part of the solution.

References

1. Dillet, R. French contact-tracing app StopCovid has been activated 1.8 million times but only sent 14 notifications. TechCrunch (June 23, 2020); https://tcrn.ch/3bW1g1P

2. Gebhart, G. COVID-19 tracking technology will not save us. Electronic Frontier Foundation (Sept. 3, 2020); https://bit.ly/3mo8jF9

3. Goodes, G. Report: The proliferation of COVID-19 contact tracing apps exposes significant security risks. Guardsquare (June 18, 2020); https://bit.ly/2E0bjX7

4. Greenberg, A. Does Covid-19 contact tracing pose a privacy risk? Your questions, answered. Wired (Apr. 17, 2020).

5. Kreps, S. et al. Contact-tracing apps face serious obstacles. Brookings Tech Stream (May 20, 2020); https://brook.gs/2FCrGJE

6. Li, J. et al. Decentralized is not risk-free: Understanding public perceptions of privacy-utility trade-offs in COVID-19 contact-tracing apps. (May 25, 2020); https://bit.ly/2DXWF2q

7. Peiser, J. The University of Arizona says it caught a dorm's covid-19 outbreak before it started. Its secret weapon: Poop. The Washington Post (Aug. 28, 2020).

8. Soltani, A., Calo, R., and Bergstrom, C. Contact-tracing apps are not a solution to the COVID-19 crisis. Brookings Tech Stream (Apr. 27, 2020); https://brook.gs/3iqvI6y

9. Spiekermann, S. and Cranor, L.F. Engineering privacy. IEEE Transactions on Software Engineering 35, 1 (Jan.-Feb. 2009), 67–82; doi: 10.1109/TSE.2008.88.

Author

Lorrie Faith Cranor (lorrie@cmu.edu) is Director and Bosch Distinguished Professor in Security and Privacy Technologies, CyLab Security and Privacy Institute and FORE Systems Professor, Computer Science and Engineering & Public Policy, Carnegie Mellon University, Pittsburgh, PA, USA.

Footnotes

a. See https://pact.mit.edu/

b. See https://github.com/DP-3T/

c. See https://bit.ly/2FAEJLM


Copyright held by author.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2020 ACM, Inc.


 
