The Trust Deficit and Its Effect on Digital Contact Tracing Responses to COVID-19

July 14, 2020
Jolynn Dellinger, Privacy Law and Policy professor at the Duke University School of Law and at the UNC School of Law


As the United States adopts strategies for reopening our economy while continuing to deal with COVID-19, we are being asked to create, implement and use a variety of technologies designed to help curb the spread of the disease and manage the crisis until a vaccine is available. Digital contact tracing is one of those technologies. Google and Apple have partnered to create the Google and Apple Exposure Notification System — an interoperable, decentralized infrastructure that allows public health authorities to build and implement exposure notification apps using Bluetooth technology. Such apps can facilitate contact tracing efforts. Not all digital contact tracing apps use the Google-Apple system, however, and some apps are being implemented that collect both health and location data and use GPS as opposed to Bluetooth. In addition, some public health departments are partnering with private entities to build these apps. It is unclear whether private vendors have access to data collected by these apps, and there are no federal laws that govern what private entities not covered by HIPAA may and may not do with health and location data. Utah, North Dakota and South Dakota have each implemented contact tracing apps to help contain the spread of COVID-19. Other states, like New York, Massachusetts and California, are allocating large sums to expand manual contact tracing capacity while continuing to explore a variety of ways to use technology. Universities and employers are also trying to come up with creative ways to reopen responsibly and are considering digital contact tracing apps as part of those plans.

Any entity considering whether to adopt surveillance technology in an effort to respond to this global health and economic crisis needs to engage in a straightforward, transparent, well-reasoned decision-making analysis that balances the effectiveness of the tech solution under consideration, on the one hand, with the privacy concerns raised by the use of that solution on the other. When the surveillance in question is enabled by the use of a voluntary app, rate of adoption will be a crucial component of efficacy. In this context, trust will be a key issue – citizens’ trust in government, consumers’ trust in corporations, our trust in public-private partnerships, and our trust in the laws of our country to protect our personal information. Arguably, the considerable trust deficit in each of these areas will substantially impede the use of voluntary apps and therefore, our ability to use technology and data to respond effectively to the COVID crisis. Trust matters, and we need to take the legislative and policy steps necessary to restore it.

The 5-Step COVID Tech Analysis

Despite hopes that stay-at-home orders, use of masks and social distancing, and limited shut-downs would help contain the virus, adherence to such measures has varied across the states, and the number of daily cases in the U.S. continues to trend upward as of late June. As entities contemplate the best strategies to re-open while coronavirus is still prevalent, a reasonable goal is to make the most effective use of technology while minimizing the incursion on privacy and the potential for misuse of data to the extent possible. Whenever we consider using emerging tech that involves surveillance of any kind, we should recognize the implications for privacy and civil liberties and engage in a deliberate, thorough, five-step analysis as follows:

1) What is the problem we are trying to solve?

2) What is the proposed solution?

3) Will the proposed solution be effective?

4) If so, would it intrude on our privacy/civil liberties?

5) If so, what controls could be put in place to sufficiently minimize those risks?

I want to briefly walk through this decision-making framework in a very high-level way in the context of voluntary digital contact tracing to provide an example of the important issues this analysis will raise.

1) What is the problem we are trying to solve?

Defining the problem we are trying to solve is an important place to start because, historically, we have often deployed technologies that are poorly suited to the problems they are meant to address. Throwing surveillance tech at a problem is a surefire way to end up with unwarranted, privacy-invasive consequences.

For the sake of this demonstration, let’s define the problem as follows: Allowing people to gradually return to work to protect the economy from further damage — even though COVID-19 remains a very serious health threat and there is not yet a vaccine — and to continue to limit the spread of the disease while doing so.

2) What is the proposed solution?

In this post, we are discussing voluntary digital contact tracing apps. Mandatory digital contact tracing apps would arguably be more effective, but such apps would raise far more substantial and potentially prohibitive privacy and civil liberties concerns that are beyond the scope of this post. Even within the category of voluntary apps, however, there are a variety of types: a centralized model where data is collected and maintained by a central server vs. a decentralized model that facilitates notification of exposed persons but does not store individuals’ location or health information at a central location. The latter is more protective of privacy and security but arguably less helpful to health authorities attempting to track trends and to pinpoint outbreaks. Because different technologies raise different privacy concerns, it is important to define any proposed solution with specificity in order to conduct this analysis and accurately assess the associated privacy and security risks.
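To make the centralized/decentralized distinction concrete, here is a highly simplified, hypothetical sketch of how a decentralized exposure notification scheme can work: phones broadcast short-lived random identifiers over Bluetooth, and matching against the keys of confirmed cases happens entirely on the device. The key derivation below is an illustrative stand-in, not the actual Google-Apple protocol.

```python
import hashlib
import os

# Hypothetical, simplified sketch of the decentralized model described
# above (NOT the actual Google-Apple protocol). No location or identity
# ever reaches a server; only the daily keys of users who report a
# positive test are published, and matching runs on each device.

def daily_key() -> bytes:
    """A fresh random key generated on the device each day."""
    return os.urandom(16)

def rolling_ids(key: bytes, intervals: int = 144) -> list:
    """Derive the short-lived identifiers broadcast over Bluetooth:
    here, one per 10-minute interval, hashed from the daily key so an
    observer cannot link one broadcast to the next."""
    return [hashlib.sha256(key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(intervals)]

def check_exposure(heard: set, published_keys: list) -> bool:
    """Runs locally on the phone: expand each published key from a
    confirmed case and compare against identifiers this phone heard."""
    for key in published_keys:
        if heard.intersection(rolling_ids(key)):
            return True
    return False

# Example: Alice's phone overheard one of Bob's identifiers; Bob later
# tests positive and consents to publishing his daily key.
bob_key = daily_key()
alice_heard = {rolling_ids(bob_key)[42]}
print(check_exposure(alice_heard, [bob_key]))  # True -> Alice is notified
```

Note that in this design the server learns nothing about who was near whom; the privacy cost of the centralized alternative is precisely that this matching step, and the contact graph it implies, would live on a central server instead.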

I am using digital contact tracing apps as an example with which to demonstrate this 5-part analysis, but ideally this analysis should be applied to any type of surveillance technology that a state (or any other entity) considers using: immunity passports (vaccine, proof of antibodies); house arrest tech (quarantine enforcement); movement trend analysis; symptom monitoring apps (used by public health, universities, employers and others); thermal imaging cameras that can detect fevers (and use facial recognition like PopID); and drones. Identifying the problem and the proposed solution, measuring potential efficacy, assessing privacy and security risks, balancing the competing interests and mitigating known risks should be routine pre-adoption work.

3) Would voluntary digital contact tracing apps be effective?

Possibly, in certain circumstances; but surveillance technology is not a silver bullet.

Experts seem to agree that voluntary digital contact tracing apps are likely to be effective only in the context of a more holistic, comprehensive response. Here in the US, at a minimum, we would need the following:

— Adoption of the app by a significant number of people. While an Oxford University study released in April was interpreted to indicate that 60% of the population would need to use contact tracing apps to effectively reduce the spread of the disease, the Oxford study researchers provided further commentary in a more recent article published by MIT Technology Review suggesting that much lower levels of adoption could still be “vitally important for tackling Covid-19,” particularly as one part of a multi-faceted approach. Experts do agree, however, that blanket coverage is preferable and that the more individuals who use the apps, the more effective they are likely to be.1

— Compliant use by adopters. To make the contact tracing app fully effective, individuals need to be willing to report positive tests and those who receive notifications of exposure need to commit to taking the next step of getting tested, notifying others of their exposure and of positive test results if applicable.

— Free, widely available testing on demand. If an app notifies you that you have been exposed, you need to be able to get a test to confirm any diagnosis and let others know you have tested positive if necessary.

— Expansion of traditional manual contact tracing systems. These systems are already part of our public health system and have been used in the past to notify individuals of exposure to STDs, HIV, Ebola, measles and other infectious diseases. Estimates indicate that we currently have only 2,500 contact tracers in the United States, and Johns Hopkins suggests we would need approximately 100,000 to begin to adequately deal with COVID-19.

— Continued social distancing and responsible, pervasive use of masks and other Personal Protective Equipment.

— A robust, multi-faceted health care system response that includes available beds, available ventilators and other necessary equipment, adequate staffing, proper protections for health care workers to promote their safety, and preferably earlier stage intervention to promote strategically-timed healthcare responses.

— Using technology and a working definition of meaningful “contact” to minimize false positives, avoid overwhelming people with many notifications, and minimize related strain on testing resources.2

— Very importantly, commitments by all individuals notified of exposure by contact tracing apps to quarantine and self-isolate to minimize the exposure of others. This is, after all, the point of contact tracing apps. If we have no way of ensuring compliance with isolation or quarantine directions, how is the use of the tracing app really going to benefit us?

— Societal and employer support for people attempting to quarantine to avoid exposing others. If people expect to get fired as a result of calling in sick to work because of self-imposed quarantine efforts, for example, it is unlikely they will take such steps to protect others.
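One of the elements above — a working definition of meaningful “contact” — can be made concrete. The sketch below uses purely illustrative thresholds (roughly 15 minutes within about 2 meters, a common public-health rule of thumb); the numbers are assumptions for demonstration, not any authority’s actual parameters, and in practice distance is only noisily estimated from Bluetooth signal strength.

```python
# Hypothetical thresholds for a "meaningful contact" definition.
# These values are illustrative assumptions only.
CLOSE_DISTANCE_M = 2.0   # ~6 feet
MIN_DURATION_MIN = 15    # cumulative minutes within that distance

def is_meaningful_contact(encounters) -> bool:
    """encounters: list of (estimated_distance_m, duration_min) readings
    for one other device. Counts only time spent within the distance
    cutoff, so brief or distant encounters do not trigger notifications."""
    close_minutes = sum(duration for distance, duration in encounters
                        if distance <= CLOSE_DISTANCE_M)
    return close_minutes >= MIN_DURATION_MIN

# A brief pass-by in a store should not trigger a notification...
print(is_meaningful_contact([(1.0, 2)]))             # False
# ...but a prolonged nearby encounter should.
print(is_meaningful_contact([(1.5, 10), (1.8, 8)]))  # True
# Time spent far away never counts, however long.
print(is_meaningful_contact([(5.0, 60)]))            # False
```

Tuning these thresholds is exactly the false-positive trade-off described above: looser cutoffs flood users with low-risk notifications and strain testing resources; tighter ones miss real exposures.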

Considering the need for such a multifaceted response and the lack of so many of these elements currently in the U.S. approach to the crisis, especially free, on-demand testing, a reasonable person might conclude that the use of voluntary contact tracing apps is unlikely to be effective here at this time. For purposes of this analysis, though, let’s assume that any degree of uptake of such apps could be a helpful element in containing the spread of the virus3 and that therefore, we should support it for that reason. What about privacy concerns?

4) Assuming we could imagine a US in which the foregoing comprehensive approach to Covid-19 were prevalent, and that digital contact tracing would be one helpful aspect of that approach, would the use of digital contact tracing apps intrude on our privacy/civil liberties?

The short answer is yes — contact tracing is a form of surveillance. Where we go, when we go there, and who we see as well as our personal habits and patterns are discernible from our location information. As recognized by the Supreme Court in U.S. v. Carpenter and in U.S. v. Jones, location data is sensitive and highly revealing personal information. Even manual contact tracing affects privacy in that it requires the collection, storage and sharing of personal health information and personal movement (location information) by individuals. And today, that collection, storage and sharing is likely to have a digital component. So protecting the privacy and security of personal data will be imperative whether contact tracing is effected manually or digitally.

Digital contact tracing apps will require, in some applications, the collection and sharing (and possible storage) of health information, data about family, friends and contacts, and location information. Some digital contact tracing apps use GPS and some collect and store location data at a central server. Centralized databases of sensitive information are attractive to hackers and susceptible to attack and theft. Exposure notification technologies like those supported by Google and Apple, by contrast, use Bluetooth proximity data and a decentralized system, do not record locations, and are more privacy protective.

Given the lack of laws governing privacy in the United States and current practices of corporations, it is possible, if not probable, that commercial actors and private entities that collect personal data for use in the COVID-19 response will repurpose and monetize that data unless they are explicitly prohibited from doing so. While public health entities may be more likely to try to protect data, it is unlikely that most public health entities have the resources necessary to build and implement all of these apps. Several states, including North and South Dakota and Utah, have already been working with private companies to create and manage these apps, raising issues about the privacy and security of the data collected. Government entities and any commercial or private entities developing and using contact tracing apps should be able to use data only for the very limited purpose of contact tracing and for no other purpose. Even if an app is decentralized and uses only Bluetooth technology, a user would have to leave Bluetooth on constantly in public to make use of the app, which would expose that user to all the other entities that use Bluetooth signals to track and market products to consumers.

Secondary use of the data by government entities – law enforcement, ICE, health research on other issues, and other federal agencies – is also a probable eventuality of contact tracing data collection. Like commercial secondary use, use of COVID-related data by government entities for non-COVID related purposes should be strictly prohibited. A bright line rule would help build trust in representations of limited data use.

Another aspect to the privacy problem created by the use of this technology is the serious danger that new surveillance permitted by new technology becomes the new normal. If history teaches us anything, this is a result we would have to take affirmative steps to prevent.

5) So, assuming both some degree of efficacy and serious privacy concerns, are there sufficient privacy and security safeguards that we could employ to allow us to benefit from the technology?

Possibly. The answer to this question involves two discrete topics:

A) The design of the tech at issue, and

B) Policy and law: the rules and regulations around apps and the use of personal data; and adequate restrictions on use of the data by the government, law enforcement, and corporations.

As mentioned above, different types of contact tracing have different effects on privacy. From a design/technology perspective, more privacy protective apps: use Bluetooth tech instead of GPS; feature a decentralized structure instead of centralized databases; provide functional anonymity of users; and rely on voluntary adoption in that the technology can be disabled, turned off or foregone completely. Google and Apple have designed some limitations into the tech itself, restricted data collection, and limited those who will be allowed to use the tech to entities working with public health departments. Each of these protections may decrease efficacy to different degrees depending on the goals sought to be achieved, as discussed in questions one and two of this framework.

Even with privacy protective design, however, we need explicit policies in place to guide the implementation of emerging technologies in this area and an investment of resources commensurate to the challenge of ensuring compliance with those policies. Ideally, we would have a legal framework in place that provided the basis for such policies, but we don’t. So the onus will be on state governments and the individual entities commissioning and utilizing this tech to do so in a manner that adequately protects individual privacy. In addition to being voluntary, other limitations that would represent best practices include:

· Limitations on collection of data to that which is strictly necessary to accomplish the specified purpose of the apps

· Strict limitations on alternative uses of data

o Strict limitation to use for COVID-19 purposes

o No access by law enforcement and

o Potential civil and criminal consequences for misuse, abuse or monetization of COVID related data

· Limited retention of data and complete deletion of such data from all servers

· Limitations on the categories of people with access to data

· Sunset provisions

· Legal and administrative measures that create oversight and accountability such as: a state-level task force, a COVID-specific Privacy and Civil Liberties Oversight Board (PCLOB), mandatory reporting to state legislatures, and appropriations for privacy organizations to monitor and audit both policies and compliance

· Legal resources and consequences to ensure entities have an incentive to comply with applicable rules, regulations and policies

· App developers need to implement processes and employ people necessary for accountability (privacy officers), have public-facing privacy policies, and a way for individuals to access any data about themselves

· And finally, some technologies, like facial recognition, should be explicitly excluded from any contact tracing technology given the substantial risks to privacy and potential for abuse they create

Regarding voluntary contact tracing apps as they exist today, the likely minimal efficacy of these apps, combined with the substantial privacy and security risks that they pose and our lack of a legal framework to ensure protection of personal data collected and/or used by such apps, leads to the conclusion that the risks currently outweigh the benefits. To be clear, this is not to say that voluntary digital contact tracing apps are never worth the tradeoff; it is to say, however, that it is irresponsible to promote the use of such apps without recognizing and taking measures to protect privacy and security to the greatest extent possible. If states do choose to use them, they should do so only with explicit adoption of privacy-protective design and policy parameters including laws or executive orders that require adequate policies, processes, resources, oversight and consequences for non-compliance. (See https://www.politico.com/news/2020/06/10/google-and-apples-rules-for-virus-tracking-apps-sow-division-among-states-312199 and https://9to5mac.com/2020/07/13/covid-19-exposure-notification-api-states/ for updates on state adoption of contact tracing tech.)

Building apps on the Google and Apple Exposure Notification System is one way to promote use of apps that help contact tracing efforts while protecting a degree of privacy. If states alternatively elect to use apps that use GPS and collect and store sensitive information, states could take the additional step of requiring any third party that builds such an app in partnership with a state to have a privacy policy and to make an explicit commitment not to sell, share or monetize user data and not to use data for any purpose that is not COVID related. Apple and Google have already required apps that use the partnership’s technology to be affiliated with a public health entity, but they can do more. Apple and Google could take the additional step of prohibiting any contact tracing app from being offered in the Google Play Store or the App Store if the app’s privacy policy does not contain an affirmative commitment not to sell, share or monetize data collected for a COVID-related purpose and not to use that data for any secondary purpose.

What’s Trust Got to Do With It?

In short, everything.

These apps and similar voluntary digital technologies deployed to battle the COVID-19 contagion will only work if people use them. And people are evidently reluctant to do so. One recent study indicates that only 42% of Americans surveyed would be willing to use digital contact tracing apps, and that there is significant confusion and concern about the features and operation of such apps. Another study, commissioned by Avira, found a much higher degree of skepticism and reluctance: 71% of respondents were unwilling to use a contact tracing app, and 44% of those individuals cited privacy and security concerns while 35% cited lack of trust in app providers. See also https://www.techrepublic.com/article/most-americans-say-no-to-coronavirus-contact-tracing-apps/. Earlier polls indicated that 3 out of 5 Americans surveyed were unable or unwilling to download a contact tracing app. The unwillingness factor is, at least in part, attributable to privacy and personal freedom concerns, and respondents indicated varying degrees of lack of trust in government and corporations.

Even under normal circumstances, American citizens are making decisions about whom to trust with our sensitive personal data in the context of a highly unregulated environment. Now we are adding a global pandemic into the mix. In the United States today, given our lack of a comprehensive federal privacy law, none of the proposed safeguards discussed above are guarantees – they are merely items on a wish list. Despite agitation for a comprehensive privacy law for the past twenty years, we simply do not have a framework in place that is capable of protecting personal data in the hands of corporations or used in the context of public/private partnerships. Nor, given the broad-brush government surveillance that became normalized after 9/11, do we have any assurance that the data collection undertaken to deal with this immediate health crisis will be jettisoned when the crisis is over. We may also legitimately have concerns that surveillance methods will continue to be used after the crisis is over, and that we will have no idea that it is happening.

All of these factors underscore the importance of trust. Past behavior of companies like Facebook and Google — whose business models are based on collection, use and monetization of our data — and past conduct by the federal government, most recently in the aftermath of 9/11, when combined with the lack of comprehensive protection for privacy at the federal level in the US, have eroded and compromised trust in technology and the use of personal data. Now we are seeing a trust deficit — the consequence of our failure to act more carefully and responsibly in this area.

Both technology and data can be tremendously helpful tools in our ability to combat this disease and its spread. This situation demonstrates starkly how much it matters that we, as a country, adopt a legislative approach to privacy, data use and technology that will, going forward, enable people to trust technology, and those who want to use it for good, when circumstances like this arise again, which they most assuredly will. Some members of Congress have recognized these issues and have taken steps in the right direction for COVID-19 related apps by proposing legislation such as the Exposure Notification Privacy Act, the COVID-19 Consumer Data Protection Act and the Public Health Emergency Privacy Act. Hopefully, members of Congress will collaborate expeditiously and provide a privacy-protective framework pursuant to which our nation’s digital COVID-19 response can proceed. If Congress will not pass such legislation or, preferably, put in place the comprehensive privacy framework that our country so desperately needs, state legislatures and Governors must act quickly to do so as they implement and promote digital technologies to respond to COVID-19.

Endnotes


1 It is worthy of note that both inability and unwillingness to use these technologies are relevant to uptake. Inability may be attributed to several factors: lack of access to devices because of cost, or to services because of geographic location, and unfamiliarity with or inability to use the technology – possibly as a result of age. These factors are important because older Americans are particularly vulnerable to COVID-19, and lower-income individuals may be more likely to 1) work at jobs that cannot be done from home and that are necessary to keep the economy moving, and/or 2) to live in more crowded environments – both of which create a higher risk of exposure to the disease.

2 While exposure notification apps can let a person know that they have been exposed, the app will not tell the individual where or exactly when the exposure happened. Accordingly, it may be difficult for a person to judge the actual danger of exposure. If they were in a grocery store wearing PPE, or sitting in a car next to an infected person in a different car, the risk may be low to none.

3 When promoting the use of digital contact tracing apps, states need to be clear with citizens that an exposure notification app is only useful to notify individuals about the positive status of other individuals using the app and the consequent potential exposure of the user herself. Because infected individuals NOT using the app will not trigger notifications, individuals cannot conclude that they are safe or that they have not been exposed simply because the app has not notified them of exposure. In short, no notifications does not mean no exposure. This is yet another reason why the digital apps are more useful the greater the number of people who use them. See Contact Tracing Apps are NOT a Solution to the Covid-19 Crisis by Ashkan Soltani, Ryan Calo and Carl Bergstrom (April 27, 2020) for a comprehensive discussion of the under-inclusiveness and over-inclusiveness of contact tracing apps and the dangers of false positives and false negatives.
