COVID-19 Contact Tracing and Mission Creep

  • Apple and Google have officially released the first version of their contact tracing API, referred to as the Exposure Notification API, in their respective developer builds of iOS and Android.
  • The opt-in automated system uses rotating Bluetooth-based identifiers to keep track of whether a smartphone's owner has come into contact with someone who later tests positive for COVID-19.
  • Now it's up to governments around the world to decide whether to adopt the exposure notification system, which offers privacy and usability benefits, or to build their own apps on their own technology so as to give their public health departments access to additional data.
  • The U.K.'s National Health Service (NHS), for its part, has outlined plans to reject Apple and Google's anonymous contact tracing solution in favour of an internal solution that uses a centralised approach, giving it more insight into COVID-19's spread.
  • In a separate development, Australia's COVIDSafe contact-tracing app has raised security concerns because it rotates its identifiers only once every two hours, and only while the app is running, making it easy for the government to track citizens who use the app.
  • India's contact-tracing app isn't far behind when it comes to privacy issues: the Android version of the Aarogya Setu app was found to leak users' latitude and longitude to a YouTube server (since fixed), and despite its opt-in nature, the app is fast becoming a compulsory download for gig workers and government officials alike, monitoring their location at all times, even when running in the background, and processing that data on central servers.
  • Aarogya Setu allows for personal information to be used by the government in "anonymised, aggregated datasets" unless a person tests positive, in which case the data can also be shared with other people "to carry out necessary medical and administrative interventions."
  • The Indian government is also pushing smartphone makers to pre-install the app on their devices, in addition to making it compulsory by mandating that individuals register on it and set it up before using their smartphones. (What's more, a similar solution is in the works for feature phones as well.)
  • Another dramatic example of location tracking is in Taiwan, where authorities have deployed an "electronic fence" around quarantined households, alerting police if citizens under quarantine leave the home or even turn off their phones. But the system, "dystopian" as it may seem, does appear to be working, containing the spread and letting the rest of the population move around freely.
  • These so-called "public health surveillance" measures are being passed off as emergency steps to tackle the outbreak and isolate carriers of the virus, but they pose fresh questions about the extent to which they serve the larger interest of society and whether they may become the new normal. In the debate of privacy versus public health, the right to individual privacy can never be completely sacrificed in favour of public health considerations.
  • The idea that widespread and pervasive surveillance makes people safer is not only incompatible with a free and open society, but also raises the question: the test should be "Is this data needed?" rather than "Can this data be useful?"
  • The scramble to deploy smartphone tools to rein in the pandemic has led to a confusing patchwork of options, posing a risk to users' privacy and security. It's essential that government-mandated apps come with appropriate security measures to prevent people from being snooped on, hacked, and spammed.
  • Viewed in that light, Apple and Google's decentralised solution is an important step because it makes zero use of location data (though it doesn't prevent the possibility of linkage attacks). More important, however, is that the systems being developed and deployed don't outlast the pandemic: the collected data shouldn't result in databases that would allow de-anonymisation of the system's users or be used for purposes beyond the scope of the emergency, thus resulting in mission creep.
  • "Proximity tracking apps should not be repurposed for other things, like tracking more mild seasonal flu outbreaks or finding witnesses to a crime," the Electronic Frontier Foundation cautioned earlier this week.
  • This necessitates that there are plans in place to phase out the apps once the outbreak is over. Apple and Google's tech can be turned off on a region-by-region basis, but there are no such guarantees for apps enforced by government authorities. (On a side note, the fact that Big Tech has been able to come up with a solution on this scale is yet another reminder that health-data privacy is already largely out of your control.)
  • A recent study by epidemiologists at Oxford University estimated that 60 percent of the population in a given area would need to use an automated contact tracing app to contain the virus. At the same time, the rush to adopt virus-tracking technologies may entrench new forms of government monitoring and social control even if the apps do not prove effective in fighting the coronavirus. Worse, they could even normalise surveillance.
  • The effectiveness of these solutions hinges on sufficient trust for widespread public adoption, and any insufficient privacy protections will only undermine that trust.
  • While there's no doubting the potential of tech-based solutions to contain the outbreak, it's worth considering the privacy trade-offs, balancing the use of technology and data for the public good against the need for transparency.
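The rotating-identifier scheme described in the opening bullets can be sketched in miniature. This is a simplified illustration, not the actual specification: the real Exposure Notification protocol derives a Rolling Proximity Identifier Key from a daily Temporary Exposure Key via HKDF-SHA256 and then encrypts a 10-minute interval number with AES-128, whereas the sketch below substitutes a standard-library HMAC for the AES step, and its function names are hypothetical.

```python
import hashlib
import hmac
import os

# Simplified sketch of the rotating proximity identifiers used by the
# Apple/Google Exposure Notification system. HMAC-SHA256 stands in for
# the spec's AES-128 step, since the standard library has no AES.

EN_INTERVAL_SECONDS = 600  # identifiers rotate every 10 minutes


def interval_number(unix_time: int) -> int:
    """10-minute interval index since the Unix epoch."""
    return unix_time // EN_INTERVAL_SECONDS


def daily_key() -> bytes:
    """A fresh 16-byte Temporary Exposure Key, generated once per day."""
    return os.urandom(16)


def proximity_id(tek: bytes, unix_time: int) -> bytes:
    """Derive the short-lived identifier broadcast over Bluetooth."""
    enin = interval_number(unix_time).to_bytes(4, "little")
    return hmac.new(tek, b"EN-RPI" + enin, hashlib.sha256).digest()[:16]


tek = daily_key()
t = 1_600_000_000
a = proximity_id(tek, t)
b = proximity_id(tek, t + EN_INTERVAL_SECONDS)  # next 10-minute window
assert a != b        # identifier changes every interval, limiting tracking
assert len(a) == 16  # 16-byte payload fits a BLE advertisement
```

The short rotation window is the point of contention with COVIDSafe: an identifier that is stable for two hours is linkable across far more sightings than one that changes every ten minutes, and a daily key that never leaves the phone (unless its owner tests positive) is what keeps the system decentralised.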
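One way to see why the Oxford adoption estimate is so demanding: an app can only register a contact if both parties run it, so under random mixing the fraction of contacts covered scales as the square of adoption. The quadratic model below is a back-of-the-envelope assumption for illustration, not a figure from the study.

```python
# If a fraction p of the population runs a contact-tracing app, a given
# contact is detectable only when both people have it, i.e. with
# probability p**2 under random mixing (an illustrative assumption).


def contact_coverage(adoption: float) -> float:
    """Fraction of contacts in which both people run the app."""
    return adoption ** 2


print(round(contact_coverage(0.60), 2))  # 0.36
print(round(contact_coverage(0.20), 2))  # 0.04
```

Even at the 60 percent adoption the study calls for, only about a third of contacts would be observable, which is why insufficient trust, and the privacy failures that erode it, directly undermines these systems' effectiveness.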
