On July 31, the Government of Canada released the national COVID-19 exposure notification app, COVID Alert: a digital tool designed to enhance officials’ ability to detect and control the spread of the coronavirus. This effort follows digital contact-tracing apps implemented around the world, from Singapore to Switzerland, Australia to Iceland. Digital contact-tracing efforts are perceived to enable faster, broader and more accurate identification of individuals potentially exposed to COVID-19. Yet little evidence exists demonstrating the effectiveness of these apps in limiting viral spread, and in many cases, such as China or South Korea, the opportunities are not properly weighed against the risks to individuals and entire communities. Apps like COVID Alert also raise other concerns linked to accessibility and equity, efficacy and public trust.
Other digital technologies have been used to supplement the COVID-19 response, yet receive less media attention than contact tracing. These technologies are aimed at social monitoring and control (like Poland’s use of geo-located selfies to track citizens in quarantine), novel forms of public communications (like the WHO’s automated messaging via WhatsApp), remote diagnostics (like India’s digital stethoscope to detect heart or lung abnormalities in potential COVID-19 patients), and remote health-care delivery (like drones to deliver personal protective equipment, test kits and hygiene kits to the Georgina Island First Nation), among other objectives. Our ongoing research has found that situating digital contact tracing in this broader digital response picture sheds new light on the opportunities and risks associated with using digital technology during crises like the coronavirus pandemic.
When it comes to policy-making, we have identified elements of risk and opportunity specific to digital contact tracing, as well as areas of secondary risk and opportunity when considering the ecosystem of digital tools used to respond to the pandemic. These include equity and inclusion, legislation and regulation, and public internal versus non-governmental external innovation.
Equity and Inclusion
Equity and inclusion are frequently highlighted as major concerns with digital interventions in general, and with digital contact tracing especially. About 55 percent of the world’s population does not own a smartphone, and many who do own smartphones that lack the up-to-date technology needed to run a contact-tracing application. Vulnerable groups are most frequently affected by this digital divide, including the elderly, communities of lower socio-economic status and people experiencing homelessness.
Some strides have been made to ensure digital contact tracing is more inclusive, yet exclusion persists in other ways. Countries like New Zealand and China have used QR codes for contact tracing, as well as for quarantine management for private businesses and public services. New Zealand has adopted a voluntary approach to contact tracing in which residents can choose to scan QR codes when they enter public spaces to aid centralized tracking of the virus, while also opting in to be alerted of potential exposure to the virus. Although this measure excludes those who don’t own mobile phones, it lowers the barriers to engagement by removing the need to type and search for information and, more broadly, the need to download and configure an app. That said, there is little evidence to date regarding the effectiveness of New Zealand’s QR code approach.
Meanwhile in China, QR codes have triggered more widespread exclusion. All residents are assigned a health status based on their likelihood of infection — green means travel freely, yellow means report immediately, red means two-week quarantine — which they must share by scanning QR codes in public spaces. Depending on their health status, they are granted or denied access to these spaces or services, including public transportation. Serious concerns have been raised regarding the validity of the assigned health status, the level of digital intrusion required to generate this status, and the involuntary nature of forced QR scanning to access essential services.
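To make the mechanics concrete, the sketch below is a deliberately simplified, hypothetical illustration of venue-based QR check-ins; it is not the code behind NZ COVID Tracer or China’s health-code system, and all names and values are invented. A phone keeps a log of venue visits, and an alert is raised when a visit overlaps a venue and time window later flagged by public health officials.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical illustration only; not the actual NZ COVID Tracer or Chinese
# health-code implementation. A QR "check-in" records which venue a phone
# visited and when; exposure alerts are generated by matching those visits
# against venues and time windows later flagged by public health officials.

@dataclass
class CheckIn:
    venue_id: str        # identifier encoded in the venue's QR poster
    timestamp: datetime  # when the user scanned the code

def exposed(check_ins, flagged_venue_id, window_start, window_end):
    """Return True if any stored check-in overlaps the flagged venue and window."""
    return any(
        c.venue_id == flagged_venue_id and window_start <= c.timestamp <= window_end
        for c in check_ins
    )

# Example: a visit to a cafe during a period later flagged triggers an alert.
visits = [CheckIn("cafe-123", datetime(2020, 8, 1, 13, 30))]
print(exposed(visits, "cafe-123",
              datetime(2020, 8, 1, 13, 0), datetime(2020, 8, 1, 15, 0)))  # True
```

The key policy difference between the two countries lies less in this matching step than in whether participation is voluntary and how the resulting status is used: New Zealand’s scans are opt-in, whereas China’s status is assigned to residents and gates access to essential services.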
Other digital interventions are helping to bridge digital divides. Telemedicine is reaching communities that have previously been disconnected. The integration of artificial intelligence algorithms into text messaging services, like the WHO’s COVID chat, has enabled synchronous, widespread, multilingual, direct communication of pandemic FAQs, self-diagnostics and myth-busting content with a global audience. Audio messaging in various languages can reach individuals who are illiterate, as well as communities within a country who speak different languages. Remote diagnostic web platforms have been developed and implemented all over the world (from Pakistan to Kenya to Israel) to remotely connect patients to physicians through video chat. And in India, China and the United States, drones are being used to share important information about COVID-19 with the public; citizens do not need to own technology to hear these messages.
Legislation and Regulation
Approaches to the regulation of digital tools and interventions during COVID-19 have varied substantially based on perceived risk, civil concerns and political priorities. Switzerland enacted legislation to protect citizens’ privacy rights before it adopted a contact-tracing application. Due to privacy concerns, it had previously moved away from the PEPP-PT initiative, a collaboration between various European countries at the start of the pandemic to develop a centralized approach to digital contact tracing, and instead backed its own decentralized protocol (the DP-3T initiative) before settling on the Google-Apple framework that other countries, including Canada, have adopted. Conversely, Australia passed legislation that addressed similar privacy concerns but did so after its digital contact-tracing application was implemented, and only after experts and government officials voiced concerns about the centralized nature of the application and how it would affect privacy.
In contexts where legislation has been put in place for other digital interventions, additional risks and opportunities emerge. Hungary, for instance, passed the “Coronavirus Protection Act,” which included a section that “imposes sanctions for disseminating false information or distortions that could undermine efforts to protect the population from the spread of the virus.” At first glance, this seems like a positive step to ensure people obtain accurate information (because there certainly are people actively trying to spread misinformation). There are concerns, however, that it infringes on human rights, particularly freedom of expression and freedom of the press, given Hungary’s track record of silencing journalists and decimating independent media. In France, surveillance drones have been used to help enforce COVID-19 quarantine and confinement measures, yet civil rights groups claimed the data collection was breaking privacy laws. A French court recently ruled in favour of those claims.
Public Internal vs. External Engagement
When it comes to innovation, governments often lack the capacity, resources and time to generate innovative solutions to complex problems. As a result, they rely on external engagement ranging from partnerships with big-tech companies, private consultancy firms and academia to crowd-sourced initiatives. Coupled with these opportunities for innovation, varied risks emerge related to ownership, reliability and decision-making power.
At the moment, digital contact tracing and exposure notification apps are largely dominated by the Google-Apple API. This approach offers the benefits of interoperability across regions as well as an established framework that preserves privacy. However, it has raised important questions about the role of private companies in a public health response. For example, the API is based on pre-established exposure settings, which means the tech companies effectively set the standards for how much exposure time and how close a distance are required to register as “contact” with a potential COVID-19 carrier. In effect, governments are outsourcing these decisions to big tech. France and the United Kingdom, for instance, refrained from using the Google-Apple API because of concerns with this approach.
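To illustrate what “pre-established exposure settings” means in practice, the following is a simplified, hypothetical sketch of how such frameworks reason about contact; it is not the actual Google-Apple Exposure Notifications API, and the threshold values and names are assumptions chosen purely for illustration. Bluetooth signal attenuation stands in for distance, and the cumulative time spent near a diagnosed person’s device is compared against a minimum duration before an encounter counts as a “contact.”

```python
# Hypothetical sketch of exposure parameters; this is NOT the Google-Apple
# Exposure Notifications API. Signal attenuation (in dB) acts as a rough proxy
# for distance, and cumulative time spent at close range is compared against a
# minimum duration. Both thresholds below are invented for illustration.

ATTENUATION_THRESHOLD_DB = 55   # assumed proxy for "roughly within two metres"
MIN_EXPOSURE_MINUTES = 15       # assumed minimum cumulative close-range duration

def is_exposure(encounters):
    """encounters: list of (attenuation_db, duration_minutes) tuples from BLE scans."""
    close_minutes = sum(
        minutes for attenuation_db, minutes in encounters
        if attenuation_db <= ATTENUATION_THRESHOLD_DB
    )
    return close_minutes >= MIN_EXPOSURE_MINUTES

# Example: three close-range windows totalling 16 minutes register as an exposure,
# while a long encounter at greater distance (higher attenuation) does not count.
print(is_exposure([(50, 6), (48, 5), (52, 5), (70, 30)]))  # True
```

Whoever sets those two thresholds decides what counts as an exposure, which is exactly the decision-making power at issue when such settings are fixed outside government.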
Beyond Google and Apple, governments are working with private companies and consultancy firms to develop digital contact-tracing solutions. For instance, the Canadian COVID Alert application is based on a framework developed by volunteers at Shopify and BlackBerry. These external partnerships run the risk of financial incentives overriding the desire to generate interventions that are widely scalable, simple and easy for governments to control. Partnerships with academia, like the Stanford Covid Watch app or the University of Cape Town’s COVI-ID app, or citizen-driven initiatives, as seen in Peru, may quell fears related to government surveillance and encourage solutions that are better tailored to local contexts, which may lead to better user uptake and engagement. Yet other issues emerge, including access to information, compliance with privacy laws, dependence on donor funding, fluidity in terms of who is developing and implementing these solutions, and challenges related to liability and intellectual property rights, all of which point to broader sustainability risks.
Aside from contact tracing, other digital COVID-19 management measures are being enacted through varied external partnerships, creating new innovation opportunities coupled with related and unique risks. Public-private initiatives frequently leverage AI companies. BlueDot, a Canadian AI company, has been producing metrics to enable the Canadian government to better understand where social distancing has been effective and where to deploy response resources. In France, the AI start-up DatakaLab created mask-detection software that can be integrated into security cameras and is now detecting whether people are wearing masks in the Paris metro system to gather data on mask adoption. AI offers tremendous capability to collect, monitor and analyze large amounts of data in real time, yet there are inherent risks associated with the application of AI systems in a crisis, including bias and discrimination; denial of individual autonomy, recourse and rights; and non-transparent, unexplainable or unjustifiable outcomes.
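As a rough, hypothetical sketch of the kind of aggregate metric such a system might report (not DatakaLab’s actual software), per-frame detections can be reduced to an anonymous adoption rate rather than records about identified individuals:

```python
# Hypothetical sketch of turning camera-based mask detections into an anonymous
# aggregate metric; this is not DatakaLab's actual system. Each entry represents
# one detected face in a video frame and whether a mask was detected on it.

def mask_adoption_rate(detections):
    """detections: list of booleans, True where a mask was detected."""
    if not detections:
        return None
    return sum(detections) / len(detections)

# Example: 42 of 50 detected faces wearing masks yields an adoption rate of 0.84.
print(mask_adoption_rate([True] * 42 + [False] * 8))
```

Whether such a pipeline stays at this aggregate level or retains imagery that can identify individuals is precisely where the bias, autonomy and transparency risks noted above arise.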
Next Steps
There are some initial actions policy-makers can take to maximize the opportunities of digital innovations while mitigating some of the associated risks. These include:
- Adopt a user-focused approach to innovation and implementation: Develop with user incentives in mind. For instance, for interventions that require active user engagement, like downloading an app, carefully consider the factors that may incentivize users to not only download but also use the technology. Consider potential external risk factors, like the stigmatization that may result if users are identified as COVID-positive, and work to manage this risk.
- Learn from and regulate (where needed) private-sector digital interventions: In terms of digital contact tracing, the majority of media focus is placed on governments’ digital contact-tracing apps, yet the private sector is developing interventions with varying levels of success and, in some cases, severe human rights implications. Universities, for example, are widely reported to be requiring students to download contact-tracing apps and participate in tracing measures, and to be enforcing compliance. Other interventions, like the Google Maps COVID-19 layer, are being implemented without government regulation yet may carry a series of primary and secondary implications that, if not managed properly, could compound economic and marginalization risks.
- Adopt an ecosystem approach to digital innovation in crisis: The digital-divide risks associated with digital contact tracing can be mitigated by diversifying the digital approaches used alongside it. Regions can adopt an ecosystem of diverse digital response technologies so that more people are included in the pandemic response and fewer are likely to be forgotten and disproportionately impacted. The digital divide is not driven by the technology itself; it is generated by the approach taken to deploying it.
For further guidance on digital contact tracing, as well as deeper context on the digital response to COVID-19, the Digital Global Health and Humanitarianism Lab (DGHH Lab) is releasing two research series. The first provides an in-depth descriptive understanding of digital contact tracing, social behaviour monitoring, public communications, and remote diagnostics and treatment. The second series looks at why countries experience variable user uptake of digital contact-tracing apps around the world. Both series will be released in the coming months. It is hoped that, through this practitioner guidance, policy-makers, decision-makers and responders can gain a more in-depth understanding of the factors that explain why user uptake rates vary, and learn from interventions that have worked elsewhere to enhance uptake, optimize the benefits and minimize the risks associated with using these apps.
Jennie Phillips is an academic-practitioner specializing in the intersection between society and digital technology in crisis. She is the founder and director of the Digital Global Health and Humanitarianism Lab (DGHH Lab), a partner with the Dahdaleh Institute for Global Health Research (DIGHR) and the Disaster & Emergency Management (DEM) program at York University. Rebecca Babcock is the research coordinator at the DGHH Lab and a researcher at the Dahdaleh Institute for Global Health Research, working on projects such as the Exploration of the Digital Response to COVID-19.