By Solon Barocas, Asia J. Biega, Margarita Boyarskaya, Kate Crawford, Hal Daumé III, Miroslav Dudík, Benjamin Fish, Mary L. Gray, Brent Hecht, Alexandra Olteanu, Forough Poursabzi-Sangdeh, Luke Stark, Jennifer Wortman Vaughan, Hanna Wallach, Marion Zepf
Communications of the ACM, Vol. 64, No. 7, Pages 30–32
The COVID-19 pandemic has both created and exacerbated a series of cascading and interrelated crises whose impacts continue to reverberate. From the immediate effects on people's health to the pressures on healthcare systems and mass unemployment, millions of people are suffering. For many of us who work in the digital technology industry, our first impulse when faced with crises such as these may be to devise technological solutions to what we perceive as the most urgent problems. Although the desire to put our expertise to good use is laudable, technological solutions that fail to consider broader social, political, and economic contexts can have unintended consequences, undermining their efficacy and even harming the very communities they are intended to help.10 To ensure our contributions achieve their intended results without causing inadvertent harm, we must think carefully about which projects we work on, how we go about working on them, and with whom.

In this column, we offer a series of guidelines for navigating these choices. As current and former members of the Fairness, Accountability, Transparency, and Ethics (FATE) group at Microsoft Research, we have been working actively on the ethical and societal impacts of technologies such as artificial intelligence since 2016. Although we originally developed these guidelines to help our colleagues at Microsoft respond to the first wave of the pandemic in the spring of 2020, we believe they are general enough that their value extends beyond Microsoft and beyond projects focused on COVID-19.
Ask yourself if your project is worth pursuing. Before investing in your project, do a risk-benefit analysis.8 Are there other responses (technological or otherwise) that would have a greater impact with fewer potential downsides? This is an important question to ask when trying to address problems that are more societal than technological in nature. Depending on the answer, proceeding with your project may not be the right decision after all.
Question your assumptions about contexts. At the start of your project, question the assumptions you are making about the social, political, and economic contexts in which your technological response will take place. For example, consider a contact-tracing project based on cell phones.5 Does it assume that everyone will have access to the same digital tools, such as smartphones; that people are willing to share personal information;3 that people understand the risks of doing so and are comfortable accepting those risks; that people will be able to give truly voluntary consent if the response is adopted by employers, schools, or governments; that testing is widely available; that people have sufficient financial resources and social support to self-quarantine; and that people can afford medical care? In other words, ask what other institutional processes and structures need to be in place for your technological response to work effectively.
Collaborate with experts in other disciplines. Recognize the limitations of your own expertise. For many of us who work in the technology industry, it is easy to assume that technological responses, such as tracking people's locations, collecting information about their contacts, and issuing "immunity" passports, are clearly worth pursuing. Yet the usefulness of such approaches is contested by both public health and privacy experts.7 In many cases, you can have the greatest impact by finding experts who know more than you do about a problem, asking them what they need to make progress, and then helping them accomplish their goals.
Be clear about expected benefits and beneficiaries. Think carefully about what your project specifically offers, how it will be beneficial, and whether those benefits will be widely accessible to those who need them. In many cases, the intended beneficiaries may not be on a level playing field. For example, the enormous racial differences in health outcomes observed during the pandemic illustrate how existing societal inequalities affect who suffers and in what ways.6 Does your project take these dynamics into account and work to mitigate them?
Work with and for communities. Ask the intended beneficiaries of your project—whether they are healthcare workers, public health experts, or senior citizens—if your project addresses their needs. If you believe that you have additional insights, have you presented them with evidence that supports your beliefs and asked them for their input? You should provide opportunities for communities to collaboratively shape the project, give ongoing feedback and voice their concerns, and make informed decisions for themselves.
Mitigate risks. Try to anticipate the risks posed by your project and how such risks might impact different communities. For example, new technologies to facilitate working from home will also provide new opportunities for companies to track and monitor workers. In many cases, the communities that are most vulnerable to the pandemic are also those that are most at risk of being harmed by technological responses.4
Understand and protect your data. Many technological responses to the pandemic either rely on or collect data, including data about people's health and locations. If your project relies on data, ask where that data came from and how it was collected. Did you consider the unusual circumstances under which the data might have been generated?2 What are the resulting limitations, if any? Does the data capture what you need or what you think it captures? Does it reflect a representative sample of the relevant population (for example, the intended beneficiaries)? Does the data involve restrictive, problematic, or harmful classifications, such as only binary genders? If your project collects data, ask whether this will pose risks, perhaps resulting from unanticipated uses or abuse. For example, when correlated with other data, the data collected by a contact-tracing project could be used to identify and persecute undocumented immigrants. Failure to guard against these risks will limit people's willingness to rely on your project and may undermine the solidarity needed to maintain public health. Consequently, privacy and security must be paramount.1
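The data questions above can be turned into a lightweight audit. The sketch below is one illustration, not a prescribed method: the record fields (`age`, `gender`, `has_smartphone`), the `audit` helper, and the sample data are all hypothetical, chosen to show how a restrictive classification scheme, such as binary-only gender, can be surfaced mechanically.

```python
from collections import Counter

# Hypothetical records a data-driven project might hold.
# Field names and values here are illustrative only.
records = [
    {"age": 34, "gender": "female", "has_smartphone": True},
    {"age": 71, "gender": "male", "has_smartphone": False},
    {"age": 29, "gender": "nonbinary", "has_smartphone": True},
]

def audit(records, field, expected_categories=None):
    """Summarize a field's coverage and flag values outside an assumed schema."""
    values = [r.get(field) for r in records]
    missing = sum(v is None for v in values)
    counts = Counter(v for v in values if v is not None)
    report = {"field": field, "missing": missing, "distribution": dict(counts)}
    if expected_categories is not None:
        # Values outside the schema hint the classification is too narrow.
        report["outside_schema"] = sorted(set(counts) - set(expected_categories))
    return report

# A binary-only gender schema flags "nonbinary" as outside the schema,
# surfacing exactly the restrictive classification the guideline warns about.
print(audit(records, "gender", expected_categories={"female", "male"}))
```

Checks like these do not answer the harder questions of representativeness or consent, but they make it harder for a narrow classification or a gap in coverage to pass unnoticed.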
Have a plan for when and how your project will end. A crucial—yet commonly overlooked—feature of any project is a plan for when and how it will end. If you decide to proceed with your project, ask how long it should last. When you have an answer, you can then plan for a "graceful dismantling."9 If you are not able to devise such a plan, you should reconsider your decision to proceed. For example, systems built to support contact tracing during the pandemic can be repurposed for other goals or kept in place even after the pandemic is over. Your plan should therefore include controls—technological, legal, or otherwise—that enable you to limit the functionality of your project to its intended purpose for the desired duration. Your plan should also address what will be done with the data when your project is over. Equally, people may come to depend on your project: Will ending it harm the intended beneficiaries? If so, how will you guard against this?
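One concrete control for a "graceful dismantling" is a retention limit enforced by the system itself rather than by policy alone. The sketch below is a minimal illustration under stated assumptions: the 21-day window and the `purge_expired` helper are invented for the example, and a real project would choose its window from its end-of-project plan.

```python
from datetime import datetime, timedelta, timezone

# Assumed retention window; a real project would set this from its plan.
RETENTION = timedelta(days=21)

def purge_expired(records, now=None):
    """Drop any record older than the retention window, regardless of who asks."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["collected_at"] <= RETENTION]

now = datetime.now(timezone.utc)
records = [
    {"id": 1, "collected_at": now - timedelta(days=5)},   # within retention: kept
    {"id": 2, "collected_at": now - timedelta(days=40)},  # past retention: purged
]
print([r["id"] for r in purge_expired(records, now)])  # → [1]
```

Building the limit into the data path, so that expired records are unavailable even to the project's own operators, is one way to keep functionality bounded to its intended purpose and duration.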
Following these guidelines can make it more likely that projects achieve their goals, while minimizing harm—helping their intended beneficiaries and other communities, without putting them at risk.
1. Brill, J. and Lee, P. Preserving privacy while addressing COVID-19. Microsoft On The Issues. (2020); https://bit.ly/33QKspB
2. Crawford, K. and Finn, M. The limits of crisis data: Analytical and ethical challenges of using social and mobile data to understand disasters. GeoJournal 80, 4 (2015), 491–502; https://bit.ly/3tWv3i0
3. Langford, J. Critical issues in digital contact tracing. Machine Learning Theory. (2020); https://bit.ly/3eS5nPo
4. National Research Council U.S. Panel on Monitoring the Social Impact of the AIDS Epidemic. The practice of public health. In A.R. Jonsen and J. Stryker, Eds. The Social Impact Of AIDS In The United States, Washington, D.C., 1993; https://bit.ly/3wirxjV
6. Owen, W.F. Jr, Carmona, R., and Pomeroy, C. Failing another national stress test on health disparities. JAMA 323, 19 (2020), 1905–1906; https://bit.ly/3eQWoOt
7. Soltani, A., Calo, R., and Bergstrom, C. Contact-tracing apps are not a solution to the COVID-19 crisis. Brookings TechStream. (2020); https://brook.gs/3bwhmjO
8. The National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research. Department of Health, Education, and Welfare (Apr. 18, 1979); https://bit.ly/3opJKsR
Mary L. Gray (firstname.lastname@example.org) is Senior Principal Researcher, Microsoft Research, New England/Faculty Associate, the Berkman Klein Center for Internet and Society, Harvard University/Associate Professor, Luddy School of Informatics, Computing and Engineering, Indiana University, Bloomington, IN, USA.
Brent Hecht (Brent.Hecht@microsoft.com) is Director of Applied Science, Experiences and Devices, Microsoft/Associate Professor, Human-Centered AI and Spatial Computing, Northwestern University, Evanston, IL, USA.