Viewpoint

Ethics as a Participatory and Iterative Process

Facilitating ethical reflection, inquiry, and deliberation.

You probably find yourself increasingly often in discussions of ethics in relation to the design and application of technology. The trolley problem is a familiar trope in such discussions: you assess and compare outcomes, and choose less-bad outcomes over worse ones. Another familiar trope is the ethics checklist: you make a list of relevant rules and norms, and take measures to comply. In this Viewpoint, I propose there is much more to ethics than the assessment of outcomes and compliance with rules.


Doing Ethics

Over the years, and while working on many projects in the design and application of technologies, I have developed a view on ethics I would like to share. I understand ethics as a process, as doing ethics: a participatory and iterative process of ethical reflection, inquiry, and deliberation.9 The task for the people involved is then to make room for such a process, and to facilitate it. Practically, you can imagine three key ingredients in this process:

  • Identify issues at play in your project, for example, issues that can become problematic, and reflect on these. A handful of issues works best. If you have more, cluster them. If you have fewer, explore further.
  • Organize dialogues with the people involved, and with relevant stakeholders, both within and outside your organization, to inquire into these issues from diverse perspectives and to hear diverse voices.
  • Make decisions, test them in small-scale experiments, and be transparent and accountable about the results. The key is to steer your project consciously and to act carefully. (A sketch of how a team might track this process follows this list.)
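To make these three ingredients tangible in day-to-day project work, here is a minimal sketch, in Python, of a lightweight 'ethics log' a project team might keep across iterations; the structure, field names, and example entries are my illustrative assumptions, not a prescribed format.

    from dataclasses import dataclass, field

    @dataclass
    class EthicalIssue:
        """One issue the team has identified and tracks across iterations."""
        name: str
        reflection: str                 # why the team finds this issue problematic
        voices_heard: list[str] = field(default_factory=list)  # stakeholders consulted
        decision: str = ""              # the decision taken, if any
        experiment: str = ""            # small-scale test of that decision
        outcome: str = ""               # what the test showed; kept for accountability

    # A handful of issues works best; cluster if you have more, explore if fewer.
    issue = EthicalIssue(
        name="data collection scope",
        reflection="the app may collect more location data than users expect",
    )
    issue.voices_heard += ["project team", "legal counsel", "a panel of users"]
    issue.decision = "collect coarse location only, with an explicit opt-in"
    issue.experiment = "test the opt-in flow with a small group of users"
    issue.outcome = "most opted in; several asked why location is needed at all"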

The trick is to combine action and reflection. Action without reflection is clueless. Reflection without action is useless. Furthermore, it is worthwhile to go back and forth between zooming out and zooming in. If project team members tend to discuss all sorts of details, for example, about the user interface, you can invite them to zoom out and question implicit assumptions or the underlying business model. Conversely, if they tend to discuss abstract concepts, for example, a value such as fairness, you can invite them to zoom in and discuss how a specific user interface option might practically promote this value.


Different Ethical Perspectives

Moreover, we can mobilize different ethical perspectives in order to facilitate such a process of reflection, inquiry, and deliberation. Typically, four perspectives are used:12 consequentialism, deontology, relational ethics, and virtue ethics.

Consequentialism looks at the potential positive and negative consequences of a particular technology or solution (and it can involve thought experiments such as the trolley problem). Deontology, or duty ethics, looks at people's duties and rights, and can help to promote human autonomy and dignity in the design and application of technologies (and it can involve ethics checklists). Relational ethics understands people as fundamentally relational and interdependent;2 it can help to draw attention to how technologies shape how people interact and collaborate. Virtue ethics looks at people's abilities to cultivate virtues; it views technologies as tools people can use to flourish and to live well together.11

Each perspective has its particular benefits and limitations. It is therefore wise to combine them according to what the project requires. Some people like the image of a moral compass. In that case, you can imagine a compass with these four perspectives as its four directions. These four perspectives can help you to orient yourself in the moral situation you find yourself in, and to find a direction in which you want or need to travel. Here, I provide one example for each perspective.

Consequentialism. Imagine you work on software for autonomous vehicles. You can assess a range of pluses and minuses with some certainty and precision. It can be challenging, however, to set the boundaries of your analysis. Which people and which outcomes do you include, and which do you exclude? Do you include the pros and cons for pedestrians? And how do you weigh these, relative to the pros and cons for car owners? Moreover, there may be longer-term effects. Such questions are familiar to economists, who work with externalities: effects they choose to ignore. Sadly, externalities typically relate to costs. That is why many supply chains start in dirty mines in conflict zones or in sweatshops where people labor in unhealthy circumstances, and why many products end up in offshore dumps, out of sight. Had these externalities been included, then production, consumption, and disposal would look very different. We could have a more sustainable economy. What we need is regulation that promotes including these externalities.
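As a toy illustration of how much these boundary choices matter, consider the following sketch in Python; all categories and figures are invented for the example, not real estimates.

    # Toy consequentialist assessment: the net outcome depends on where
    # you draw the boundary of the analysis. All figures are invented.
    effects = {
        "benefit to car owners":        +100,
        "benefit to manufacturer":       +40,
        "residual risk to pedestrians":  -30,
        "mining and disposal damage":    -80,  # often treated as an externality
        "longer-term congestion":        -50,  # often outside the analysis window
    }
    externalities = {"mining and disposal damage", "longer-term congestion"}

    narrow = sum(v for k, v in effects.items() if k not in externalities)
    broad = sum(effects.values())

    print(f"externalities excluded: {narrow:+d}")  # +110: the project looks good
    print(f"externalities included: {broad:+d}")   # -20: the picture reverses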

Deontology. Imagine working with a municipality on cameras in public spaces, with software to recognize vehicles' license plates and people's faces. What duties and rights are at play? The municipality has a duty to promote citizens' safety, and also a duty to respect their right to privacy. Often, in such cases, safety and privacy are framed as opposites. However, they do not have to be. One can build a system that promotes safety and respects privacy, similar to how one can design a tent that is both lightweight and spacious—if it has a lightweight fabric and flexible poles to make an igloo-shaped volume. Moreover, in working with a government agency, one needs to think about: legitimacy, whether the system's goals are legitimate; effectiveness, whether this system can actually help to realize these goals; and proportionality, whether the harms, for example, the infringement on privacy, are proportional to the benefits the system offers. In addition, there are also questions about system boundaries: How far do our duties go? Which rights are relevant?
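These three tests can be made explicit, for example, as gates in a review process. The following sketch shows one hypothetical way to encode them in Python; the questions and the pass/fail structure are my assumptions, not a legal standard.

    # Hypothetical review gate: a system is deployed only if all three
    # deontological tests from the text are answered affirmatively.
    def review(answers: dict[str, bool]) -> bool:
        tests = {
            "legitimacy":      "are the system's goals legitimate?",
            "effectiveness":   "can the system actually help realize these goals?",
            "proportionality": "are the harms proportional to the benefits?",
        }
        for test, question in tests.items():
            if not answers.get(test, False):
                print(f"blocked on {test}: {question}")
                return False
        return True

    # Example: legitimate and effective, but the privacy infringement is
    # judged disproportional, so the system should not be deployed as is.
    review({"legitimacy": True, "effectiveness": True, "proportionality": False})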


Action without reflection is clueless. Reflection without action is useless.


Relational ethics. Let us look at what happens when people put 'smart' devices in their homes. How do these devices affect relationships and interactions between people? Do such devices help to improve these? Or do they corrode them, for example, because people look at their devices' screens instead of engaging in face-to-face communication? Typically, these devices are designed to grab and hold our attention, and using them can corrode our abilities to interact with others. Of course, this depends on how people use these devices. Here is a role for designers and developers. You can, for example, create a feature that invites people to be more aware of the time they spend with their devices. They can set a timer. And when the timer goes off, it reminds them of their intention to use their time wisely. Critically, this would require a different business model—not one based on grabbing people's attention and selling ads. Alternatively, the task of designers and developers could be to create devices and software that enable people to improve their abilities to interact with others.
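Such a timer is simple to sketch. The following Python fragment is a minimal illustration of the idea; the function name and the console reminder are invented, and a real device would surface a gentle notification instead.

    import threading

    def set_intention_timer(intention: str, minutes: float) -> threading.Timer:
        """Remind users of their own stated intention when their time is up."""
        def remind() -> None:
            # A real app would show a calm notification, not another alert
            # designed to pull the user back in.
            print(f"Time is up. You wanted to: {intention}. Still on track?")

        timer = threading.Timer(minutes * 60, remind)
        timer.start()
        return timer

    # The user, not the app, sets the goal and the time budget:
    set_intention_timer("check messages from family, then put the phone down", 10)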

Virtue ethics. Imagine you work on a social media app. A virtue ethics perspective can help to focus on the ways in which using this app can either foster or stunt people's abilities to cultivate specific virtues. For a social media app, this would involve virtues like self-control, empathy, and civility.11 Can people use the app to exercise self-control and pursue their own goals—instead of the other way around, where a company uses the app for its goals? Can people use the app to empathize with others, with what other people think and feel—or does it put them in a filter bubble or echo chamber, so that it fuels rage, mob behavior, and polarization? And can people use the app to engage in meaningful conversations with others about issues that matter to them, and work collectively toward some common good? This is what is meant by civility. A designer or developer with a focus on enabling people to cultivate and exercise such virtues will help create a very different app than a standard social media app.
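To see how such a focus can change concrete design decisions, compare two hypothetical ranking policies for the same feed, sketched below in Python; the field names and scoring are invented for illustration.

    # Which of these two policies ships is a design decision that either
    # supports or undermines users' self-control. All field names are invented.
    def rank_for_engagement(posts: list[dict]) -> list[dict]:
        # Optimizes time-on-app: outrage and novelty float to the top.
        return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

    def rank_for_user_goals(posts: list[dict], interests: set[str]) -> list[dict]:
        # Optimizes for what users said they came here for.
        return sorted(posts, key=lambda p: len(interests & set(p["topics"])),
                      reverse=True)

    posts = [
        {"topics": ["celebrity feud"], "predicted_engagement": 0.9},
        {"topics": ["gardening", "local news"], "predicted_engagement": 0.2},
    ]
    print(rank_for_engagement(posts)[0]["topics"])                 # ['celebrity feud']
    print(rank_for_user_goals(posts, {"gardening"})[0]["topics"])  # ['gardening', ...]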


Methods

If these examples seem sensible, you may wonder how to integrate such ethical reflection, inquiry, and deliberation in your projects. Fortunately, we can borrow methods from Human-Centered Design (HCD), Value Sensitive Design (VSD), and Responsible Innovation (RI).

A key element of HCD is the organizing of a participatory and iterative process that puts people's needs center stage.3 Here, I focus on the organizing of iterations. We can use these iterations to move back and forth between problem setting and solution finding, and to have dialogues about ethical issues.5 Problem setting would then also include questioning the problem, for example, discussing which concerns or consequences to include, and which to exclude. Solution finding would then also include exploring ways to take into account various, possibly conflicting, duties and rights. Crucially, project managers must make room for such iterations and facilitate such ethical reflection, inquiry, and deliberation.


We can mobilize different ethical perspectives in order to facilitate a process of reflection, inquiry, and deliberation.


From VSD we can borrow its focus on bringing different stakeholders around the table to learn about their values.1 Sometimes, however, talking about values can remain relatively abstract. In such cases, you can express values in terms of human capabilities. You can discuss how the innovation you work on can (or cannot) enable people to develop or extend relevant capabilities,7 such as the capability to live in good health or to work in a meaningful job.4 You can then design or modify the system or product so it can better enable people to develop relevant human capabilities, and thereby help people to flourish and live well together.

Furthermore, we can learn from RI. Here, I focus on inclusion, one of its four key dimensions,10 which encompasses participation and diversity. We face huge, complex, and global challenges, including the climate crisis, polarization, and inequalities. These are wicked problems, which require diverse disciplines, both to better understand the problem and to envision and create solutions. We need people with diverse backgrounds and types of expertise, and also the participation of citizens and societal organizations.

Concerns for inclusion, participation, and diversity will also involve questions about power and fairness, notably in deciding whom to include, and whom to exclude, and challenges to engage in curiosity, creativity, and collaboration6—and avoid 'performative' approaches to inclusion.

Moreover, people with technology backgrounds are invited to develop a greater appreciation for the limits of their knowledge, and more openness to other people's expertise and views, notably of citizens. We can borrow methods from HCD, VSD, and RI, to bring people with different perspectives, concerns, and values together, and facilitate transdisciplinary collaboration between them.

In closing, let me remark that doing ethics is not always easy or pleasant; it can involve asking uneasy questions, creating awkward situations, and tolerating tension and uncertainty.8

    1. Friedman, B. and Hendry, D.G. Value Sensitive Design: Shaping Technology with Moral Imagination. MIT Press, Cambridge, MA, 2019.

    2. Held, V. The Ethics of Care: Personal, Political, and Global. Oxford University Press, NY, 2006.

    3. ISO 9241-210. Ergonomics of human-system interaction—Part 210: Human-centred design for interactive systems. ISO, Geneva, 2010.

    4. Nussbaum, M.C. Creating Capabilities: The Human Development Approach. Harvard University Press, Cambridge, MA, 2011.

    5. Steen, M. Co-design as a process of joint inquiry and imagination. Design Issues 29, 2 (2013), 16–28.

    6. Steen, M. Virtues in participatory design: Cooperation, curiosity, creativity, empowerment and reflexivity. Science and Engineering Ethics 19, 3 (2013), 945–962.

    7. Steen, M. Organizing design-for-wellbeing projects: Using the capability approach. Design Issues 32, 4 (2016), 4–15.

    8. Steen, M. Slow Innovation: The need for reflexivity in Responsible Innovation (RI). Journal of Responsible Innovation 8, 2 (2021), 254–260; doi:10.1080/23299460.2021.1904346.

    9. Steen, M. Ethics for People Who Work in Tech. CRC Press, Boca Raton, FL, 2022; https://bit.ly/3JmzDjA

    10. Stilgoe, J., Owen, R., and Macnaghten, P. Developing a framework for responsible innovation. Research Policy 42, 9 (2013), 1568–1580.

    11. Vallor, S. Technology and the Virtues: A Philosophical Guide to a Future Worth Wanting. Oxford University Press, New York, NY, 2016.

    12. Van de Poel, I. and Royakkers, L. Ethics, Technology, and Engineering: An Introduction. John Wiley and Sons, Chichester, U.K., 2011.
