Replacing Usability Testing with User Dialogue

How a Danish manufacturing company enhanced its product design process by supporting user participation.
  1. Introduction
  2. Usability Testing at Danfoss
  3. Five Steps on Our Path to User Dialogue
  4. References
  5. Authors
  6. Sidebar: Case Study: Usability Testing by the Book
  7. Sidebar: Can Everyone Become a Test Facilitator?
  8. Sidebar: Case Study: Observer or Participant?
  9. Sidebar: To: Jacob Buur and Kirsten Bagger (Danfoss); From: Mary Czerwinski and Michael Muller (Microsoft)
  10. Sidebar: Case Study: Preparing Users for a Design Dialogue

Usability testing has now become a well-known method in product development for examining new products before they are put in the hands of users. In practice, usability testing is often seen as a way to obtain objective data on the use of products for R&D departments. Because of this conception of objectivity (which stems from the tradition of psychological experiments), it has been widely believed that usability testing must be carried out by "neutral" usability specialists, that the user must think aloud in undisturbed solitude in the lab, and that R&D experts should be kept at a distance so as not to intimidate the user and color the results [5].

We have come to see these conceptions of usability testing as limited in scope, however useful they have been for providing data on usability problems and for generating usability awareness in the corporate setting. We have learned that traditional usability testing restricts understanding of problems and impedes productive dialogue between designers and users on use, context, and technology [1].

In this article we outline four examples to show how we have turned the conventional usability-testing format into a dialogue between users and designers.

Usability Testing at Danfoss

Danfoss is Denmark’s largest manufacturing company, with 18,000 employees worldwide. We produce components such as valves, sensors, and controllers for refrigeration and heating systems, hydraulic machinery, and the like. Our products are mechanical components and electronic controllers with "solid user interfaces" [2]; the user interfaces typically consist of a small liquid crystal display and a set of dedicated push buttons. Danfoss products are operated by professionals such as electricians, refrigeration mechanics, and plumbers. In 1990, man-machine interaction (MMI) was made one of Danfoss’s six core competencies, and in 1991 activities were established to raise the MMI level in line with that of comparable companies [9]. A year later, we established our first usability lab, which has since expanded to three lab facilities. We have successfully adopted methods from the HCI community, among them usability testing, to improve the usability of our products, even though our products are not computer applications [4].

Five Steps on Our Path to User Dialogue

Based on four cases, we will explain how our understanding of usability testing has emerged. To simplify matters, we will discuss five steps, even though the changes did not occur in this exact sequence.

  • Moving the test facilitator into the lab. Traditional usability testing creates an artificial situation, with tension and nervousness on the part of the user. We realized we could ease the atmosphere somewhat by making the test facilitator an active, attentive dialogue partner for the user. This meant moving facilitators into the lab alongside the users, rather than hiding them behind a one-way mirror and communicating only by intercom [8].
  • Developing video documentation procedures. In usability testing literature, video is suggested as a tool for observing user activity and for communicating findings to the R&D organization ("highlight tapes") [5]. This was how we employed video in early usability testing. However, we also learned the value of video for convincing management that the company was, in fact, not very good at designing easy-to-use products and that user-centered methods were paramount for success. Presenting the material to uninvolved staff introduced new requirements for video quality: we had to make sure that users were shown close-up in order to allow viewers to identify with them; audio and video quality had to be clear.
  • Training R&D staff to act as co-organizers. There is a widespread myth in usability testing that designers cannot be allowed anywhere near the users for fear of intimidating them with their preoccupation with a "pet" solution. On the contrary, we found it far more important to convey test findings effectively to the design team than to preserve the notion of objectivity in the test. Our solution to this dilemma has been to train R&D staff to act as test facilitators and observers. It is difficult for designers to learn to "shut up and listen," but doing so is crucial if a company wants to move toward customer orientation.


We do this by running a training session a few days in advance, often with users from inside the company. At the end of the session, we ask the users to give their opinion about the process—when they became frustrated and why. Afterward we use the video recording to recall communication problems between facilitators and users, and we discuss how to improve.

  • Turning test sessions into workshops. In sessions with a single user, the user is not likely to volunteer much information beyond answers to the facilitator’s questions. This changes when users are invited to work in pairs. The traditional think-aloud format changes into "co-discovery learning," and the facilitator can concentrate on listening and observing.

When we expand further to, say, eight users at a time, users are sufficiently confident to allow us to bring not just one but several members of the design team into close contact with them. We invite the users to full-day workshops in which they don’t simply test products, but participate in design discussions with the designers [7]. Our role changes from that of test facilitator to one of organizing a meaningful discussion between users and designers. Rather than evaluating results for the designers, we later use the video recordings together with them to recall significant snippets of user dialogue as a basis for discussing improvements to the design.

  • Involving users in design. Another popular myth we have done away with is that "users cannot design for you." In line with Ehn [6] and others, we have found that users are a potential source of ideas, and that they readily come forth with opinions on design, provided that we can give them the media through which they can express and explore their ideas.

We increasingly allow users to formulate their own use scenarios, rather than dictating specific test scenarios invented from our limited knowledge of the users’ world. To engage users in this process, we ask them to recall specific work situations they have experienced in their daily work lives—the type of equipment they worked with, what they intended, what they did, and so forth. Based on these stories, we build test scenarios in collaboration with the users, and try to go through the same actions with the new prototype.

Naturally, this makes heavy demands on the flexibility of the prototypes we employ. We therefore favor the use of low-fidelity paper prototypes, rather than computer simulations of user interfaces, to "buy information from the users" in the early design phases [3].

Today, we often join project teams in the product divisions of Danfoss as user interface designers, employing user participation throughout the process. So, in fact, the facilitator who steps between users and designers has by and large disappeared. Organizing user workshops has become a well-accepted activity, like organizing customer visits or planning design team seminars.

Based upon our experiences at Danfoss, we believe that three important problems can be overcome by turning usability testing into a dialogue with users. First, dialogue facilitates the disclosure of user priorities and practices that may otherwise remain concealed. Second, the problem of anchoring insights gained in the test setting in the R&D departments is easier to overcome if designers themselves engage in dialogue with users. And third, engaging the users in dialogue sessions enables us to move beyond product critique to a more innovative engagement in new design possibilities.

Our next step will be to move the design dialogue into the users’ world—the plant or the shop. By doing this, we hope to engage more of the users’ tacit knowledge, and to make the users more confident with their new role in the design process.

References

    1. Binder, T. Designing for workplace learning. AI & Society 9, Springer, 1995, 218–243.

    2. Black, A. and Buur, J. GUIs and SUIs: More of the same or something different? Info. Des. J. 8, 9, Elsevier Science, 1996.

    3. Buur, J. and Andreasen, M.M. Design models in mechatronic product development. Design Studies 10, 3, 1989.

    4. Buur, J. and Nielsen, P. Design for Usability: Adopting HCI methods for the design of mechanical products. Int. Conf. on Engineering Design (Prague), Heurista, 1995.

    5. Dumas, J. and Redish, J. A Practical Guide to Usability Testing. Ablex Publishing Corporation, 1993.

    6. Ehn, P. Work-Oriented Design of Computer Artifacts. Arbetslivscentrum, 1988.

    7. Greenbaum, J. and Kyng, M., Eds. Design at Work: Cooperative Design of Computer Systems. Lawrence Erlbaum Associates, 1991.

    8. Rubin, J. Handbook of Usability Testing. John Wiley & Sons, 1994.

    9. Wiklund, M. Usability in Practice. AP Professional, 1994.
