From: Mary Czerwinski and Michael Muller (Microsoft)
Usability testing has now become a well-known method in product development for examining new products before they are put in the hands of users. In practice, usability testing is often seen as a way to obtain objective data on the use of products for R&D departments. Due to this conception of objectivity (which stems from the tradition of psychological experiments), it has been a widespread belief that usability testing must be carried out by "neutral" usability specialists, that the user must think aloud in undisturbed solitude in the lab, and that R&D experts should be kept at a distance so as not to intimidate the user and color the results.
We have come to see these conceptions of usability testing as limited in scope, however useful they have been for providing data on usability problems and for generating usability awareness in the corporate setting. We have learned that traditional usability testing restricts understanding of problems and impedes productive dialogue between designers and users on use, context, and technology.
In this article we outline four examples to show how we have turned the conventional usability-testing format into a dialogue between users and designers.
Danfoss is Denmark's largest manufacturing company, with 18,000 employees worldwide. We produce components such as valves, sensors, and controllers for refrigeration and heating systems, hydraulic machinery, and the like. Our products are mechanical components and electronic controllers with "solid user interfaces". User interfaces are typically composed of a small liquid crystal display and a set of designated push buttons. Danfoss products are operated by such professionals as electricians, refrigeration mechanics, and plumbers. In 1990, man-machine interaction was made one of Danfoss's six core competencies, and activities were established in 1991 to improve the man-machine interaction (MMI) level in line with those of comparable companies. A year later, we established our first usability lab, which has since expanded to three lab facilities. We have successfully adopted methods from the HCI community, among these usability testing, to improve the usability of our products, even though our products are not computer applications.
Based on four cases, we will explain how our understanding of usability testing has emerged. To simplify matters, we will discuss five steps, even though the changes did not occur in this exact sequence.
We learned the value of video for convincing management that the company was, in fact, not very good at designing easy-to-use products.
We do this by running a training session a few days in advance, often with users from inside the company. At the end of the session, we ask the users to give their opinion about the process: when they became frustrated and why. Afterward we use the video recording to recall communication problems between facilitators and users, and we discuss how to improve.
When expanding further to, say, eight users at a time, users are sufficiently confident to allow us to bring not just one but several members of the design team into close contact with them. We invite the users to full-day workshops in which they don't simply test products, but participate in design discussions with the designers. Our role changes from that of test facilitator to that of organizer of a meaningful discussion between users and designers. Rather than evaluate results for the designers, we use the video recordings together with them later to recall significant snippets of user dialogue as a basis for discussing improvements to the design.
We increasingly allow users to formulate their own use scenarios, rather than dictating specific test scenarios invented from our limited knowledge of the users' world. To engage users in this process, we ask them to recall specific work situations they have experienced in their daily work lives: the type of equipment they worked with, what they intended, what they did, and so forth. Based on these stories, we build test scenarios in collaboration with the users, and try to go through the same actions with the new prototype.
Naturally, this makes heavy demands on the flexibility of the prototypes we employ. We therefore favor the use of low-fidelity paper prototypes, rather than computer simulations of user interfaces, to "buy information from the users" in the early design phases.
Today, we often join project teams in the product divisions of Danfoss as user interface designers, employing user participation throughout the process. So, in fact, the facilitator who steps between users and designers has by and large disappeared. Organizing user workshops has become a well-accepted activity, like organizing customer visits or planning design team seminars.
Based upon our experiences at Danfoss, we believe that three important problems can be overcome by turning usability testing into a dialogue with users. First, dialogue facilitates the disclosure of user priorities and practices that may otherwise remain concealed. Second, the problem of anchoring insights gained in the test setting in the R&D departments is easier to overcome if designers themselves engage in dialogue with users. And third, engaging the users in dialogue sessions enables us to move beyond product critique to a more innovative engagement in new design possibilities.
Our next step will be to move the design dialogue into the users' world: the plant or the shop. By doing this, we hope to engage more of the users' tacit knowledge, and to make the users more confident with their new role in the design process.
©1999 ACM 0002-0782/99/0500 $5.00
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.