From: William R. Dolan, Jr. and Joseph S. Dumas (AIR)
When we started the Kommunedata (KMD) usability group, we used traditional laboratory testing almost exclusively as our working method. This was partly because we wanted to deviate from certain company traditions of user involvement, partly because testing seemed to be the epitome of usability work [4, 6]. For some time we have realized that laboratory tests alone are not sufficient for promoting qualitative changes to a product. No matter how well a test is performed, details of the product being tested always dominate the discussions: the daily life of the user becomes secondary, even though this is not the intention. Furthermore, tests tend to be carried out relatively late in the development process, and late testing reduces the possibility of accomplishing major product changes.
For this reason we have introduced new working methods that retain the advantages of traditional testing and at the same time extend users' participation. Instead of concentrating on users testing the product, we widen the perspective by allowing more time to discuss users' work situations and by letting users debate how the product will fit into their workplaces.
KMD is Denmark's largest systems operator and software house. The company delivers a broad range of services such as computer systems operation, software development, consultancy services, and facility management. Customers are primarily local and regional authorities. The company has 2,200 employees, 550 of whom are developers. KMD has sites and centers throughout Denmark, but usability work is concentrated in one location.
The usability lab was established in 1994 and quickly grew into a group of nine people: a manager, a part-time secretary, and seven usability consultants, including myself. The consultants have varying educational backgrounds, but in order to maintain flexibility we do not specialize in specific areas and are therefore able to work with any of the company's products. We do not participate directly in actual development projects, but are engaged for limited periods as consultants.
We performed 30 tests during our first year of existence, a level of activity that has since become the yearly average, and this has definitely helped us to gain a platform within the company. In order to maintain a high rate of testing, we developed a very formalized way of conducting the tests so that we could concentrate on understanding the users, their work, and the programs. We had a fixed plan for things to be done in preparation for tests; we always invited six test users, and test reports were written in three days in a standard format.
But our efficiency had its drawbacks. We were not good at adapting our work to the specific size and stage of development of the individual projects; all had to go through more or less the same procedure. Field testing was rarely carried out, since it took too much time to arrange and to set up the equipment at different locations. It was much easier to have the users come for traditional testing in our laboratory. Our biggest problem, however, was that the developers believed they had to present something neat and well-functioning if the users were to work with the prototype on their own while being videotaped. This meant that we did not get involved in development projects as early as we would have liked.
After the first rush to establish the department, we began to define new goals. We wanted to facilitate different types of cooperation with users so as to strengthen the focus on their working practices, and we wanted to offer more flexible assistance to the project teams. We believe the long-term result of these changes will be that usability consultants will be involved in development projects from the beginning. The first obvious result of these objectives is new working methods, which we collectively describe as "workshops."
The content of a workshop depends on the topic and the development stage of the software with which we are working. Results are best if workshops are held early in the development process: they are good at uncovering information about users' work situations, and they are always conducted in a creative atmosphere. The participants are users, developers, and usability staff, and they usually spend a whole day together going through various activities. The main results of a workshop are wall posters with information on users' work practices and exhaustive comments on early design ideas.
The workshops all share important features that exemplify the new direction KMD's usability work is taking. First of all, we make active use of observation results. The usability group has always emphasized fieldwork as a preparation for tests; the professional work of our users is so complicated that we cannot reasonably run a test without having visited their workplaces. The developers, however, have not participated in field studies, and the insights we gain from site visits are used only to prepare test tasks. Workshops attempt to overcome this problem by starting with a discussion of the users' work situations based on the observation results. This is a very effective way of signaling to all participants that the users' everyday experiences are the central issue: there is no software to divert attention, and developers and usability people may only ask questions concerning users' work practices.
With workshops, we also have the advantage of having a group of people working together, instead of individuals or pairs as in traditional tests. Though we have never met a test person who did not find it interesting to do a traditional usability test, the workshops elicit a different kind of commitment. Users do not just spend their time criticizing a piece of software but, inspired by each other's comments, discuss their work much more broadly.
Also, developers are becoming more involved in planning and carrying out the arrangements. They are responsible for one or more of the activities, and they are able to converse directly with users instead of having usability consultants act as intermediaries. This means that the developers "own" the results of the day to a much greater extent than if they passively watch a test.
Another very positive aspect is the shared discussion of the workshop results. It means that the context in which the system will be used has come to play a much more prominent role, because users refer to their work situations when debating preferable solutions. Furthermore, it gives the users a fair chance to evaluate the outcome of the day. In traditional usability tests the individual users cannot participate in the summing up in the same way, as it is impossible to come to any conclusions based on a single test.
Finally, the workshops strengthen our entry into the early phases of the development process. Development teams that have previously had a program usability-tested in the traditional way freely acknowledge that a workshop based on a paper prototype provides useful new information about the users' work situation, and leaves developers with much more time to change the system in accordance with user needs.
Traditional testing has been a good way to get the department accepted within the company, but as soon as the usefulness of our work was acknowledged, we decided to introduce new methods with more focus on users' work situations. The usability group at KMD does not envisage a future where traditional usability tests are abolished. But they may end up playing a secondary role, as an activity to be performed after one or more workshops, to clarify whether the final electronic design actually works with users in general.
©1999 ACM 0002-0782/99/0500 $5.00
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.