
Communications of the ACM

Human-computer etiquette: managing expectations with intentional agents

Etiquette and the Design of Educational Technology



Educators have always emphasized good manners and etiquette, both in their own behavior and in attempting to inculcate good manners in their students. However, the etiquette of human-computer interaction (HCI) has not been of much concern to designers of educational technology, who typically consider computers as cognitively powerful but socially neutral tools. The presumed anti-social nature of computers was often cited as a key advantage for pedagogical purposes. Moreover, it has been argued that learners would find computer personalities implausible, scripted, stiff, unnatural, and lacking the vitality and dynamism of human personalities.

Recently, however, researchers and developers working on educational tools have been paying more attention to issues of etiquette. There appear to be two reasons for this change. First, the Computers As Social Actors (CASA) research paradigm [11] provided empirical evidence that, given the mere perception of agency, people often respond naturally to computers as they would to real people and events. This has been summarized as Topffer's law, which states, "All interfaces, however badly developed, have personality" [8]. This undermines the idea that computers are merely neutral tools and emphasizes the importance of the social relationship that can and will develop between a computer and a learner.

The second reason for an increased interest in etiquette has been recent work on interactive software agents. These software agents (including Embodied Conversational Agents, or ECAs, described by Timothy Bickmore in this section) utilize advances in natural language processing, affective computing, and multimodal interfaces to develop believable, anthropomorphic entities (see the sidebar "Pedagogical Software Agents"). The motivation to make use of these agents, with their explicit inclusion of social modalities, in pedagogical applications has forced researchers to grapple with the question of what constitutes appropriate behavior on the part of a software agent.


Etiquette in Educational Technology

Educational technology is a broad field and incorporates many different kinds of interactions where etiquette can play a role: between students, between students and teachers, and between the student and the computer. Our focus here is the interaction between student and computer, that is, the etiquette of educational HCI.

There are some fundamental differences between general HCI etiquette and etiquette for educational HCI. Etiquette considerations in educational technology are complicated by the fact that learning from a computer is not just about ease of use. Learning can be frustrating and difficult, particularly when it exposes learners' errors in thinking and gaps in knowledge and forces them to grapple with difficult subject matter. In an educational context, ease of use may be subservient to the larger goals of teaching content or subject matter and monitoring the learner's prior and developing knowledge, while maintaining a focus on issues of motivation and affect.

For example, consider receiving unsolicited help from a computer system. Making help systems useful and available on demand by inferring user needs has been an important goal of HCI researchers. However, in an educational context, help is not necessarily perceived as a good thing. Research has shown that students make complex attributions based on whether or not they receive help. For instance, students receiving unsolicited help may be perceived as less capable than those who do not receive it [1]. A computer program offering help may be exhibiting generally appropriate HCI etiquette, but inappropriate educational etiquette.
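The attributional concern above can be sketched as a simple help-offering policy. This is a hypothetical illustration only, not a description of any system discussed here; the function name and threshold are our own.

```python
def should_offer_help(user_requested: bool, errors_in_a_row: int,
                      threshold: int = 3) -> bool:
    """Offer help immediately when the learner asks for it; volunteer
    unsolicited help only after repeated errors, to reduce the risk of
    signaling low ability (cf. Graham and Barker [1])."""
    if user_requested:
        return True
    # Unsolicited help is withheld until the learner is demonstrably stuck.
    return errors_in_a_row >= threshold
```

Even a policy this simple encodes an etiquette judgment: the system trades the HCI goal of proactive assistance against the educational cost of an unwanted low-ability cue.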

Any discussion of etiquette in educational technology must contend with the wide variety of purposes for learning. One can learn in order to acquire factual knowledge; develop understanding; improve performance and skill; acquire strategies and techniques; improve self-esteem; enjoy and have fun; persuade people to act in certain ways; and inspire learners and motivate them to action.

Learning also plays out in widely different contexts, which constrain or determine the roles played by the computer and the learner. To take an extreme example, we have very different expectations of a drill instructor than of an elementary school teacher. These contexts (and the concomitant roles associated with them) dictate the range of behaviors considered appropriate as well as communicate shared goals and expectations to both parties. Crucial pedagogical issues such as who gets to direct conversation topics, who gets to raise questions and when, are often implicitly embedded within these contexts of activity. Since etiquette is highly context-dependent, what may be appropriate educational etiquette in one situation may be inappropriate in another. Educational etiquette is dependent upon how these relationships are understood and instantiated in the design of the interface. The larger context within which education is situated can also make a difference in how etiquette is considered. Issues of culture, ethnicity, and gender can play a significant role in determining appropriate and inappropriate etiquette as well.

We presented a set of different goal/context/role patterns [7] within which educational technologies and humans may function (such as computers as tutors, as "tutees," as tools for productivity, as tools for exploration, and as tools for assessment). Clearly, different learning technologies can straddle two or more of these categories. For instance, a simulation of frog dissection could be used both as a tutorial and as an arena for exploration. What is important to note here is that each of these contexts implicitly assumes a particular set of values and beliefs about teaching and learning and the roles of the learner and the computer. For instance, a tutorial system, such as those used for Computer-Aided Instruction (CAI), is driven much more by the goals of the tutorial system than by the intentions of the student. An example of such a system is the Cardiac Tutor [12], which helped students learn an established medical procedure through directed practice. Interaction in the Cardiac Tutor is initiated and controlled by the tutor, which provides feedback as and when needed. In contrast are open-ended exploratory systems such as complex simulations or games. These systems are driven more by the learner's interests than by those of the software program. For instance, the game "Civilization" allows users to seek advice from expert "advisors," though it is up to the user whether or not to follow that advice. Developing smarter tools for learning requires a better understanding of these situated practices and implicit commitments, as well as of the requirements, preferences, and background knowledge of the learner.
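The contrast between tutor-driven and learner-driven systems can be made concrete as data. The sketch below is hypothetical (the class and field names are our own); it simply records, for each pattern, who initiates exchanges and whether the system's guidance is binding.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class InteractionPattern:
    role: str            # the computer's role, e.g., "tutor" or "tool for exploration"
    initiator: str       # who starts each exchange: "system" or "learner"
    advice_binding: bool # must the learner follow the system's guidance?

# A CAI tutorial like the Cardiac Tutor: the system directs the interaction.
CAI_TUTORIAL = InteractionPattern("tutor", "system", True)

# An exploratory game like Civilization: advisors respond, the learner decides.
EXPLORATORY_GAME = InteractionPattern("tool for exploration", "learner", False)
```

Making these commitments explicit is one way a designer could check that a system's etiquette (who may interrupt, who may raise questions) matches its intended goal/context/role pattern.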


Integrating Etiquette in Educational Tools: Where Do We Begin?

The multiple goal/context/role patterns within which educational technology functions make determining how to integrate principles of etiquette into such systems a challenge. One simple rule of thumb (indeed, one which Reeves and Nass's CASA paradigm [11] encourages us to follow) is to apply what has been found appropriate for human-human interaction (HHI) to the design of HCI. To understand how this could work, we look at three different studies that attempt to apply HHI etiquette rules to HCI.

Personalized messages from agents and computer systems. We know that in most contexts personalizing conversations by addressing people by name is good etiquette. Not doing so makes the conversation stilted and formal (which, ironically, is true of most computer messages). Moreno and Mayer [9] conducted a study that looked at whether changing a pedagogical software agent's language style (personalized dialogue versus neutral monologue) would affect student learning. They found that students who learned by communicating with a pedagogical agent through personalized dialogue were able to recall more information, and were better able to use what they had learned to solve problems, than students who learned via a neutral message. Clearly this is a case where a simple rule of HHI etiquette carried over to the HCI case as well.
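The two language styles compared by Moreno and Mayer [9] can be illustrated with a minimal sketch. The function and its wording are hypothetical examples of the style contrast, not the materials used in their study.

```python
def feedback_message(style: str, name: str, topic: str) -> str:
    """Return a message in personalized-dialogue or neutral-monologue style."""
    if style == "personalized":
        # Personalized dialogue: first and second person, learner's name.
        return f"Nice work, {name}! Let's look at {topic} together."
    # Neutral monologue: impersonal, third-person delivery.
    return f"The next section covers {topic}."
```

The etiquette rule being tested is simply whether the conversational register appropriate between people (direct address, use of the name) also benefits learners when the speaker is a machine.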

Affective feedback from computer systems. Research has shown that praise and blame feedback from teachers can have complicated and paradoxical effects. Praise (and criticism) can be interpreted in many different ways, and these interpretations (which depend on the perceived difficulty of the task, the student's innate sense of ability, and their success or failure at completing the task relative to other students) can influence how the recipient responds to the feedback. Henderlong and Lepper [2] examined studies showing, for instance, that being praised for success in a task perceived as easy may have a negative effect on a student's self-confidence, while being blamed for failing a task perceived as difficult may actually have a positive effect. The design of feedback in educational technology systems is often based on a simplistic (and erroneous) framework in which praise is assumed to affect behavior positively irrespective of context. We conducted an experimental study [6] in which participants received differential affective feedback (praise or blame) after success at an easy task or failure at a difficult task. We measured the effect of this feedback on the participants' motivation and self-perception of ability.

This study, framed within the CASA paradigm, replicated an HHI study [5] except that feedback was provided by computers (albeit via a simple textual interface) and not by humans. The results demonstrated that students preferred praise from the computer and found it more motivating, irrespective of the difficulty of the task and their success at it. The fact that students accepted praise from the computer indicates they did at some level respond psychologically to it. However, the fact that their responses did not fully match the HHI experimental results indicates there are limits to what they considered appropriate or acceptable feedback from the computer (at least as it was presented in this study). We argue this may be because the participants did not engage in the same level of "deep psychological processing" about intentionality as they do with human respondents. Of course, one of the implications of ECA work is that a richer, more fully embodied agent might have altered these responses.
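A feedback designer who takes the attributional findings of Henderlong and Lepper [2] seriously might condition affective feedback on both outcome and perceived difficulty, rather than praising every success. The sketch below is a hypothetical illustration of that design logic; the function, labels, and message wording are our own.

```python
def affective_feedback(succeeded: bool, task_difficulty: str) -> str:
    """Choose feedback sensitive to outcome and perceived task difficulty."""
    if succeeded and task_difficulty == "hard":
        # Praise for success on a hard task supports perceived ability.
        return "Excellent, that was a difficult problem."
    if succeeded and task_difficulty == "easy":
        # Effusive praise for an easy task can imply low ability, so stay neutral.
        return "Correct."
    if not succeeded and task_difficulty == "hard":
        # Normalizing failure on a hard task protects self-confidence.
        return "That one is tough; many learners miss it at first."
    return "Not quite; try reviewing the last step."
```

This is the opposite of the "praise is always good" framework criticized above: the same outcome (success) yields different feedback depending on what the praise would implicitly communicate about ability.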

Humor and HCI. Humor plays a very important role in HHI, including teaching, as a way in which etiquette problems are resolved without conflict. Morkes, Kernal, and Nass [10] conducted a pair of experiments that looked at the effects of humor in a computer-mediated communication (CMC) task. In one case the participants were told they were interacting with another human being; in the other, that they were interacting with a computer. In both cases the participants actually received preprogrammed comments. Set up as a direct test of the CASA hypothesis, the experiment found that though the results between the two groups were generally consistent, the participants in the HCI condition were less sociable, demonstrated less smiling and laughing behavior, felt less similar to their interaction partner, and spent less time on the task.

The results of the last two studies indicate there is validity to the CASA paradigm. For instance, participants did respond to affective feedback from the computer and did smile at the humor it exhibited. However, the studies also indicate the psychological aspects of HCI are complex and difficult to explain using simplistic frameworks such as "computers are neutral tools" or "interacting with computers is just the same as interacting with humans."

Clearly computers are not humans and the current state of technology does not allow us to be consistently fooled into thinking they are. But even if we could fool people into believing computers were sentient agents, it could be ethically problematic to do so. Indeed, there may be pragmatic reasons why computers should not become too social. For instance, certain characteristics of computer systems (such as consistency, adaptability, inability to take offense or be bored) can be pedagogically valuable. An emphasis on etiquette and enhancing sociability in our systems should not blind us to these advantages. Thus, by adhering to the CASA philosophy we run the risk of not only having the computer appear artificial and/or stupid, but of actually undermining the positive attributes that computers currently possess.


Conclusion

There has been a slow but growing realization on the part of the educational technology research community that designers of educational tools must go beyond the purely cognitive aspects of working with computers and factor in the social and psychological aspects as well. The design of appropriate etiquette in educational systems requires adding an additional layer to the issues of traditional interest to HCI researchers and developers. Etiquette is closely connected to contexts of activity and practice, to the goal/context/role patterns that structure interactions in a domain. A better understanding of these patterns [7] is essential to building tact and courtesy into our computer programs.

We also need to learn from existing research, particularly that on teacher behavior and its effect on student learning and motivation. For instance, a review of the literature on nonverbal behavior indicates that eye contact, gestures, vocal inflections, body movement, and combinations of nonverbal behaviors conveying enthusiasm, animation, and variation of stimuli can positively affect student motivation, attention, teacher ratings, immediate recall, and achievement [4]. This research can be of great utility in the design of appropriate behaviors for pedagogical agents. However, as the three studies described here suggest, we must be careful not to apply these findings to the HCI context indiscriminately. That strategy, though easy to follow, may not always be the most appropriate. Instead, pedagogical etiquette in educational software must be carefully crafted based on sound empirical research sensitive to the complexities of learning and human psychology.


References

1. Graham, S., and Barker, G. The downside of help: An attributional-developmental analysis of helping behavior as a low ability cue. J. Educational Psychology 82 (1990), 187–194.

2. Henderlong, J., and Lepper, M.R. The effects of praise on children's intrinsic motivation: A review and synthesis. Psychological Bulletin 128 (2002), 774–795.

3. Johnson, W.L., Rickel, J.W., and Lester, J.C. Animated pedagogical agents: Face-to-face interaction in interactive learning environments. International J. AI in Education 11 (2000), 47–78.

4. Klingzing, H.G., and Tisher, T.P. Expressive nonverbal behaviors: A review of research on training with consequent recommendations for teacher education. Advances in Teacher Education, Vol. 2. J.D. Raths and L.G. Katz, Eds. Ablex Publishing, 1986, 89–133.

5. Meyer, W.U., Mittag, W., and Engler, U. Some effects of praise and blame on perceived ability and affect. Social Cognition 4, 3 (1986), 293–308.

6. Mishra, P. Affective feedback and its effect on perceived ability and affect: A test of the Computers as Social Actors Hypothesis. Submitted to the 2004 Annual Conference of the American Educational Research Assoc.

7. Mishra, P. and Hershey, K. A framework for designing etiquette for educational technology. In Proceedings of the AAAI Fall 2002 Symposium on Etiquette in Human-Computer Work. AAAI Press, Washington, DC.

8. Mishra, P., Nicholson, M., and Wojcikiewicz, S. Does my word processor have a personality? Topffer's law and educational technology. J. Adolescent and Adult Literacy 44, 7 (2001), 634–641.

9. Moreno, R., and Mayer, R.E. Engaging students in active learning: The case for personalized multimedia messages. J. Educational Psychology 92, 4 (2000), 724–733.

10. Morkes, J., Kernal, H., and Nass, C. Effects of humor in task-oriented human-computer interaction and computer-mediated communication: A direct test of SRCT theory. Human-Computer Interaction 14, 4 (1999), 395–435.

11. Reeves, B., and Nass, C. The Media Equation: How People Treat Computers, Television, and New Media Like Real People and Places. Cambridge University Press/CSLI, NY, 1996.

12. Woolf, B.P., Beck, J., Eliot, C., and Stern, M. Growth and maturity of intelligent tutoring systems: A status report. Smart Machines in Education: The Coming Revolution in Educational Technology. K.D. Forbus and P.J. Feltovich, Eds. AAAI/MIT Press, Menlo Park, CA, 2001.


Authors

Punya Mishra (punya@msu.edu) is an assistant professor in the Learning, Technology, and Culture Program, College of Education, Michigan State University, East Lansing, MI.

Kathryn A. Hershey (hersheyk@msu.edu) is a doctoral student in the Learning, Technology, and Culture Program, College of Education, and a research associate of MIND Lab, at Michigan State University, East Lansing, MI.


Figure 1. STEVE (Soar Training Expert for Virtual Environments), developed by the Center for Advanced Research in Technology for Education (CARTE). (Image © University of Southern California)

Figure 2. AutoTutor, developed by the Tutoring Research Group, University of Memphis. (Image © University of Memphis)

Figure 3. Betty's Brain, developed by the Teachable Agents Group at Vanderbilt University. (Image © Vanderbilt University)


©2004 ACM  0002-0782/04/0400  $5.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2004 ACM, Inc.


 
