I recently found myself in a military camp, as part of a forced "sabbatical" I took to complete my military service and fulfill my obligations toward the Hellenic Armed Forces. Unlike many colleagues who typically complete their military training before undertaking substantial amounts of research, I had the opportunity to complete 10 years of human-computer interaction research before being recruited for military service. In this column, I describe how this experience has made me reconsider the concept of training users on computer systems.
There are many details I found fascinating about the military's approach to training that prompted me to write this column. Throughout my training I could not help but draw parallels between how we were instructed and actually acquired new knowledge and how, in computer science and particularly in human-computer interaction, we advocate training end users, whether as part of an experiment or through "newbie" training materials. The comparison is especially apt because the results of the military training were quite impressive, both in the effect it had on trainees and in the progress trainees made in a relatively short period of time.
The three-week basic training was a full-time crash course. We received training on how to use weapons (shooting, maintenance, assembly, and disassembly), military behavior and code of conduct (salutation, insignia, and ranks), platoon activities (such as marching and arms drills), and guarding responsibilities (for example, learning the rules of engagement). Approximately 1,000 trainees lived in 70-bed platoon barracks, and effectively most of our time was spent in the presence of others, with very little privacy.
One of my first notable experiences was feeling astonished at the simplicity of some of the activities we had to repeat over and over again. One such activity was learning how to salute, a one-second activity on which I estimate we each spent approximately two hours, or about three person-months in total. However, I soon realized why we had to repeat so many apparently simplistic activities: because of the great diversity of backgrounds among the group of trainees.
While I have conducted and taken part in many studies and experiments with "random" samples of participants, the group of trainees at the camp was by far the most diverse I have encountered. Interestingly, in terms of basic demographics the group was rather homogeneous: all males, mostly between 18 and 25 years of age, along with some up to their mid-30s. The true diversity, however, was in terms of socioeconomic background. Some were researchers, postdocs, Ph.D. candidates, and holders of master's degrees; others had their own business, were shop managers, or were employed in a variety of blue-collar jobs; some were farmers and some were priests; some were Facebook addicts while others had never owned a computer or mobile phone; some had yachts and some were unemployed. Finally, I met people who were completely illiterate and could not even write their own name.
It is through the prism of such diversity that I judged the results and achievements of training to be impressive. In a nutshell, the military training managed to establish and guarantee a bare minimum level of capability across every recruit. Each year, for thousands of extremely diverse young males, the military camp manages to produce soldiers who all meet certain criteria, who can handle weapons, and who can conduct themselves in a military environment. This was achieved in the confines of a three-week course. Anecdotally, some fellow computer scientists and I, who got together during lunch, used to joke that given the diversity of the people in our group, it would be unlikely that we could effectively train them all, in such a short time, to use instant messaging, Facebook, or email with all their intricacies and idiosyncrasies. Yet these same people were able to complete complex activities, both individually and in tightly timed groups, sometimes involving live ammunition and at the risk of injury or death.
A handful of teaching techniques were used during our military training. The vast majority of teaching and training took place in groups of varying sizes, both indoors and outdoors. A further mechanism that was routinely used was to punish mistakes and only rarely reward good effort. The effect of combining group learning with the punishment of mistakes was that most people, in fact, learned from others' mistakes. A typical example would be someone "getting in trouble" in front of everyone else because of inappropriate behavior, a wrong answer, or forgetfulness. Everyone would bear witness, and everyone would learn from that person's mistake. An interesting consequence of this dynamic was that fairly soon all trainees started to check with each other, ask each other questions, and ask for help in order to avoid getting themselves in trouble. This reinforced learning and teaching between trainees, and strengthened the appropriation of "best" or "safe" practices without the direct intervention of instructors. Finally, a small portion of learning took place individually, but this was limited to reading handbooks and manuals that contained mostly theoretical information.
Having experienced the intensity of the basic training course, and currently serving in the Hellenic Air Force, I have learned some lessons that could possibly apply in the context of training users. While I hope these lessons can lead to the generation of hypotheses regarding the effectiveness and efficiency of user training, at the very least I expect these experiences can offer constructive input in forming and shaping user training strategies, courses, and material.
Make no assumptions about the users out there. One important detail to always keep in mind is that the people "out there" are extremely diverse and come from a variety of backgrounds. Personally, this insight has challenged my assumptions about user samples and the extent to which "common sense" exists. In trying to extrapolate the effectiveness of training material from a short pilot study, it helps to keep in mind that there may be users out there who are nothing like the sample you have collected. It is then helpful to reflect on whether those potential users are still of interest to you, or whether they are beyond your target group. For instance, illiterate users are most likely outside the target demographics of Facebook, but how about unemployed or homeless users? It is worth noting that making no assumptions about trainees was something our trainers did routinely at military camp, usually to the great frustration of most of the group. However, this practice helped ensure a minimum common denominator of knowledge and skills across all trainees.
Learn from others' mistakes. In my experience, this was one of the most powerful mechanisms by which trainees at the military camp learned. At the same time, this is a mechanism that is rather straightforward to apply to user training: let users learn from other people's mistakes. For instance, to teach a new user the concept of "saving" a working document, one approach may be to explain the mechanism and how it works. Another approach, however, would be to show them how other users have made mistakes, and what the consequences of those mistakes were. Similarly, teaching someone the difference between "Carbon Copy" and "Blind Carbon Copy" in email clients may be done effectively by explaining how not to do it, what kinds of mistakes people have made, and what the consequences were. I would even go as far as to suggest that in addition to the crash reports many applications submit on behalf of users, applications could also generate and submit "failure" reports, whereby the sequence of steps the user took (whether intentionally or not) and the unwanted outcomes are recorded, transmitted, and reproduced for the perusal of other users and the community at large.
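To make the failure-report suggestion a little more concrete, here is a minimal sketch of what such a record might look like. The `FailureReport` class, its fields, and the email example are all invented for illustration; no existing application exposes this interface.

```python
from dataclasses import dataclass, field, asdict
import json


@dataclass
class FailureReport:
    """A hypothetical record of a mistaken action sequence and its outcome."""
    application: str
    steps: list = field(default_factory=list)  # user actions, in order
    outcome: str = ""                          # the unwanted result

    def record_step(self, action: str) -> None:
        """Append one user action to the recorded sequence."""
        self.steps.append(action)

    def to_json(self) -> str:
        """Serialize the report so it can be shared with other users."""
        return json.dumps(asdict(self))


# Example: the classic Cc/Bcc mistake described above.
report = FailureReport(application="email-client")
report.record_step("compose new message")
report.record_step("paste 200 addresses into the To field")
report.record_step("send")
report.outcome = "every recipient can see every other address"
print(report.to_json())
```

Collected reports like this could then be replayed or summarized for new users, so that one person's mistake becomes everyone's lesson.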
Help users to help each other. Providing an environment, such as a forum or community newsletter, where users can get together to exchange ideas and solve problems is not a novel idea, but my experience at training camp was subtly distinct from such mechanisms. What set it apart was the fact that people with the same set of problems got together to help each other. Typically in online forums as well as face-to-face situations, a novice user expects to receive help and guidance from an experienced user. My argument is that it may also be helpful to motivate newbies to learn from each other while they are getting acquainted with the system or application they are using. Since an online forum for newbies would not be interactive enough, one suggestion is to attempt to develop the equivalent of remote assistance, whereby the people offering assistance are also newbies. For instance, when a new person signs up to Facebook, provide the mechanisms so the new person can be teamed up with other novice users, so that together they can interact and learn from each other in exploring the system and its capabilities.
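As a rough sketch of how a system might team up novices, consider a simple first-come, first-served matcher: each new signup either joins a waiting novice or waits for the next one. The function name and queue-based design are assumptions made for this illustration, not a description of any existing Facebook feature.

```python
from collections import deque


def pair_novices(waiting: deque, new_user: str):
    """Team a newly signed-up user with another waiting novice, if any.

    Returns a (partner, new_user) pair when a match is made, or None
    while the new user waits for a partner to arrive.
    """
    if waiting:
        partner = waiting.popleft()  # match with the earliest waiter
        return (partner, new_user)
    waiting.append(new_user)         # no one waiting; queue this user
    return None


# Example: the first novice waits, the second completes the pair.
queue = deque()
assert pair_novices(queue, "alice") is None
assert pair_novices(queue, "bob") == ("alice", "bob")
```

A real system would obviously need to account for time zones, languages, and dropouts, but even this trivial scheme captures the core idea: novices are matched with novices, not with experts.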
It can be argued that military training and training users of computer systems represent two completely different contexts, with different requirements and parameters. While that may be the case, I still feel that, given the diversity of the group of trainees, military training techniques were quite effective in a very short period of time. Virtually all trainees made incredible progress and managed to acquire a variety of skills, both individually and in groups. This unique experience has made me rethink the effectiveness of training computer users, and hypothesize that some teaching techniques from the battlefield may be useful on the desktop.
The Digital Library is published by the Association for Computing Machinery. Copyright © 2012 ACM, Inc.