A game experience in every application

Making a Game of System Design

The pleasures and wonders of gameplay, as well as their deeper lessons, are being applied in fields beyond entertainment as diverse as psychological therapy, experience-based education, and design prototyping.

Could the ideas and development methods that make computer and video games so successful as a compelling user experience and commercial market also be applied to developing relatively serious-minded applications? There is no denying the appeal of computer and video games. In 2002, they accounted for $6.9 billion in sales in the U.S. alone, according to the Interactive Digital Software Association [3]. Their magnetic effect on children’s attention is all too familiar to parents, particularly when the alternatives are homework and household chores. But what is it about games that makes them so appealing? And what lessons might be learned from their construction that could be applied to other applications? Here, we explore these questions, offering several examples of how computer game ideas influence the architecture of systems not developed directly for entertainment.

For conventional software, design is usually driven by a specification or set of requirements. In game design, the driving force is the user’s experience. Game designers imagine what players will experience as they work their way through the game, aiming to deliver the most exciting and compelling experience possible; for example, in the recently released computer game Age of Mythology, Ensemble Studios’ designers began by calculating how long each game session should take. When the game clock exceeds this “session time limit,” the quality of the game’s built-in AI is dynamically scaled down, making the game easier to win, so the experience “doesn’t drag out” [5].
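
The session-time mechanism can be sketched as a small function. This is a hedged illustration of the idea, not Ensemble Studios' actual implementation; the name `ai_quality`, the linear ramp, and the quality floor are all our assumptions.

```python
def ai_quality(elapsed_s, session_limit_s, floor=0.3):
    """Return a 0..1 multiplier for AI decision quality.

    Hypothetical sketch of session-time-based difficulty scaling:
    full-strength AI until the session limit is reached, then a
    linear ramp down toward a floor so the game gets easier to win.
    """
    if elapsed_s <= session_limit_s:
        return 1.0
    # Fraction of the limit we have overshot; 100% overshoot would
    # (before the floor kicks in) drive quality to zero.
    overshoot = (elapsed_s - session_limit_s) / session_limit_s
    return max(floor, 1.0 - overshoot)
```

A game loop would multiply this factor into whatever knobs its AI exposes, such as search depth, reaction time, or resource bonuses.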

Two key aspects of the players’ experience are the goals they pursue and the environment in which they pursue them. Game designers often seek to keep players engaged by creating three levels of goals: short-term (collect the magic keys), lasting, perhaps, seconds; medium-term (open the enchanted safe), lasting minutes; and finally, long-term (save the world), lasting the length of the game. The interplay of these levels, with the support of the environment, is crafted to draw players into the storyline of the game. A good story is not simply a sequence of things that happen but a carefully constructed tapestry in which events are juxtaposed and emotions peak and ebb. Designers purposely engage players’ emotions as a way to immerse them in their games (see Whitton’s article in this section).

A good game is also highly interactive, deliberately generating tension between the degree of control the story imposes and the player’s freedom of interaction. With no story and complete freedom of interaction, players do whatever they want, but their experience can be boring. On the other hand, if the story provides too much control, the experience becomes more like watching a movie than playing a game. Balancing these two extremes is helped by the fact that a good storyline provides a strong context that actually limits the options a player might consider; they are the only ones the game needs to allow. Cleverly exploiting narrative to shape the players’ experience, game designers give players the perception they have free will, even though at any time their options are actually quite limited.

Another (big) difference in game design is that, whereas most conventional software is designed to function in the real world, game software operates in an artificial game world designed entirely by its developers. Software that operates in the real world must handle all possible contingencies. Designers of such software have no control over the real world, so their only option is to develop a complete solution. Game developers, on the other hand, design the world in which their software runs. It can be simplified or made more complex, depending on the desired game-playing experience. Parts of the world may be modeled in great detail for a rich experience, while other parts are only sketched out if they are intended to have little effect on the user’s experience. Such flexibility allows game developers to focus resources where they provide the greatest benefit, yielding faster and less-costly development.

These ideas have helped developers create immersive, compelling games in a cost-effective manner. But the goal of a game is entertainment. Could the same ideas be applied to applications with more serious-minded goals? We have identified two general classes of application that could benefit from a game-based design approach. The first we call experience-based systems. Since the main focus of game design is the user’s experience, natural candidates are applications that seek to influence users by putting them through some sort of experience, including education and training, communication and persuasion, and experience-based therapy. The second class involves using a game-based approach to construct a testbed for emerging technologies. Games employ a developer-designed world that may reflect the real world while also simplifying it in many ways. A game world makes it possible to test emerging technologies in a comparatively rich environment before they are ready for the full-scale complexities of the real world. The game world can also provide an environment in which a number of technologies are integrated together while revealing interdependencies and emerging research issues.

Experience-based systems. All good computer and video games, from the simplest puzzle to the most complex strategy adventure, share the ability to entice players and immerse them in the game experience. Traditionally, immersion and entertainment go hand in hand. Players are more willing to suspend their disbelief for entertaining games—the deeper the immersion the more entertaining the experience. Game designers thus craft every aspect of the players’ experience to support the desired effect and avoid breaking their sense of immersion; for example, simple scripted virtual characters that always behave believably are more desirable than complex autonomous characters that occasionally make stupid mistakes, thus breaking the sense of immersion.

Immersion is a powerful shortcut into users’ minds with potential non-game uses. In educational applications, studies have shown that an immersive learning experience “creates a profound sense of motivation and concentration conducive to mastering complex, abstract material” [1]. Interactive, immersive experience is also a powerful tool for communication and persuasion. Moreover, mental health professionals have begun immersing patients in virtual experiences to treat phobias and other mental disorders.

Experience-based education. One example of a computer “game” built from scratch as an educational tool is Full Spectrum Command (FSC), a training system developed by the University of Southern California’s Institute for Creative Technologies (ICT) and Quicksilver Software for the U.S. Army. It draws on the real-time strategy game genre to teach cognitive skills, including decision making, synchronization, and leadership, to light infantry company commanders. Visually, with its real-time 3D graphics and first-person-perspective play style, it seems to users more like a commercial computer game than a training simulation. However, the development team’s own gaming background had a much greater influence on the system than mere visual appearance.




FSC’s training objectives emphasize combat in urban terrain. As a consequence, the developers modeled the urban sections in the map in much more detail than the rural areas (compare the area around the buildings with the open areas in Figure 1). This is in contrast to many conventional training simulators that use a uniform terrain representation. As a result, the time spent commanding troops in urban combat is a far more complex, realistic, and challenging experience than the time spent approaching the village through the woods. Similarly, the AI and combat physics simulations are carefully tailored to be complex only where necessary to support the training experience; for example, the AI behaves realistically in the game’s urban environment but is not general enough to handle all possible environments.
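
FSC's tailoring of fidelity by region can be pictured as a dispatch on how relevant each part of the map is to the training experience. The following is a minimal sketch under our own assumptions; the `Region` type and the update strategies are illustrative names, not FSC's code.

```python
from dataclasses import dataclass


@dataclass
class Region:
    """A patch of the game map (hypothetical representation)."""
    name: str
    training_relevant: bool  # does this area carry the training experience?


def simulation_step(region, unit_count):
    """Pick a per-region update strategy based on training relevance."""
    if region.training_relevant:
        # Urban areas: full pathfinding, cover analysis, line-of-sight
        # checks, and detailed combat physics.
        return f"detailed update for {unit_count} units in {region.name}"
    # Rural approach areas: just advance units along a precomputed path.
    return f"coarse update for {unit_count} units in {region.name}"
```

The design choice mirrors the article's point: effort is spent only where it shapes the user's experience, rather than simulating everything uniformly.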

As with most real-time strategy games, FSC consists of a sequence of missions, each designed to support a specific training objective. Traditional military simulations are also structured around instructor-designed missions; unlike them, however, FSC’s missions include extensive background stories with details on the history of the situation and enemy personalities, illustrated with fictional images and profiles. Not only does this context help immerse student officers in the training environment, the twists and turns of each mission’s narrative are designed to support the mission’s training objective. In early missions, when students learn the fundamentals, stories are fairly straightforward. Later missions, seeking to challenge more advanced students, include many unexpected twists and third-act surprises that test their ability to react quickly and keep a cool head under escalating pressure.

Experience-based communication. While educational systems seek to impart knowledge or skills to users, experience-based communication systems present them with a specific viewpoint in hopes of influencing their beliefs and attitudes. Like Upton Sinclair’s 1906 novel The Jungle, which detailed slaughterhouse conditions and prompted extensive workplace safety and food-hygiene reforms, experience-based systems present their users with the developer’s viewpoint under the guise of entertainment. However, unlike a novel, experience-based systems communicate that viewpoint interactively.

One of the earliest examples of this fairly new application area is America’s Army, a “strategic communication” game developed by the U.S. Military Academy’s Office of Economic and Manpower Analysis [9]. America’s Army is a “first-person shooter” game, based on the popular Unreal Tournament game engine from Epic Games, that seeks to inform players about the Army’s core values and support recruitment. Like the training systems described earlier, America’s Army is designed to communicate the desired message, including the consequences of good and bad behavior. The aspects of the game that appeal to the target audience (such as firing weapons and anti-terrorism missions) are central to the experience and modeled in detail. It also includes the most realistic weapon models of any game we’ve seen, including the need to reload, weapons that jam, and snipers timing shots to their own (simulated) breathing. Aspects of Army life that may not appeal to the audience (such as a strict command hierarchy) are not emphasized.

Experience-based therapy. In addition to education and communication, experience-based systems are also starting to be used in psychological therapy for such disorders as phobias and post-traumatic stress [2, 8]. In exposure therapy, patients are immersed in virtual environments and exposed to anxiety-producing situations. The fidelity of the patients’ experience is carefully controlled to increase realism as they become less sensitive. So, in early sessions a patient who fears flying might watch a cartoonish representation of a plane flight on a computer monitor. In later sessions the same patient might wear a head-mounted virtual reality display and earphones to be immersed in a photorealistic environment. It is easy to imagine employing a storyline along with these environments to help immerse patients and make the experience seem increasingly real.

An example of a system that uses narrative extensively is Carmen’s Bright IDEAS [4], which uses interactive pedagogical drama to help mothers of children with cancer cope with the stress and turmoil such a disease can introduce into family life. In this system, a user first learns the backstory of Carmen, the mother of a pediatric cancer patient. Next, the user observes as Carmen discusses her concerns and problems with a simulated clinical counselor. The user can influence Carmen’s thinking by clicking on thought bubbles, like those in Figure 2. The drama unfolds based on that interaction. The counselor discusses coping strategies with Carmen, thereby showing them to users so they can apply them in their own situations.

Like all powerful tools, the experience-based design approach must be applied carefully. Without a carefully designed experience and extensive testing, these systems could easily produce unwanted outcomes (such as negative training or increased phobia anxiety). Despite the promise of these early efforts, the best approaches to designing such experiences are still a topic of research and debate.


Future Vision

A game’s synthetic world and compelling scenario can provide an insulating cocoon for testing new and emerging technologies before they are ready for real-world applications. The key idea is that instead of introducing new technologies into the complexity of the real world, the designers instead create an artificial “game” world and introduce people and technologies into that world. The fact that the artificial world is under the control of developers changes everything, because its content, as well as its scenario, can be manipulated to create a strong context limiting the range of possibilities the technology must support.

Researchers in natural language processing have recognized for some time that systems operating in the context of a particular task are much more feasible, as the task itself provides constraints on the kinds of interactions that might occur. Adding the elements that make games compelling—storyline, engaging setting, and constrained environment—carries this notion a step further.

The Mission Rehearsal Exercise system. A project that illustrates these ideas well is the Mission Rehearsal Exercise system, also developed at ICT [6, 7]. Its goal is to train U.S. Army lieutenants in crisis decision making in a variety of situations that occur in real-world peace-keeping and disaster-relief missions. Presented on a 30-foot-by-8-foot curved screen, the system places trainees in a simulated crisis (see Figure 3). They interact with life-size virtual humans playing the roles of local civilians, friendly forces, and hostile forces. These virtual humans use AI to understand what the lieutenant is saying, reason about the situation, respond with appropriate speech and gesture, and model and exhibit emotions. Although the characters follow an overall scenario, their reactions are not scripted; instead, they respond dynamically to the actions of trainees and events as they unfold in the environment.




The scenario currently in use is situated in a small town in Bosnia. It opens with a lieutenant (the trainee) in his Humvee. He receives orders via radio to proceed to a rendezvous point to meet up with his soldiers and plan a mission to assist in quelling a civil disturbance. When he arrives at the rendezvous point, he is surprised to find one of his platoon’s Humvees has been involved in an accident with a civilian car. A small boy is on the ground with serious injuries; his frantic mother is nearby, and a crowd is starting to gather. What should the lieutenant do? Stop and render aid? Proceed with the mission? Different decisions produce different outcomes.

To support such a scenario, a range of technologies must be integrated into the virtual humans, including speech recognition, natural language understanding, dialogue management, natural language generation, speech synthesis, gesture generation, emotion modeling, and task reasoning. Supporting them all is a daunting task made easier by the strong context provided by the scenario, which limits the range of responses trainees are likely to make and in turn limits the size of the knowledge base needed to support the virtual humans’ behaviors. For example, when confronted with the accident scene, trainees are likely to ask what happened or about the health of the injured child; they are unlikely to engage in casual conversation about, say, recent soccer scores.

Using the same scenario, with a commercial speech recognizer and a simpler natural language understanding system, Kevin Knight, a research faculty member at USC, and his graduate students Michael Laszlo and Rebecca Rees have taken this idea a step further. They use the structure of the storyline to help the system recover when natural language understanding fails. For example, at one point in the scenario, the appropriate next step for the lieutenant is to secure the accident site. If the natural language understanding system doesn’t understand what the lieutenant says, the platoon sergeant (one of the virtual humans) might suggest that they secure the site. The natural language problem is now easier, because it is focused on recognizing a confirmation or rejection of this suggestion rather than distinguishing among a larger set of possible next moves.

If the system still fails to understand, it relaxes the criterion for recognition. In a test with 12 subjects, each was able to complete the scenario, even though speech recognition accuracy was only about 65% and utterance classification accuracy only about 70%. Thus, this approach lets the system’s designers integrate technologies and get a feel for how a mature version of the system will operate before all the research is complete or the individual components are fully robust.
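
The recovery strategy can be sketched in a few lines. Everything here is our illustration: the keyword-overlap scorer stands in for the real speech and language components, and the intent names and thresholds are invented for the example.

```python
def understand(utterance, candidates, threshold):
    """Crude keyword-overlap scorer standing in for a real NLU component.

    candidates maps an intent name to a list of keywords; returns the
    best-matching intent, or None if no score reaches the threshold.
    """
    words = set(utterance.lower().split())
    best, best_score = None, 0.0
    for intent, keywords in candidates.items():
        score = len(words & set(keywords)) / len(keywords)
        if score > best_score:
            best, best_score = intent, score
    return best if best_score >= threshold else None


def interpret_with_recovery(utterance):
    """Storyline-driven recovery, as described in the scenario above."""
    intents = {
        "secure_site": ["secure", "perimeter", "site", "area"],
        "render_aid": ["medic", "aid", "help", "boy"],
    }
    # 1) Try full understanding at a strict threshold.
    intent = understand(utterance, intents, threshold=0.5)
    if intent:
        return intent
    # 2) The sergeant suggests the storyline's next step ("secure the
    #    site"); now we only need to recognize a confirmation or rejection.
    confirm = {
        "confirm": ["yes", "affirmative", "do", "it"],
        "reject": ["no", "negative", "wait"],
    }
    decision = understand(utterance, confirm, threshold=0.25)
    if decision == "confirm":
        return "secure_site"
    if decision == "reject":
        return "ask_for_orders"
    # 3) Still nothing: relax the recognition criterion and retry.
    return understand(utterance, intents, threshold=0.25) or "clarify"
```

Each fallback narrows the recognition problem, which is why even components with modest accuracy let every subject complete the scenario.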


Conclusion

To highlight the differences between conventional and game systems, it is useful to consider simulations, the class of system most closely related to games. In such a system, the ultimate goal is to create a virtual duplicate of reality for analysis, training, experimentation, or other purposes. In a game, the goal is to create a compelling experience for the player. Simulating reality is an approach that may or may not be useful in creating that experience. This distinction yields several consequences. In simulations, behavior (of, say, objects, vehicles, and people) should be as realistic as possible. In games, behavior needs to be believable and designed to support the desired experience.

In simulations, the structure of the user’s goals (such as “attack the bridge”) mimics real-life goals. In games, goals (such as “slay the dragon”) are selected and designed to increase and maintain involvement. In simulations, the representation of the terrain and environment tends to be uniform and consistent, allowing the user to act freely within that environment. In games, players have the illusion of freedom while following a designed experience; the designers vary the fidelity of the representation, devoting their greatest effort to the parts most coupled to the users’ experience.

Exploiting these design tricks, computer games create a compelling experience for practically any user willing to play. Many of the resulting features can be applied to more serious-minded applications. For applications seeking to teach users through realistic experience, game design techniques can make the experience much more memorable. In a testbed environment, the context and control afforded by game design techniques allow integration of technologies and evaluation of the overall experience, even with partial implementations. Perhaps it’s time to take the lessons of game design seriously.


Figures

F1 Figure 1. Screenshot from Full Spectrum Command highlights embedded paths.

F2 Figure 2. Thought bubbles in Carmen’s Bright IDEAS (USC/ISI).

F3 Figure 3. The Mission Rehearsal Exercise system.


    1. Dede, C., Salzman, M., and Loftin, B. The development of a virtual world for learning Newtonian mechanics. In Multimedia, Hypermedia, and Virtual Reality, P. Brusilovsky, P. Kommers, and N. Streitz, Eds. Springer Verlag, Berlin, 1996.

    2. Hodges, L., Rothbaum, B., Kooper, R., Opdyke, D., Meyer, T., North, M., de Graff, J., and Williford, J. Virtual environments for treating the fear of heights. IEEE Comput. 28, 7 (July 1995), 27–34.

    3. Interactive Digital Software Association; see www.idsa.com/1_27_2003.html.

    4. Marsella, S., Johnson, W., and LaBore, C. Interactive pedagogical drama for health interventions. In Proceedings of the 11th International Conference on Artificial Intelligence in Education (Sydney, Australia, July 20–24, 2003).

    5. Pottinger, D. The future of game AI. Game Develop. Mag. 7, 8 (Aug. 2000), 36–39.

    6. Rickel, J., Marsella, S., Gratch, J., Hill, R., Traum, D., and Swartout, W. Toward a new generation of virtual humans for interactive experiences. IEEE Intell. Syst. 17, 4 (July/Aug. 2002), 32–38.

    7. Swartout, W., Hill, R., Gratch, J., Johnson, W., Kyriakakis, C., Labore, K., Lindheim, R., Marsella, S., Miraglia, D., Moore, B., Morie, J., Rickel, J., Thiebaux, M., Tuch, L., Whitney, R., and Douglas, J. Toward the Holodeck: Integrating graphics, sound, character, and story. In Proceedings of the 5th International Conference on Autonomous Agents (Montreal, Canada, May 28–June 1). ACM Press, New York, 2001, 409–416.

    8. Zimand, E., Anderson, P., Gershon, G., Graap, K., Hodges, L., and Rothbaum, B. Virtual reality therapy: Innovative treatment for anxiety disorders. Primary Psychiatry 9, 7 (2002), 51–54.

    9. Zyda, M., Hiles, J., Mayberry, A., Wardynski, C., Capps, M., Osborn, B., Shilling, R., Robaszewski, M., and Davis, M. The MOVES Institute's Army Game Project: Entertainment R&D for defense. IEEE Comput. Graph. Appl. 23, 1 (Jan./Feb. 2003), 28–36.

    The technology described here was developed with funds from the U.S. Department of the Army under contract number DAAD 19-99-D-0046. Any opinions, findings, and conclusions or recommendations expressed here are those of the authors and do not necessarily reflect the views of the U.S. Department of the Army.
