We propose a novel representation of motion data and control of virtual characters that allows for highly agile responses to user input and natural handling of arbitrary external disturbances. In contrast to traditional approaches based on replaying segments of motion data directly, our representation organizes samples of motion data into a high-dimensional generalization of a vector field that we call a motion field. Our runtime motion synthesis mechanism freely flows through the motion field in response to user commands. The motions we create appear natural, are highly responsive to real-time user input, and are not explicitly specified in the data.
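The core idea above can be illustrated with a minimal sketch, not the paper's actual algorithm: each motion sample pairs a character state with a velocity (the displacement toward the next captured frame), and synthesis integrates a similarity-weighted blend of the velocities of nearby samples, so the character flows through the field rather than replaying a clip. The state representation, weighting scheme, and toy data below are all illustrative assumptions.

```python
import numpy as np

def motion_field_step(x, states, velocities, k=3, eps=1e-6):
    """Advance state x one frame by blending the k nearest motion samples.

    Weights are inverse-distance similarities; this is a stand-in for the
    paper's actual interpolation scheme.
    """
    d = np.linalg.norm(states - x, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] + eps)                  # similarity weights
    w /= w.sum()
    v = (w[:, None] * velocities[idx]).sum(axis=0)
    return x + v                              # integrate blended velocity

# Toy data: samples along a circular "walk cycle" in a 2-D state space.
t = np.linspace(0, 2 * np.pi, 40, endpoint=False)
states = np.stack([np.cos(t), np.sin(t)], axis=1)
velocities = np.roll(states, -1, axis=0) - states  # frame-to-frame deltas

x = np.array([1.05, 0.0])                     # start slightly off the data
for _ in range(20):
    x = motion_field_step(x, states, velocities)
```

Note that the query state never has to coincide with a recorded sample: starting slightly off the data, the blended velocities still carry the character along the cycle, which is what lets the representation absorb disturbances.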
Whenever a video game contains a character that walks or runs, it requires some method for interactively synthesizing this locomotion. This synthesis is more involved than it might at first appear, since it requires both the creation of visually accurate results and the ability to interactively control which motions are generated. The standard techniques for achieving this create realistic animation by directly (or nearly directly) replaying prerecorded clips of animation. They provide control by carefully specifying when it is possible to transition from playing one clip of animation to another. The synthesized motions are thus restricted to closely match the prerecorded animations.
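The clip-replay approach described above can be sketched as follows. This is a simplified illustration, not any specific engine's implementation: clips are replayed whole, and control is limited to a hand-authored table of allowed transitions, so a user command takes effect only if the current clip has an authored transition for it. All clip and transition names are hypothetical.

```python
# Hand-authored clips (lists of frame identifiers) and the transitions
# permitted between them. Both tables are illustrative assumptions.
clips = {
    "idle": ["idle_0", "idle_1"],
    "walk": ["walk_0", "walk_1", "walk_2"],
    "run":  ["run_0", "run_1"],
}
transitions = {"idle": {"walk"}, "walk": {"idle", "run"}, "run": {"walk"}}

def play(start, commands):
    """Replay clips, honoring each command only if a transition exists."""
    frames, clip = [], start
    for cmd in commands:
        frames.extend(clips[clip])        # replay the current clip in full
        if cmd in transitions[clip]:
            clip = cmd                    # follow the authored transition
        # otherwise the command is ignored until a valid transition exists
    return frames, clip

frames, final = play("idle", ["walk", "run", "idle"])
```

The limitation the paragraph points to shows up directly here: responsiveness is bounded by clip length and by which transitions were authored (the final "idle" command is dropped because no run-to-idle transition exists), and every synthesized frame is one of the prerecorded frames.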