Human perception is a subtle and complex process, and it is increasingly viewed as a major component of expert performance. In sharp contrast, in the representation of human behavior within a computer-generated force, perception is often reduced to little more than an assignment statement. An enemy vehicle might be "perceived" in a constructive simulation by passing variables representing an entity id, a vehicle-type enumeration, and an x-y-z location, given to several significant figures, to a finite state machine, which can then "cogitate," using those values to determine which of a known set of alternative actions should be taken (e.g., assigning a threat level to the enemy vehicle, sorting it into a target list, choosing an action on contact, etc.).

In this proposal we describe research that will allow us to imbue synthetic entities with a richer sense of perception. The proposed research revolves around two technical objectives. The first is to understand human perception in a given task domain and to survey candidate computational mechanisms for performing analogous transformations of ground truth into "perception." The second is to explore how perception influences infantry decision making and to determine a corresponding division of labor on the computational side between the perceptual and inferential components.
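
As a point of reference, the conventional treatment described above can be sketched in a few lines of code. The sketch below is purely illustrative and assumes hypothetical names (GroundTruthContact, ThreatStateMachine, VehicleType); it is not drawn from any particular CGF implementation. Its point is that the entity's "perception" is simply an assignment of exact ground-truth values, which a finite state machine then maps onto a fixed set of actions.

```python
# Illustrative sketch only: names and logic are hypothetical, not from any real CGF.
from dataclasses import dataclass
from enum import Enum


class VehicleType(Enum):
    TANK = 1
    APC = 2
    TRUCK = 3


@dataclass
class GroundTruthContact:
    entity_id: int
    vehicle_type: VehicleType
    x: float  # exact ground-truth position (no sensing error or uncertainty)
    y: float
    z: float


class ThreatStateMachine:
    """Finite state machine that 'cogitates' directly on ground truth."""

    def __init__(self) -> None:
        self.target_list: list[GroundTruthContact] = []

    def on_contact(self, contact: GroundTruthContact) -> str:
        # "Perception" is just an assignment: the FSM receives perfect
        # ground-truth values and selects one of a known set of actions.
        if contact.vehicle_type is VehicleType.TANK:
            self.target_list.append(contact)
            return "ENGAGE"
        if contact.vehicle_type is VehicleType.APC:
            self.target_list.append(contact)
            return "REPORT_AND_OBSERVE"
        return "IGNORE"


fsm = ThreatStateMachine()
action = fsm.on_contact(GroundTruthContact(42, VehicleType.TANK, 1250.0, 873.5, 12.0))
print(action)  # ENGAGE
```

Everything that a human observer would have to extract perceptually (detection, recognition, estimation of range and posture, all under uncertainty) is bypassed; the proposed research targets precisely this gap between ground truth and what a synthetic entity should plausibly "see."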