Pointman

Pointman™ is a seated user interface for controlling one's avatar in a 3D virtual environment. It combines head tracking, a gamepad, and sliding foot pedals to provide positional control over many aspects of the avatar's posture. Pointman was developed by the US Naval Research Laboratory (NRL) to support the use of dismounted infantry simulation for USMC training and mission rehearsal. NRL's goal in developing Pointman was to extend the range and precision of actions supported by virtual simulators, to better represent what infantrymen can do.

The Challenge of Dismounted Infantry Simulation
A virtual simulator is a system that uses input devices, computer simulation, and sensory feedback to model a user's interaction with the real world. In the military, virtual simulators have been effectively used for training the pilots and crews of combat vehicles (aircraft, ships and ground vehicles), by exposing trainees to a wider variety of situations than they would typically encounter in real world training, and in safer and more controlled environments. The user interface for a vehicle-based simulator (such as a flight simulator) is typically straightforward since a similar set of control devices can be used in simulation as are used in the actual vehicle.

Dismounted infantry simulation poses a much greater challenge. An infantryman is directly exposed to his environment, rather than sequestered inside a vehicle. He moves through the environment propelled by his own limbs and must make the most of his senses, directing them to spot the enemy before he is detected. He is not heavily armored and must move tactically with respect to cover and concealment to minimize his exposure and maximize his situational awareness. People move differently than vehicles and change their posture in complex ways.

Varieties of Realism in Virtual Simulation Interfaces
The user of a dismounted simulation controls an avatar that represents his body in the virtual world. The user interface allows the user to control his avatar and sense the virtual world from the avatar's perspective. As an avatar controller, Pointman strives for behavioral realism, in which the user controls the avatar to act in simulation as he would in real life. This is distinct from physical realism, in which the avatar faithfully reflects the user's physical actions. Behavioral realism allows the user interface to substitute control actions for natural actions, while still providing a high level of facility in controlling the avatar to act as the user would in the real world. While physical realism is required for training sensory-motor skills, such as marksmanship training, behavioral realism is sufficient for training cognitive skills, such as tactical decision-making and team coordination.

Importance of Behavioral Realism for Training Tactical Decision-Making
The effectiveness of a dismounted infantry simulator for training tactical decision-making hinges on how well the actions carried out by infantrymen in virtual simulation match the actions they would have performed in the real world, because tactical decisions unfold through action. In training exercises as in real-world missions, decision-making occurs within an OODA loop, an ongoing process of observe, orient, decide, and act. A decision's soundness is judged by the outcome of the actions it gives rise to, so the realism with which actions are simulated determines the accuracy of the simulated outcomes of tactical decisions.

In a virtual simulator used for training tactical decision-making, behavioral realism is required to distinguish good and bad decisions. If in simulation the trainees are unable to control their avatars to a high degree of fidelity, they will discount the tactical mistakes they make and lose confidence in the results of the simulation. On the other hand, the more relevant behaviors the interface allows a trainee to perform, the more responsible he becomes for carrying out his duties. A more behaviorally realistic interface thus encourages the user to take it more seriously, leading to more realistic performance and more meaningful training exercises.

As an example, consider the task of employing a squad to set up and execute an ambush. In a real-life engagement the team exposes only 15% of their bodies to incoming fire while still returning effective fire, whereas in a crude simulation the team must expose at least 50% of their bodies (due to discrete rather than continuous control over crouching and leaning the torso) or lose the ability to return fire. If in a training exercise the team members are unable to control their avatars to effectively use cover and concealment, the squad may suffer substantially more casualties than in a comparable real-world engagement. The team leader might then conclude that he chose a bad ambush site or deployed his forces in the wrong formation, when in fact the failure was due not to these decisions but to the limitations of the user interface. Moreover, since the OODA loop is cyclical, errors in representing user actions cascade over time, diverging further from reality as the mission plays out.

Actions Involved with Infantry Tactics
The actions involved with infantry tactics are often divided along the lines of Look-Move-Shoot. Looking entails moving the head and eyes to direct the view. An infantryman must freely look around to detect and engage threats in a 360°/180° battlefield. Movement involves turning the body, and stepping or crawling to translate the body in any direction. Tactical movements such as pie-ing a corner require the infantryman to turn his body to face potential threats while simultaneously moving along a path. Shooting encompasses the full range of weapon manipulations, from holding, stowing and reloading the weapon to firing it. The Look-Move-Shoot paradigm also applies to the use of cover and concealment. Looking out from behind cover, moving to quickly take cover, and shooting from behind cover are critical for making effective use of fire while minimizing exposure to enemy fire.

In a virtual simulator used for training tactical decision-making, the ability of the user to control his avatar to look, move, shoot and make use of cover will affect how his tactical decisions play out in simulation. Conventional seated interfaces employing a mouse-keyboard or gamepad provide some level of control over these actions, but are hampered by their limited number and range of input channels due to being operated exclusively by the hands.

How Pointman Controls an Infantry Avatar
Pointman seeks to enhance the level of behavioral control provided by conventional desktop and console game controllers by engaging the user's whole body to control corresponding segments of the avatar's body. The user employs his head and upper body to control looking and aiming, as well as leaning to duck and peek around cover. He uses his hands to operate virtual weapons and direct tactical movement, and he uses his feet for stepping and controlling his avatar's postural height. Pointman uses a set of three consumer-grade input devices: a head tracker, a gamepad, and flight-simulator foot pedals. The additional input from the head and feet offloads the hands from having to control the entire avatar and allows for a more natural assignment of control. Together, the three input devices offer twelve independent channels of control over the avatar's posture.

The Natural Point TrackIR™ 5 head tracker registers the translation and rotation of the user's head. Pointman uses these inputs to map the movements of the user's head and torso one-for-one to those of his avatar. The virtual view changes as the user turns his head to look around or leans his torso in any direction. When the weapon is raised into an aim position its sights remain centered in the field of view, so that turning the head also adjusts the aim. The user can aim as precisely as he can hold his head on target. Hunching the head down by flexing the spine is also registered by the head's translation, and the avatar adopts a matching posture. Leaning forward and hunching are used to duck behind cover. Rising up and leaning to the side are used to look out and shoot from behind cover.
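The one-for-one head mapping described above can be sketched as a small function. The data structures, axis conventions, and field names here are illustrative assumptions, not NRL's actual implementation:

```python
# A minimal sketch of a one-for-one head-to-avatar mapping. The axis
# conventions (yaw/pitch in degrees, translations in meters) and the
# HeadSample/AvatarPose types are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class HeadSample:
    yaw: float    # degrees, turning the head left/right
    pitch: float  # degrees, tilting the head up/down
    x: float      # meters, leaning side to side
    y: float      # meters, rising up / hunching down
    z: float      # meters, leaning forward/back

@dataclass
class AvatarPose:
    view_yaw: float
    view_pitch: float
    torso_lean_x: float
    torso_lean_z: float
    hunch: float  # how far the head is lowered by flexing the spine

def map_head_to_avatar(sample: HeadSample) -> AvatarPose:
    """Map tracked head motion one-for-one onto the avatar.

    When the rifle is raised into an aim, the sights stay centered in
    the view, so the same head rotation that steers the view also
    steers the aim point.
    """
    hunch = max(0.0, -sample.y)          # dropping the head flexes the spine
    return AvatarPose(
        view_yaw=sample.yaw,
        view_pitch=sample.pitch,
        torso_lean_x=sample.x,           # side lean: peek out from cover
        torso_lean_z=sample.z,           # forward lean: duck behind cover
        hunch=hunch,
    )
```

Because the mapping is one-for-one rather than scaled or rate-controlled, aiming precision is limited only by how steadily the user can hold his head on target.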

The Sony DualShock® 3 gamepad includes dual thumb sticks and a pair of tilt sensors. Pointman uses the thumb sticks to turn the avatar's body and set the stepping direction. The tilt of the gamepad controls how the virtual rifle is held: the user tilts the gamepad down to lower the rifle, and tilts it up to raise the rifle continuously through a low ready into an aim and then to a high ready. This allows users to practice muzzle discipline by lowering the rifle to avoid muzzle-sweeping friendlies, to minimize collisions when moving through tight spaces, and to lead with the rifle when moving around cover. Once the rifle is raised into an aim position, the user's head motion aligns the sight picture. The user rolls the gamepad (tilting it side to side) to cant the weapon. Gamepad buttons are mapped to various weapon operations (including firing and reloading) and aiming functions (such as the optic zoom level).

The CH Products Pro Pedals slide back and forth and also move up and down like accelerator pedals. Pointman uses these inputs to control the avatar's lower body. The translational sliding (apart then together) is mapped to control the separation of the avatar's legs, simulating stepping when the avatar is upright and crawling when the avatar is prone. This allows users to take precise, measured steps when moving around obstacles or cover, and to continuously vary their speed over a realistic range of walking, running and crawling gaits. The up-down movement of the pedals is mapped to control the avatar's postural height via the flexing of the avatar's legs. This allows the avatar to continuously transition from standing tall to a low crouch (or kneel when the legs are apart), and when prone from hands-and-knees to belly-on-the-ground. The ability to precisely control their avatar's postural height allows users to make better use of cover and concealment, and to look and shoot out from behind cover while minimizing their exposure.
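The two pedal mappings above can be summarized in a short sketch: sliding the pedals apart and together drives the leg separation (stepping or crawling), while pressing them down flexes the legs to lower the posture. All ranges, names, and heights here are illustrative assumptions:

```python
# Hypothetical sketch of the pedal mappings: slide difference sets leg
# separation; pedal depression continuously lowers the postural height.

def leg_separation(left_slide: float, right_slide: float) -> float:
    """Separation of the avatar's legs from the sliding pedal positions.

    Slide values run 0 (pedal fully back) to 1 (fully forward); the
    difference between the two feet sets how far apart the legs are,
    so alternating the pedals produces a stepping cycle.
    """
    return abs(left_slide - right_slide)

def postural_height(depress: float, standing_height: float = 1.8,
                    min_height: float = 0.9) -> float:
    """Continuous postural height from pedal depression (0..1).

    0 = standing tall, 1 = lowest crouch; the avatar's legs flex to
    interpolate smoothly between the two, letting the user vary his
    exposure behind cover continuously rather than in discrete stances.
    """
    d = max(0.0, min(1.0, depress))
    return standing_height - d * (standing_height - min_height)
```

Because both channels are continuous, the user can take a measured half-step or settle into a stance a few centimeters lower, rather than toggling between a fixed stand, crouch, and prone.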

Integration with the Virtual Battlespace Combined Arms Simulator
The Virtual Battlespace combined arms simulators (VBS®2 and its upcoming successor VBS®3), from Bohemia Interactive Simulations (BIS), are used for training by the USMC, the US Army, and a number of the NATO armed forces. BIS worked closely with NRL to tightly integrate the Pointman interface with VBS, and to allow Pointman to control the posture of the user's avatar on a continuous basis. The detailed articulation of the user's avatar is made visible to other squad members running in a networked simulation. Pointman-enhanced VBS (VBS-Pointman) supports the operation of a wide range of small arms and additional forms of mobility, including climbing, swimming, and mounted roles (driver, passenger and gunner) using the full complement of manned vehicles.

Development and Assessment
Pointman was originally designed by Dr. James N. Templeman of NRL and implemented by Patricia Denbrook of DCS, Inc. NRL's Base Funding Program supported the initial development, and the Office of Naval Research (ONR) Rapid Technology Transition program office supported the integration of Pointman with VBS2. ONR's Human Performance, Training, and Education Thrust Area added its support in refining, demonstrating and assessing Pointman.

A Military Utility Assessment (MUA) of Pointman integrated with VBS2 was performed by the MEC (MarForPac Experimentation Center) at MCB Hawaii in September 2011. The squad of Marines that participated in the study (Golf Company, 2nd Battalion, 3d Marine Regiment) gave Pointman high marks for realism and usability. In response to a series of survey questions, the Marines felt Pointman allowed them to realistically: control viewing, perform tactical movements, control the virtual rifle, utilize cover, and control the avatar’s posture. They found it comfortable and easy to use, and felt that it enhanced the simulation. The primary recommendation of the MUA report was: “Transition the Pointman DISI (dismounted infantry simulation interface) enhancements into VBS2 to increase realism and efficacy as a virtual training aid.”

Future Enhancements
NRL is continuing to develop the Pointman interface as part of its ongoing research in expressive interaction for desktop simulation. This involves extending Pointman to include non-verbal communications (such as eye movements, facial expression, and arm gestures) needed to support team and cross-cultural interaction, without limiting tactical mobility. A driving application is the training of cultural interaction skills alongside warfighting skills, using training scenarios which pose a mix of tactical, cultural and ethical challenges.