Augmented Reality Sandtable

The Augmented Reality Sandtable (ARES) is an interactive, digital sand table that uses augmented reality (AR) technology to create a 3D battlespace map. It was developed by the Human Research and Engineering Directorate (HRED) at the Army Research Laboratory (ARL) to combine the strengths of traditional military sand tables with current digital technologies, with the goals of better supporting soldier training and opening new possibilities for learning. It uses a projector to display a topographical map on the sand in an ordinary sandbox, along with a motion sensor that tracks changes in the layout of the sand so that the computer-generated terrain display adjusts accordingly.

An ARL study conducted in 2017 with 52 active-duty military personnel (36 men and 16 women) found that participants who used ARES spent less time setting up the table than participants who used a traditional sand table. ARES also yielded a lower perceived workload, as measured by the NASA Task Load Index (NASA-TLX), than the traditional sand table. However, there was no significant difference between the two groups in post-test scores for recreating the visual map.

Development
The ARES project was one of 25 ARL initiatives in development from 1995 to 2015 that focused on visualizing spatial data on virtual or sand table interfaces. It was developed by HRED's Simulation and Training Technology Center (STTC) with Charles Amburn as the principal investigator. Collaborators on ARES included Dignitas Technologies, Design Interactive (DI), the University of Central Florida's Institute for Simulation and Training, and the U.S. Military Academy at West Point.

ARES was designed primarily as a tangible user interface (TUI), in which digital information is manipulated through physical objects such as a person's hand. It was constructed from commercial off-the-shelf components (a projector, a laptop, an LCD monitor, and Microsoft's Xbox Kinect sensor) running government-developed ARES software. With the projector and Kinect sensor both mounted above and facing down at the surface of the sandbox, the projector provides a digital overlay on the sand while the Kinect sensor scans the surface of the map to detect any user gestures inside the boundaries of the sandbox.
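The core sensing-to-projection loop described above, reading a downward-facing depth frame, converting depth to sand height, and color-coding elevation for the projector, can be sketched roughly as follows. This is a minimal illustration using NumPy; the function names, sensor height, elevation bands, and palette are hypothetical and are not taken from the ARES software.

```python
import numpy as np

def depth_to_height(depth_mm, sensor_height_mm=1500):
    """Convert a downward-facing depth frame (distance in mm from the
    sensor to each point) into sand height above the table floor."""
    return np.clip(sensor_height_mm - depth_mm, 0, None)

def colorize_elevation(height_mm):
    """Map sand heights to an RGB overlay by elevation band.
    Thresholds and colors are illustrative, not from ARES."""
    bands = [
        (0,  (60, 120, 200)),   # low terrain: blue
        (40, (80, 170, 90)),    # mid terrain: green
        (90, (150, 110, 70)),   # high terrain: brown
    ]
    overlay = np.zeros(height_mm.shape + (3,), dtype=np.uint8)
    for threshold, color in bands:   # later (higher) bands overwrite earlier ones
        overlay[height_mm >= threshold] = color
    return overlay

# Simulated 2x2 depth frame from a sensor about 1.5 m above the table
depth = np.array([[1500, 1410],
                  [1460, 1380]])
overlay = colorize_elevation(depth_to_height(depth))
```

In a live system this conversion would run on every Kinect frame, so moving the sand immediately recolors the projected terrain.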

During development, researchers explored the possibility of incorporating ideas such as multi-touch surfaces, 3D holographic displays, and virtual environments. However, budget restrictions limited the implementation of such ideas.

ARL researchers showcased ARES for the first time at the Modern Day Marine exhibition in Quantico, Virginia, in September 2014.

Uses
According to a 2015 technical report by ARL scientists, ARES has the following capabilities:
 * Images, maps, and videos can be projected onto the sand table from a top-down point of view in real time.
 * Terrain and scenarios created in ARES can be imported into different simulation applications, such as Virtual Battlespace 3 (VBS3) and One Semi-Automated Forces (OneSAF).
 * Military symbols and graphics can be created, labeled, and placed onto the sand table to set up different scenarios.
 * Visual aids such as color schemes and contour lines can guide users in shaping the sand to replicate a previously saved 3D terrain.
 * Users can navigate the menus available in ARES with their hand, as if it were a mouse.
 * The ARES sensor can detect and track the presence of hand(s) to identify where the user points on the sand table.
 * A web camera can be used to communicate with other users and provide a top-down view of the sand table to collaborators.
 * AR-based tablet apps and specially-made note cards can be used to project images of different vehicles on the terrain.
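The terrain-replication aid in the list above (color cues that guide users toward a previously saved 3D terrain) amounts to projecting a signed difference map between the current sand surface and the saved target. A rough sketch of that idea follows; the tolerance value and the add/remove/matched color scheme are assumptions for illustration, not documented ARES behavior.

```python
import numpy as np

# Hypothetical guidance colors (RGB)
ADD_SAND = (0, 0, 255)      # blue: current surface is below the target
REMOVE_SAND = (255, 0, 0)   # red: current surface is above the target
MATCHED = (0, 255, 0)       # green: within tolerance of the target

def guidance_overlay(current_mm, target_mm, tolerance_mm=10):
    """Color each cell by how the current sand height compares to the
    saved target terrain, so users know where to add or remove sand."""
    diff = current_mm - target_mm
    overlay = np.empty(current_mm.shape + (3,), dtype=np.uint8)
    overlay[...] = MATCHED
    overlay[diff < -tolerance_mm] = ADD_SAND
    overlay[diff > tolerance_mm] = REMOVE_SAND
    return overlay

# Example: one row of sand heights vs. the saved terrain (mm)
current = np.array([[0.0, 50.0, 100.0]])
target = np.array([[30.0, 50.0, 50.0]])
overlay = guidance_overlay(current, target)
```

Projected onto the sand, such an overlay converges to a single "matched" color as the user reshapes the surface toward the saved terrain.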