In addition, ARES yielded lower perceived workload scores, as measured by the NASA Task Load Index (NASA-TLX), than the traditional sand table.[4]
The ARES project was one of 25 ARL initiatives in development from 1995 to 2015 that focused on visualizing spatial data on virtual or sand table interfaces.[1]
Collaborators on ARES included Dignitas Technologies, Design Interactive (DI), the University of Central Florida's Institute for Simulation and Training, and the U.S. Military Academy at West Point.[6]
ARES was largely designed as a tangible user interface (TUI), in which digital information is manipulated through physical objects such as a person's hand.
It was built from commercial off-the-shelf hardware, including a projector, a laptop, an LCD monitor, and Microsoft's Xbox Kinect sensor, combined with government-developed ARES software.