The Augmented REality Sandtable (ARES)




ARL-SR-0340, October 2015
US Army Research Laboratory

The Augmented REality Sandtable (ARES)

by Charles R Amburn, Nathan L Vey, and MAJ Jerry R Mize
Human Research and Engineering Directorate, ARL

Michael W Boyce
Oak Ridge Associated Universities, Oak Ridge, TN

Approved for public release; distribution is unlimited.

REPORT DOCUMENTATION

Report Date: October 2015
Report Type: Special Report
Dates Covered: May 2014–September 2015
Title: The Augmented REality Sandtable (ARES)
Authors: Charles R Amburn, Nathan L Vey, Michael W Boyce, and MAJ Jerry R Mize
Performing Organization: US Army Research Laboratory, ATTN: RDRL-HRT-M, Aberdeen Proving Ground, MD 21005
Report Number: ARL-SR-0340
Distribution: Approved for public release; distribution is unlimited.

Abstract: The Augmented REality Sandtable (ARES) is a research testbed that uses commercial off-the-shelf products to create a low-cost method of geospatial terrain visualization with a tangible user interface that can be used for simulation and training. The projection technology, combined with a Microsoft Kinect sensor and a laptop, is intended to enhance traditional military sand tables. This report discusses the development of the system, its place among previous related work, and the research methodology and experimentation efforts to assess impacts on human performance. It also explains current, ongoing, and future research questions and capabilities and discusses collaborations and key leader engagements to date. The goal of this report is to provide a resource for researchers and potential collaborators to learn more about ARES and the opportunity to use its service-oriented architecture for the development of content for specific domains.

Subject Terms: augmented reality, military sand tables, simulation, training, tangible user interface
Responsible Person: Charles Amburn, 407-384-3901

Contents

List of Figures
List of Tables
1. Introduction
2. Definition of the Problem
3. ARES Components
4. Potential Research Payoffs
5. Relevance to the Army
6. Related Research
7. ARES Current Demonstration Capabilities
8. ARES Ongoing Research and Development
9. Desired Future Research Questions and Capabilities Development
10. ARES Research Methodology
11. ARES Research Experiments
12. ARES as a Service
13. ARES Reuse of Existing Technologies
14. Evaluations
15. Demonstrations and Key Leader Engagements
16. The Way Forward
17. References
List of Symbols, Abbreviations, and Acronyms
Distribution List

List of Figures

Fig. 1 ARES components
Fig. 2 Terrain recognition process
Fig. 3 Army Warfighter Outcomes supported by ARES
Fig. 4 Using a tablet to place units on ARES
Fig. 5 Augmented reality vehicles and buildings on ARES
Fig. 6 Tank combat game application
Fig. 7 Concept of mobile version of ARES
Fig. 8 An open ecosystem of cooperating services and capabilities that can be combined to meet end-user needs
Fig. 9 Cadet at West Point performing OPORD briefing on ARES
Fig. 10 ARES being briefed to General Via at AUSA 2015

List of Tables

Table 1 Past research related to ARES
Table 2 ARES reuse of existing technologies


1. Introduction

The US Army Research Laboratory (ARL) Human Sciences Campaign calls out the topic of Virtual/Mixed and Augmented Reality as one of its research aims for 2015–2035. The goal is to use human-machine interaction to support training (Army Research Laboratory 2014). Augmented reality (AR) is a type of virtual environment. In virtual reality (VR), the totality of the environment is computer generated; in AR, the real world is augmented by virtual objects or entities (Milgram and Kishino 1994; Azuma 1997). AR adds to the real-world environment, whereas VR replaces that environment. The Augmented REality Sandtable (ARES) is uniquely situated to address this goal using inexpensive, readily available commercial technology. ARES is a research project being conducted at the ARL Human Research and Engineering Directorate (HRED) Simulation and Training Technology Center (STTC) that combines a traditional military sand table with a Microsoft Kinect sensor to enable new possibilities for terrain visualization and learning via a tangible user interface (Ishii and Ullmer 1997; Ratti et al. 2004).

2. Definition of the Problem

Sand table exercises (STEXs) are historically recognized as an effective means to conduct tactical training with an emphasis on cognitive skill development and tactical decision making (Cohen et al. 1998; Wildland Fire Lessons Learned Center 2011). The ARES research testbed seeks to combine the positive tangible attributes of the traditional sand table with the benefits of the latest commercial off-the-shelf (COTS) digital technologies. One area of research for ARES is the tangibility aspects of simulation, also known as tangible user interfaces (TUIs). TUIs are interfaces in which digital information can be manipulated using physical objects in the world, such as the hand or a stylus (Ishii and Ullmer 1997); users have the capability to physically interact with the device through touch. This report discusses some of the existing research within the scope of TUIs and how it relates to the technology of ARES.

The desired end state is an augmented sand table platform that supports a variety of research, training, and operational needs. In recent years, several high-tech alternatives to the traditional sand table have been proposed, such as multitouch surfaces (Bortolaso et al. 2014), 3-dimensional (3-D) holographic displays (McIntire et al. 2014), and immersive virtual environments (Qi et al. 2005). However, the costs associated with developing, fielding, and sustaining such systems are excessive, especially when compared to the cost of a traditional sand table, and this cost has limited the adoption of these modern technologies (Sutton 2004).

3. ARES Components

The ARES proof-of-concept table is a traditional sand table filled with play sand and supplemented with low-cost COTS components, which are shown in Fig. 1. They include the following equipment:

- A commercial projector ($900)
- Microsoft's Kinect sensor ($200)
- A COTS laptop ($3,000)
- An LCD monitor ($400)
- Government-owned ARES software
- An optional web camera ($100)

Fig. 1 ARES components

The Kinect sensor scans the surface of the sand and detects user gestures above the sand (Fig. 2). The ARES software then creates a map of the sand topography and provides it to client applications. This allows ARES to respond dynamically to the changing shape of the sand based on user interaction and hand gestures.

Fig. 2 Terrain recognition process
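To make the terrain-recognition step concrete, the following is a minimal sketch of the kind of depth-to-heightmap conversion described above. It is illustrative only: the function name, calibration values, and normalization scheme are assumptions for this example, not the ARES software's actual implementation.

```python
import numpy as np

def depth_to_heightmap(depth_mm, table_depth_mm, max_sand_height_mm=150.0):
    """Convert a depth frame (millimeters from the sensor, as a Kinect
    reports it) into a normalized heightmap of the sand surface."""
    # Height above the table plane = calibrated empty-table distance
    # minus the measured depth at each pixel.
    height = table_depth_mm - depth_mm.astype(np.float32)
    # Clamp sensor noise, dropouts, and anything above the sand range
    # (e.g., a hand), which gesture logic would handle separately.
    height = np.clip(height, 0.0, max_sand_height_mm)
    # Normalize to [0, 1] so client applications can rescale freely.
    return height / max_sand_height_mm

# Example with a synthetic frame: an empty table 1 m below the sensor.
frame = np.full((424, 512), 1000.0)   # Kinect v2 depth resolution
heightmap = depth_to_heightmap(frame, table_depth_mm=1000.0)
```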

4. Potential Research Payoffs

This technology may provide a significant return on investment in many areas:

- Improved spatial awareness, battlespace visualization, and the development of a common operational picture
- Increased engagement, retention, and collaboration among students and operational Warfighters
- Decreased time to author virtual and constructive 3-D terrains and scenarios
- Joint/coalition wargaming and mission planning through networked ARES tables

5. Relevance to the Army

With an investment in its continued research and development, ARES will support an array of Army requirements. Army Warfighter Outcomes (Army Capabilities Integration Center 2012) applicable to ARES include the items listed in Fig. 3.

Training
- T-1: Training
- T-2: Realistic, Integrated and Mission Command-centric Training Environment
- T-3: Adaptive Training System
- T-4: Contemporary Leader Skills
- T-5: Tailored / Adaptable Learning and Training
- T-8: Tailorable Simulations through Gaming Applications
- T-9: Innovative Learning Models, Strategies and Tools

Mission Command
- MC-2: Create Common Situational Understanding
- MC-3: Mission Command On-The-Move
- MC-5: Enable Unified Action Partner Collaboration
- MC-6: Create, Communicate, and Rehearse Orders
- MC-10: Airspace Control in Unified Action

Human Dimension
- HD-6: Operational Adaptability and Decision-Making

Fig. 3 Army Warfighter Outcomes supported by ARES (Army Capabilities Integration Center 2012)

6. Related Research

Past research efforts related to ARES exist in academia, industry, and government. Table 1 compares this related past research with ARES.

Table 1 Past research related to ARES

First author (year) | Title | Type of interface
Kirby (1995) | NPSNET: Software Requirements for Implementation of a Sand Table in the Virtual Environment | Completely virtual interface using Modular Semi-Automated Forces
Vaglia (1997) | The Virtual Sand Table | Uses a stylus and voice recognition to interact with the interface
Alexander (2000) | Visualisation of Geographic Data in Virtual Environments | Projection; semi-immersive display with some interactivity
McGee (2001) | Creating Tangible Interfaces by Augmenting Physical Objects with Multimodal Language | Uses voice and pen recognition as well as token placement to support map-based exercises
Wisher (2001) | The Virtual Sand Table: Intelligent Tutoring for Field Artillery Training | Computer-based tutoring system with 2-D and 3-D perspective views
Ishii (2002, 2004) | Bringing Clay and Sand into Digital Design: Continuous Tangible User Interfaces | Projection-based sand/clay tangible user interface
Reitmayr (2005) | Localisation and Interaction for Augmented Maps | Projection-based map augmentation using a personal digital assistant (PDA) as an interaction device
Kobayashi (2006) | Collaborative Simulation Interface for Planning Disaster Measures | Projection-based tabletop
Couture (2008) | GeoTUI: A Tangible User Interface for Geoscience | Projection-based tabletop with multiple manipulators
Jung (2008) | Virtual Tactical Map with Tangible Augmented Reality Interface | Tactical map with browser and manipulators/tangible markers
Kalphat (2009) | Tactical Holograms in Support of Mission Planning and Training | Tested terrain holographic images with Soldiers to support missions
Hilliges (2009) | Interactions in the Air: Adding Further Depth to Interactive Tabletops | Uses sensors and rear projection to allow for manipulation above the tabletop
Haihan (2010) | Research on the Technology of Electronic Sand Table Based on GIS | Virtual environment
Martedi (2010) | Foldable Augmented Maps | Uses a head-mounted display (HMD) and maps with intersection dots; maps can be folded and still display augmentation
Tateosian (2010) | TanGeoMS: Tangible Geospatial Modeling System | Tangible geospatial modeling system that uses a laser scanner
Schneider (2011) | Benefits of a Tangible Interface for Collaborative Learning and Interaction | Projection-based tangible user interface to determine object layout
Zhang (2011) | AR Sand Table with VSTAR System | Augmented reality sand table with viewer and an attached camera
Harshitha (2013) | HCI Using Hand Gesture Recognition for Digital Sand Model | Uses projections, mirrors, cameras, and terrain with gesture control
Schneider (2013) | Preparing for Future Learning with a Tangible User Interface: The Case of Neuroscience | Underneath-tabletop projection, tangible user interface
Bortolaso (2014) | Design of a Multi-Touch Tabletop for Simulation-Based Training | Multitouch tabletop for command and control
Jianping (2014) | An Application Development Environment for Collaborative Training Sand Table | Tabletop tangible user interface
(2014) | GIS-Based Environmental Modeling with Tangible Interaction and a... | Follow-up to Tateosian (2010): tangible interface using Kinect and sand to form terrains
(2014) | Making a Virtual Sand Table Based on Unity 3D Technique | Full virtual environment
Zhao (2014) | Research and Design on Power Supply Network Sand Table Exercise System | Virtual display with elevation maps for power grids using sand table principles
Ma (2015) | Using a Tangible Versus a Multi-touch Graphical User Interface to Support Data Exploration at a Museum Exhibit | Tabletop tangible user interface with manipulators

7. ARES Current Demonstration Capabilities

ARES currently has the following capabilities:

- Projection of Images, Maps, and Video From Any Source onto the Sand: ARES can be logged into a VBS3 (Virtual Battlespace 3) or OneSAF (One Semi-Automated Forces) scenario, for example, and display a top-down "instructor's view" of the scenario in real time.
- Export of the User-Shaped Sand as a 3-D Terrain File: Created terrains (and potentially, scenarios) can be imported into a variety of simulation applications, such as VBS3 or OneSAF.
- Placement and Labeling of Units: A subset of Military Standard (MIL-STD) 2525C military symbols, icons, and tactical graphics can be created, labeled, and placed for mission planning; these plans ("scenarios") can then be saved for later reuse (Fig. 4).

Fig. 4 Using a tablet to place units on ARES

- Assisted Terrain Correlation: Through visual aids (e.g., color schemes and contour lines), ARES provides users with basic guidance in shaping the sand to match a previously saved 3-D terrain.
- Universal Serial Bus Human Interface Device: ARES can be recognized as a standard interface device (like a mouse or keyboard) so that users can plug ARES into a simulation and use hand tracking to navigate menus as if their hand were the mouse.

- Hand/Gesture Tracking: Detects and tracks the presence of a hand, or hands, above the sand and allows software responses to be linked to where a user points.
- Video Teleconference: An inexpensive commercial web camera and integrated software allow distributed users to communicate and collaborate in real time. Through the use of the integrated color camera on the Kinect sensor, users are also provided a top-down view of the collaborator's table.
- Augmented Reality (AR) Peripherals: Layers of data can be displayed through AR goggles or AR apps on a tablet. For example, with collaboration from Marine Corps Systems Command, helicopters that hover and fly routes above the sand have been demonstrated as an early concept. When the Soldier Training Enhancement Package (STEP) 3-D, another STTC research project, is incorporated, 3-D buildings and vehicles pop up from the terrain (Fig. 5).

Fig. 5 Augmented reality vehicles and buildings on ARES

- Contour Lines: The system can display terrain features and hypsometric displays (terrain contour lines) to mimic topographic maps.
- Line of Sight (LOS): The system allows for the dynamic display of a selected unit's LOS, based either on the sand's topology or on synthetic environment data provided in the Layered Terrain Format (LTF).
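As an illustration of how an LOS query over the scanned sand might work, the sketch below ray-samples a heightmap between two grid cells. This is a naive stand-in for the purpose of explanation, not the algorithm of the Tactical Terrain Analysis app that ARES actually uses (see Table 2); the function name and parameters are invented.

```python
import numpy as np

def has_line_of_sight(heightmap, observer, target, eye_height=0.05, samples=256):
    """Return True if `observer` can see `target` across the heightfield.

    heightmap: 2-D array of terrain heights; observer/target: (row, col).
    """
    (r0, c0), (r1, c1) = observer, target
    h0 = heightmap[r0, c0] + eye_height   # sight-line start height
    h1 = heightmap[r1, c1] + eye_height   # sight-line end height
    # Walk the straight sight line; it is blocked wherever the terrain
    # rises above the interpolated line height at that point.
    for t in np.linspace(0.0, 1.0, samples)[1:-1]:
        r = int(round(r0 + t * (r1 - r0)))
        c = int(round(c0 + t * (c1 - c0)))
        if heightmap[r, c] > h0 + t * (h1 - h0):
            return False
    return True
```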

8. ARES Ongoing Research and Development

The following ARES research and development efforts are ongoing:

- Effects of Different Mediums: Other mediums (e.g., different types of sand, clay, magnets, 3-D printed relief maps) offer different benefits, such as the ability to represent vertical surfaces. The effects of using these alternatives (realism, accuracy, immersion) are being investigated to understand how the selection of medium changes human performance factors.
- Expansion of Unit Libraries: More of the MIL-STD-2525C unit iconography is being incorporated to support wider applications.
- Shared Table States: Networked tables will be able to share and compare topography data and unit placement to support distributed collaboration (training, planning, operational support) across the room or across the globe (see the sketch following this list).
- Rapid Authoring of Training Content: Analysis and design of a user interface allowing end users to create their own instructional content on ARES, including the ability for learners to answer questions naturally by pointing to locations or units on the sand.
- Operational Unit Research and Evaluation: Coordination has begun to conduct experiments on usability, effectiveness, transfer, and operational impact with operational military units, such as the 3rd Infantry Division at Ft. Benning.
- Assisted Terrain Correlation: The existing base functionality will be examined for enhancements to speed and usability.
- Publishing of Architecture: The ARES architecture and Application Programming Interface are being published for community feedback and development of end-user applications.
- ARES' Relationship to Affect/Engagement: Research is planned to assess the affective change caused by interaction with ARES.
- Sample Application: Tank Combat Game: A simple tank combat game (Fig. 6) is being developed to evaluate, demonstrate, and document the capability for end users to easily develop their own custom apps that leverage the ARES platform.

Fig. 6 Tank combat game application

- ARES Scalability: Based on user requirements, a squad-sized version of ARES (7 x 4 ft) is under construction, and future scalability needs are being evaluated.
- After-Action Review Tools: Tools are being developed and evaluated to support after-action reviews of simulations or live training events.
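For the shared-table-states effort noted above, networked tables need some compact wire format for topography plus unit placements. The sketch below shows one hypothetical encoding; the message fields, compression choice, and unit record schema are assumptions for illustration, not ARES' published protocol.

```python
import base64, json, zlib
import numpy as np

def encode_table_state(table_id, heightmap, units):
    """Bundle one table's topography and unit placements into a JSON
    message that a peer table could render or diff against its own."""
    # Quantize normalized heights to 8 bits and compress; a raw float
    # grid would be needlessly large to ship across a network.
    grid = (np.clip(heightmap, 0.0, 1.0) * 255).astype(np.uint8)
    blob = base64.b64encode(zlib.compress(grid.tobytes())).decode("ascii")
    return json.dumps({
        "table_id": table_id,
        "grid_shape": list(grid.shape),
        "heights_b64": blob,   # zlib-compressed uint8 height grid
        # Hypothetical unit record: a MIL-STD-2525C symbol code plus a
        # grid position, e.g., {"symbol": "SFGPU-----", "row": 3, "col": 7}
        "units": units,
    })
```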

9. Desired Future Research Questions and Capabilities Development

The following list consists of capabilities we aim to develop and various research questions:

- Dynamic Projection Mapping: Support for complex surfaces so projected images do not look "warped" on the sand topology.
- Connection to Geographic Databases: Research is needed to determine the best method of providing cues that would prompt the user to shape the sand to correlate to a geographically specific area of operation.
- Hand/Finger Tracking for Unit Placement and Command: Incorporating tracking capabilities that will provide users the ability to select and move objects by pointing at locations on the sand. Refinement of hand-tracking sensitivity is needed to support the development of a gesture set that can be used to intuitively command and control units.
- Military Map Features: Inclusion of military overlay data, such as grid lines, military grid reference system coordinates, tactical landmarks, and elevation notation.
- Augmented Reality to Improve Data Visualization and Decrease Cognitive Load: Technology that will allow certain information to "pop" up off the sand when viewed through a mobile device or AR goggles. For example, users can see communications networks or tall structures that may impede air support above the topography.
- Affective Benefits of ARES as a TUI in a Military Context: A research study to determine whether ARES users see an increase in engagement, collaboration, and interactions and what effect that has on learning and performance outcomes.
- Incorporation into Live Events: Integration with Force XXI Battle Command Brigade and Below (Army) / Instrumented-Tactical Engagement Simulation System (Marines) technology to import and display unit locations and states in real time on ARES for improved understanding of the operational picture.
- Man-Portable Form Factor: Leverage COTS mobile device sensors and mini-projection technologies to provide a reduced-scale version that can be carried by a user and projected onto any surface (Fig. 7).

Fig. 7 Concept of mobile version of ARES

- Can ARES be used for more efficient terrain and scenario generation? The capability to author and export 3-D terrains and scenario settings in standardized formats will be developed and tested. This will allow subject matter experts (SMEs) and instructors to create content on ARES and export it to other simulation and game applications (OneSAF, VBS3, etc.) more efficiently than via traditional methods and without requiring the highly specialized skills of platform-specific scenario authors and 3-D artists (see the sketch following this list).
- Can distributed users collaborate more effectively? The distributed, accurate common operational picture on ARES will be used as a testbed to determine whether joint and coalition partners can collaborate on tasks, such as exercise planning and tactical wargaming, in a more intuitive and cost-effective manner.
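As one illustration of exporting an authored terrain in a standardized format, the sketch below writes a heightmap as a Wavefront OBJ triangle mesh. OBJ is chosen only as a widely readable example; the report does not specify which formats ARES targets, and the scaling parameters here are invented.

```python
import numpy as np

def export_heightmap_obj(heightmap, path, cell_size=1.0, height_scale=10.0):
    """Write a 2-D heightmap as a Wavefront OBJ triangle mesh."""
    rows, cols = heightmap.shape
    with open(path, "w") as f:
        # One vertex per grid cell; OBJ vertex indices are 1-based.
        for r in range(rows):
            for c in range(cols):
                f.write(f"v {c * cell_size} {heightmap[r, c] * height_scale} "
                        f"{r * cell_size}\n")
        # Two triangles per grid square, indexed row-major.
        for r in range(rows - 1):
            for c in range(cols - 1):
                i = r * cols + c + 1
                f.write(f"f {i} {i + 1} {i + cols}\n")
                f.write(f"f {i + 1} {i + cols + 1} {i + cols}\n")

# Example: export a random 64 x 64 terrain.
export_heightmap_obj(np.random.rand(64, 64), "sand_terrain.obj")
```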

10. ARES Research Methodology

ARES research began with a literature review of current sand table use, specifically STEXs (i.e., how they have been used historically and best practices for their use); studies on related topics, such as the benefits of tangible user interfaces; and the effectiveness of virtual sand tables and similar systems. A market survey was also done to discover the state of the art in related technologies.

Next, an analysis was conducted to identify potential use cases, military end users and stakeholders, and the core functionalities and features that users would require of ARES. This analysis included trips to the 199th Infantry Brigade at Ft. Benning and to The Basic School at Quantico Marine Corps Base to speak with and observe SMEs. SMEs from both services were enthusiastic about ARES and provided use cases where ARES is capable of enhancing training throughout their leadership courses. The findings of this analysis were used to define the initial capabilities and scope of the ARES proof-of-concept table along with identifying the required research.

Using only mission funding internal to STTC, we constructed an ARES proof-of-concept table to determine its feasibility for supporting research studies and to demonstrate the concept to potential stakeholders, collaborators, and users for requirements definition.

11. ARES Research Experiments

Several studies using ARES are being conducted to define the benefits of sand table interactions:

- Charles Amburn, principal investigator for the ARES project, is conducting a study with 2 main objectives: 1) to determine the extent to which ARES is capable of significantly improving spatial knowledge acquisition above and beyond existing training mediums (paper map and 2-D display of 3-D rendered objects) and 2) to provide a foundation for the establishment of a learner-centric environment using novel technology. Essentially, the hypothesis is that users viewing maps on the sand are more effective and efficient than those using 2-D printed maps or even 3-D digital representations presented on a 2-D plane (e.g., an LCD monitor or tablet).
- Michael Boyce is currently conducting a study, "The Effect of Topography on Learning Tactical Maneuvers", wherein he is assessing the effects of ARES' 3-D surface on learning tactical military knowledge. The Department of Military Instruction at the United States Military Academy (USMA) is assisting with the pilot study, which is planned to become a full study to be conducted at USMA in FY16.

12. ARES as a Service

ARES is built on a service-oriented architecture that is modular and extensible, allowing for customization to meet individual user needs. Now that the ARES software and interface protocols are published, other users (e.g., Army, Department of Defense [DOD], and other government agencies) can leverage the technology in new ways. This flexible approach and open interface allow for 2 primary benefits:

- ARES can be updated easily and cost effectively, taking advantage of the latest advances in sensor, projector, and augmented reality technologies.

- ARES becomes a platform, similar to iOS or Android, where users will create their own apps to support their individual needs.

Figure 8 illustrates the goal of extending core capabilities with applications and services to support a myriad of training and operational tasks.

Fig. 8 An open ecosystem of cooperating services and capabilities that can be combined to meet end-user needs
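To suggest what such an end-user "app" might look like against a service-oriented backend, the sketch below polls a terrain service and posts a unit placement. The host, endpoints, and payloads are entirely hypothetical; the actual published ARES interface protocols should be consulted for the real API.

```python
import json
import urllib.request

ARES_HOST = "http://ares-table.local:8080"   # hypothetical service address

def get_current_terrain():
    """Fetch the current terrain state from a (hypothetical) ARES service."""
    with urllib.request.urlopen(f"{ARES_HOST}/terrain/current") as resp:
        return json.load(resp)

def place_unit(symbol, row, col):
    """Post a unit placement to a (hypothetical) unit-placement service."""
    body = json.dumps({"symbol": symbol, "row": row, "col": col}).encode()
    req = urllib.request.Request(
        f"{ARES_HOST}/units", data=body,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status in (200, 201)
```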

13. ARES Reuse of Existing Technologies

ARES uses an open systems architecture that is modular and extensible. It already successfully leverages several existing technologies developed by ARL. Table 2 outlines the reused technologies.

Table 2 ARES reuse of existing technologies

Integrated technology | Technology description | Specific integration with ARES
Rapid Unified Generation of Urban Databases (RUGUD), http://www.rapidterrain.com | RUGUD is a government off-the-shelf data processing framework that can assist in the conversion of terrain data into alternate formats to be used by other applications. | RUGUD is used with ARES as a mechanism to translate other terrain formats into formats that ARES can understand.
Layered Terrain Format (LTF) (Peele et al. 2011) | LTF is a high-resolution 3-D terrain representation format that can be rendered on hardware- and software-constrained devices, such as a hand-held. | LTF is used with ARES as a part of its tablet-based interaction platform, facilitating mission planning and LOS.
Tactical Terrain Analysis (TTA) App (Borkman et al. 2010) | The TTA app provides a line-of-sight (LOS) service that can provide a fan representing the field of view at a given location, incorporating occlusion for areas that are blocked because of obstructions. | ARES uses the TTA app functionality to determine LOS given a particular terrain topology, such that it changes dynamically as the sand is manipulated.
Soldier Training Enhancement Package (STEP) 3-D App (Roberts and Chen 2009) | The STEP 3-D app is an augmented reality tool that allows buildings to appear in 3-D when viewed through a tablet. | STEP 3-D is used on ARES in conjunction with recognition markers to cause buildings to show 3-D height on the sand.

Upcoming integration | Technology description | Specific integration with ARES
Generalized Intelligent Framework for Tutoring (GIFT) (Sottilare et al. 2012) | GIFT is a modular, service-oriented architecture developed to assist in the authoring and development of intelligent tutoring systems. | GIFT will provide ARES with the capability to author adaptive training experiences that can adjust to the needs of an individual learner.
Web-Based Military Scenario Development Environment (WebMSDE) (Marshall et al. 2013) | WebMSDE is a scenario generation tool that can be executed within a browser. It has a robust user interface and can efficiently generate scenarios in a wide variety of formats. | WebMSDE may provide ARES with distributed OneSAF scenario generation capability as well as access to its existing scenario libraries.

14. Evaluations

Several instances of the ARES proof of concept have been strategically coordinated with stakeholders for continued evaluation of current capabilities and the development of future research requirements. For example, the Office of Naval Research (ONR) has provided funding for ARES to be evaluated in Quantico by the Marine Corps' The Basic School. Feedback has been positive, and evaluators expressed eagerness to run force-on-force training with networked ARES tables. Additional desired capabilities have been submitted through ONR, including the intent for ARES to scale up to cover the large 16-ft battalion-sized sand tables they use.

Per a March 2014 Memorandum of Understanding (MOU) with USMA at West Point, STTC provided an ARES prototype to the Engineering Psychology Program of the Department of Behavioral Sciences and Leadership at USMA, where first-class cadets (seniors) in the Engineering Psychology Program have conducted initial usability, workspace, and workload assessments as part of their final human factors design course. ARES is also being used by cadets learning to conduct Operational Order (OPORD) briefs as a part of their military instruction curriculum (Fig. 9).

Fig. 9 Cadet at West Point performing OPORD briefing on ARES (image courtesy of the Engineering Psychology Program)
