Enhancing Multi-Domain Training with AI-Driven Virtual and Augmented Reality

  • Published by JSOU/NATO SOF HQ

Recent years have seen a multitude of innovations in virtual reality, augmented reality, and AI-driven simulation technologies. How can these innovations be leveraged to enhance decision-making, adaptability, and strategic response among United States, partner nation, and NATO SOF, particularly in the context of complex, multi-domain operations? Specifically, how can synthetic environments be integrated with AI to create immersive and realistic training scenarios? How do these environments improve adaptability and strategic response across varied operational scenarios? And what are the measurable impacts of these technologies on decision-making accuracy and operational effectiveness?


  • Gillispie, Capt. John, "BBP on Advanced Tools and Concepts for Accelerated Expertise," SOS AUAR 24G, 25 pgs.  
    • This paper discusses how AR, VR, and simulations serve as adaptive training aids that accelerate the development of critical decision-making skills. It explains that AI and autonomous behaviors can be integrated into training to keep scenarios unpredictable, mirroring real-world complexity. By incorporating "shadowbox training" and cognitive task analysis, these technologies capture subject-matter-expert knowledge and expose trainees to unpredictable stimuli, rapidly strengthening their mental models, adaptability, and proficiency in complex environments.
  • Lee, Maj. Jonathan E.C., "Flying in the Metaverse: How to Evaluate Immersive Technologies for Flight Training," AFGC thesis, 2024, 27 pgs. 
    • This paper addresses the integration of XR into flight training to create synthetic, physics-based environments where operators can safely exercise perception, recognition, interpretation, and decision-making skills. It answers how these environments improve adaptability by exposing crews to degraded systems, diverse electromagnetic spectrum factors, and dynamic combat scenarios that cannot be safely or securely replicated in live, open-air training. The paper discusses measuring operational effectiveness through a "Training Effectiveness Ratio" and notes measurable impacts, such as a synthetic part-task trainer saving approximately 2.3 live sorties to reach required proficiency when compared to a control group.
  • Panteleon, Maj. Bridget K., and Maj. E. Minnenne Holloway, "Advancing Military Training: AI-Enhanced Wargaming and Large-Scale Exercises," AF Fellows portfolio (MIT, Lincoln Labs), 2024, 62 pgs. 
    • This paper explores how AI can transform military training by generating synthetic data, creating adaptive learning environments, and employing AI agents that mimic adversary tactics for realistic opposition. It explains that integrating AI into wargames and large-scale exercises improves strategic response and adaptability by personalizing training, adjusting complexity in real-time based on a unit's performance, and allowing trainees to use predictive analytics to evaluate the potential outcomes of their strategic decisions. The paper measures the impact of these technologies through the reduction of training event preparation time from months to minutes, optimizing resource utilization and enhancing joint force readiness.
  • Stein, Maj. Nicholas, "Military Training Extended into the Virtual Environment: A Learner-Centric Approach to Preparing Next Gen Warfighters," AFGC thesis, 2024, 54 pgs. 
    • This paper analyzes the effectiveness of Extended Reality (XR)—which encompasses VR, AR, and Mixed Reality (MR)—in developing cognitive, perceptual, and motor skills for military personnel. It answers how immersive environments improve decision-making accuracy by detailing programs like the Mixed Reality Learning Experience (MRLx) and Pilot Training Next (PTN), where trainees practice rapid situational assessments and emergency responses in safe, repeatable, 360-degree virtual environments. The paper notes measurable impacts, such as higher graduation rates with less physical flight time in pilot training, and explains how AI-assisted XR platforms provide immediate, objective feedback via biometric data (such as heart rate and eye tracking) to enhance decision reflection and operational effectiveness.