VRR's existing, successful TRL 6 Fused Augmented Realities User Interface (FAR-UI) will be evolved to fulfill the Single Amphibious Integrated Precision Augmented-reality Navigation (SAIPAN) system requirements. Specifically, VRR's/CSU's FAR SAIPAN version shall be network linked to COBRA, MINENet Tactical, and JABS to fulfill the USN/USMC vision for the Navy's Amphibious Breaching System (ABS) by providing a modern version of the Augmented Reality Visualization for the Common Operating Picture (ARVCOP). The FAR Aerial and Ground versions are already network linked to Nett Warrior, the Tactical Assault Kit (both WinTAK and ATAK), the Marine Fires App, and other GOTS/COTS networks, providing the foundation for a modern SAIPAN ARVCOP. FAR innovatively includes an underlying correlated Synthetic Vision 3D terrain for accurately anchoring AR objects, plus new classes of enhanced situational-awareness visualizations. Innovations include: minimaps that use AR Gaze Guidance Lines to visually link objects/positions to their real-world counterparts (increasing performance and enhancing situational awareness while reducing the mental workload, delays, and errors of performing dimensional transforms); 360-degree bird's-eye views of ownship or any user-selected area via our Instant SA insets; placement of MIL-STD-2525 icons and graphics by manual input; our next-generation Touch&Speak, Gaze&Speak, Gaze&Touch, and Wearable AR Controls with Haptic Feedback, among other user-selectable AR options; fusion of IR with visuals; and support for AR optical see-through and video see-through on Android or Windows smartphones, tablets, laptops, and AR-HMDs, as well as modular add-ons to existing visual systems. A key strength of VRR's approach is that FAR SAIPAN versions run on HoloLens 2 AR-HMDs, which serve as functional emulators for the evolving Integrated Visual Augmentation System (IVAS, ~$0.5B and expanding) Program of Record AR-HMDs.
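The minimap-to-AR linking described above rests on projecting one shared world position into two views: a top-down minimap and the forward AR scene, so a guidance line can connect the two renderings of the same object. A minimal sketch of those two transforms is below, assuming a north-up minimap, a level pinhole camera, and east/north/up coordinates; all function names and frame conventions here are illustrative assumptions, not the FAR API:

```python
import math

def world_to_minimap(obj_en, own_en, map_size_px, map_range_m):
    """Project an east/north position onto a north-up square minimap
    centered on ownship. map_range_m is the half-width shown, in meters."""
    de = obj_en[0] - own_en[0]
    dn = obj_en[1] - own_en[1]
    scale = map_size_px / (2.0 * map_range_m)   # pixels per meter
    x = map_size_px / 2 + de * scale            # east -> right
    y = map_size_px / 2 - dn * scale            # north -> up
    return (x, y)

def world_to_screen(obj_enu, cam_enu, heading_rad, fov_rad, screen_w, screen_h):
    """Project the same point into the AR view with a simple pinhole model
    (level camera looking along heading_rad). Returns None if behind camera."""
    de = obj_enu[0] - cam_enu[0]
    dn = obj_enu[1] - cam_enu[1]
    du = obj_enu[2] - cam_enu[2]
    # Rotate into the camera frame: forward along heading, right 90 deg clockwise.
    fwd = dn * math.cos(heading_rad) + de * math.sin(heading_rad)
    right = de * math.cos(heading_rad) - dn * math.sin(heading_rad)
    if fwd <= 0:
        return None
    f = (screen_w / 2) / math.tan(fov_rad / 2)  # focal length in pixels
    x = screen_w / 2 + f * right / fwd
    y = screen_h / 2 - f * du / fwd
    return (x, y)
```

Drawing a gaze guidance line is then just connecting `world_to_minimap(...)` and `world_to_screen(...)` results for the same object, which is the dimensional transform the operator no longer has to do mentally.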
As wearable technology, AR-HMDs can be utilized on all of the required vehicles/ships, as well as on all other vehicles/ships needing only manual entry or network sharing of obstacles. It is important to emphasize that, beyond the preferred AR-HMD solutions, USN/USMC options also include running the FAR SAIPAN version on mobile devices and laptops with simple dashboard/cockpit mounts borrowed from civilian markets. If required, VRR, with our team partner Lockheed Martin, can go through the ship-modification processes for mounted capability on larger craft/vehicles. The FAR SAIPAN version adds support for amphibious operations as required, and utilizes the same core software and device options that USN/USMC and Army warfighters will use with FAR enhancements for: Call For Fire and Close Air Support (ONR/USMC sponsored); control of tactical drones [UAS, UGV, USV] (OSD/Army sponsored); and swarms of UAS (ONR/NESTT sponsored). The efforts of VRR's/CSU's cognitive science team will ensure that SAIPAN's goals for intuitive ARVCOP-type displays are achieved.
Benefit: Anticipated benefits of VRR's Fused Augmented Realities SAIPAN version include:
* Modern Augmented Reality Visualization for the Common Operating Picture (ARVCOP) displays and devices that complete the Navy's Amphibious Breaching System (ABS), with integrations to the three existing systems: (1) the Coastal Battlefield Reconnaissance Asset (COBRA), (2) MINENet Tactical, and (3) the JDAM Assault Breaching System (JABS).
* Intuitive displays of real-time information provided by both sensors and humans, integrated with existing libraries of information to provide a real-time common operational picture. In addition to the ABS components, we will reuse our already-successful integrations with Nett Warrior, TAK, and the Marine Fires App; essentially, any well-structured system or network that follows position location information (PLI) sharing standards.
* Multi-modal integrated precision augmented-reality navigation software that fuses GPS signals (e.g., GPS information from existing and future precision navigation suites such as the DoD Assisted GPS Receiver (DAGR) and MAPS) with the vehicle's/craft's inertial guidance data and obstacle data, to display virtual and/or augmented-reality marked lanes that an operator can use to assist in maneuvering. VRR also includes additional mode candidates: differential GPS (active), RTK GPS, passive 3D terrain pattern matching (assuming viewable landmarks), and a new class of AR graphics, named uncertainty blur graphics, that visually displays inaccuracy in obstacle locations.
* Flexible hardware display and device selections using Android or Windows operating systems, ranging from mobile systems (COTS smartphones and tablets with ruggedized cases, or ruggedized laptops from the Nett Warrior End User Device options), to add-on modules for existing display systems, to state-of-the-art AR-HMDs such as HoloLens 2, up to the IVAS Program of Record AR-HMDs.
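The "uncertainty blur graphics" idea, rendering an obstacle's positional inaccuracy as a visual blur, reduces to mapping a meters-level position uncertainty into screen pixels. A hedged sketch under a simple pinhole-camera assumption (the function name, parameters, and k-sigma convention are ours for illustration, not from the FAR software):

```python
import math

def uncertainty_blur_radius_px(sigma_m, distance_m, fov_rad, screen_w_px, k=2.0):
    """Map an obstacle's horizontal position uncertainty (1-sigma, in meters)
    to a screen-space blur radius in pixels via a pinhole projection.
    k selects the confidence band (k=2 covers roughly 95%)."""
    focal_px = (screen_w_px / 2) / math.tan(fov_rad / 2)  # focal length, pixels
    # Angular size of the k-sigma band, small-angle approximated, in pixels.
    return k * focal_px * sigma_m / max(distance_m, 1e-6)
```

For example, on an 800-pixel-wide display with a 90-degree field of view, a 1 m (1-sigma) uncertainty on an obstacle 100 m away blurs over roughly an 8-pixel radius; the blur grows as the craft closes on the obstacle or as the fix degrades, which is exactly the cue the operator needs.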
Further, IVAS also includes IR sensors to fulfill the SAIPAN night-use requirements, as well as integration with advanced small-arms weapon sights to provide even longer-range visuals.
* Navigational and situational-awareness enhancements for military/commercial/civilian large ships, small craft, and vehicles.
* FAR integrated options also include support for Call For Fire / Close Air Support, control of tactical drones or optionally manned vehicles for MUM-T use cases, and control of swarms.
* FAR for Control of Tactical Drones, already developed under ~$2.5M of OSD/Army sponsorship, is a single common controller interface analogous to SAIPAN's planned risk and cost reductions through this project's investment in a single integrated add-on augmented/virtual driver display that provides the required precision navigation capability.
* The FAR SAIPAN version is an end-to-end solution with integrated modules for embedded training, rapid terrain/sea generation (in seconds), mission planning, mission rehearsal, operational use, and after-action review/debriefing.
Keywords: precision, Augmented Reality, Precision Navigation, Navigation, Driver Display, virtual