SBIR-STTR Award

Multimodal Interaction Technologies to Support Small Unit Leaders
Award last edited on: 9/7/2022

Sponsored Program
SBIR
Awarding Agency
DOD : Navy
Total Award Amount
$138,247
Award Phase
1
Solicitation Topic Code
N202-133
Principal Investigator
Casey Sapp

Company Information

VRTUL Inc

1617 Burgundy Road
Encinitas, CA 92024
   (407) 718-9156
   N/A
   www.vrtul.co
Location: Single
Congr. District: 49
County: San Diego

Phase I

Contract Number: N68335-21-C-0610
Start Date: 6/22/2021    Completed: 12/23/2021
Phase I year
2021
Phase I Amount
$138,247
Technology has historically been physical: running a Google search or calling a friend requires pushing buttons and carrying equipment, and accessing large quantities of data requires numerous physical screens, which means more wires, space limitations, and further IT complications. VRTUL proposes to enable a graceful transition between human-computer interaction technologies that maximizes hand mobility and decreases IT requirements, using a three-pronged approach:
1) Identify procedures and processes that can become "hands-free" - Using the current state of the art in multimodal input/output (I/O) hardware and methodologies, any task that limits hand mobility will be re-examined for impact and efficiency. Gestures, voice, and hands-free screen controls will take precedence over physical inputs. Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR) technologies will be emphasized. (AR/VR/MR are henceforth referred to as "XR," an umbrella term for all immersive technologies.)
2) Using Unity software, deploy a "Virtual Command Center" application that integrates spatial data streams asymmetrically into all XR hardware, tablets, phones, and wearables - Whether the device is a tablet or a wearable, a spatial database will be universally accessible and capable of asymmetric multi-user experiences. The spatial database comprises UxS sensor data, map and terrain information, and vehicle navigational controls. "Asymmetric" refers to a multimodal visualization methodology in which all devices can interact with the same information at the same time.
3) Seamless UX (User Experience) access - The applications are simple to use, the multimodal methodologies are easy to understand, and the deployment instructions are easy to train.
The significance of this opportunity is that soldiers will have increased mobility and greater robotic precision.
Eliminating wires also means fewer space limitations, less strain on IT, and simpler setup of command bases around the globe. With one Unity log-in, any soldier would be able to visualize and take control of the data resources, and even the UUV robotic controls, needed to complete a task. Phase I has four objectives:
- Objective 1 - Identify use cases through common tasks and mission scenarios for warfighter and UxS teaming.
- Objective 2 - Identify XR end-to-end tool kits for controlling and monitoring UxS in priority environments.
- Objective 3 - Provide pros and cons of each I/O modality and the associated human-factors principles for design.
- Objective 4 - Provide concept HUD designs, diagrams, and prototype systems for proof-of-concept testing.
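The "asymmetric" multi-user idea described in the abstract (one shared spatial database, many heterogeneous devices interacting with the same records at the same time) can be illustrated with a minimal publish/subscribe sketch. This is a hypothetical illustration, not VRTUL's actual Unity implementation; names such as SpatialStore and Device are invented for the example.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class SpatialStore:
    """Single source of truth: UxS sensor data, terrain, nav state."""
    records: dict = field(default_factory=dict)
    subscribers: list = field(default_factory=list)

    def subscribe(self, callback: Callable[[str, object], None]) -> None:
        self.subscribers.append(callback)

    def update(self, key: str, value: object) -> None:
        # Any device may write; every subscribed device is notified.
        self.records[key] = value
        for notify in self.subscribers:
            notify(key, value)

@dataclass
class Device:
    """A client (XR headset, tablet, wearable) with its own local view."""
    name: str
    view: dict = field(default_factory=dict)

    def attach(self, store: SpatialStore) -> None:
        # Mirror every store update into this device's local view.
        store.subscribe(lambda k, v: self.view.__setitem__(k, v))

store = SpatialStore()
headset, tablet = Device("XR headset"), Device("tablet")
headset.attach(store)
tablet.attach(store)

# A UUV position update from any source appears on every device at once.
store.update("uuv-1/position", (32.77, -117.07))
```

Each device renders its local view in its own modality (3D in a headset, 2D on a tablet), which is what makes the sharing "asymmetric" rather than mirrored screens.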

Phase II

Contract Number: ----------
Start Date: 00/00/00    Completed: 00/00/00
Phase II year
----
Phase II Amount
----