SBIR-STTR Award

DARPA-USAF Integration: Collaboration and Secure Tasking for Multi-Agent Swarms
Award last edited on: 5/11/22

Sponsored Program
STTR
Awarding Agency
DOD : AF
Total Award Amount
$799,963
Award Phase
2
Solicitation Topic Code
AF20C-TCSO1
Principal Investigator
Gareth Block

Company Information

Third Insight (AKA: Visual Semantics Inc)

4309 Adirondack Summit Drive
Austin, TX 78738
Phone: (936) 647-5517
Website: www.thirdinsight.ai

Research Institution

IHMC (Institute for Human and Machine Cognition)

Phase I

Contract Number: FA8649-21-P-0723
Start Date: 2/8/21    Completed: 5/8/21
Phase I year
2021
Phase I Amount
$49,999
Visual Semantics, Inc. (dba Third Insight) and its non-profit partner, the Institute for Human and Machine Cognition (IHMC), based in Pensacola, FL, are pleased to submit the attached Phase I STTR proposal, “DARPA-USAF Integration: Collaboration and Secure Tasking for Multi-Agent Swarms.” This effort sits within the “Information Technology” focus area and addresses adversary effort disruption, autonomous systems teaming, reasoning and intelligence, and trust and interaction between humans and autonomous systems. Third Insight and IHMC have developed complementary approaches that help DARPA and the USAF understand how multiple UAVs may collaborate in a decentralized manner.

Third Insight is working closely with AFRL, AFIMSC, and the National Guard under three ongoing USAF Phase II SBIRs to provide the DOD with an AI-driven software plug-in that imparts COTS Unmanned Aerial Vehicles (UAVs) with autonomous decision-making, navigation, and planning capabilities. While such capabilities are typically intended for GPS-denied and/or contested environments, the AI’s ability to build Situational Awareness at the edge is bearing its own fruit. Perhaps most exciting is our team’s realization that multiple UAVs can share their Situational Awareness and then collaboratively reason about it.

Our non-profit partner, Dr. Matthew Johnson at IHMC, is currently leading two DARPA projects (CREATE and ASIST) that focus on different aspects of autonomous systems teaming and human-machine coordination. IHMC has developed high-fidelity, 3D simulated versions of “Capture-the-Flag” to study the impact of collaboration on successful teaming. In the simplest case, UAVs on each team select from a pre-defined “playlist” of behaviors to defend the team’s flag, take offensive action, etc., based on their perceived state of the world (see the illustrative sketch following this abstract). The approach taken by DARPA is state-of-the-art, and we expect similar approaches to be taken by the two USAF Vanguard programs, Skyborg and Golden Horde.

The proposed STTR seeks to integrate teaming concepts from both the DARPA and USAF projects. Project goals are to assess the feasibility of combining both approaches so agents can:
- Build and share representations of dynamic threats (both to the agent and to the group)
- Communicate with neighboring teammates to highlight tactical and strategic advantages
- Balance individual goals and behaviors against those of the group
- Capture new knowledge, best practices, and lessons learned, and share them with the team

Machine-machine cooperation will be a key enabler for Skyborg and Golden Horde. Our STTR efforts will seek to identify customers inside AFSOC and AFIMSC, as well as the National Guard, which has a critical need for collaborative ISR and tracking in support of Disaster Response and Homeland Security. Lt Col Alex “Stoiky” Goldberg (TXANG), who is the TPOC for two of Third Insight’s SBIR Phase IIs, has provided a letter of support on behalf of this effort to highlight its urgency.
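The behavior-“playlist” selection and shared Situational Awareness described above can be illustrated with a minimal Python sketch. This is not the IHMC Capture-the-Flag code or Third Insight’s plug-in; every class, function, and scoring rule below (Perception, SharedAwareness, PLAYLIST, select_behavior) is a hypothetical stand-in, shown only to make the decentralized selection idea concrete.

```python
# Minimal sketch: agents broadcast local perception to a shared team picture,
# then pick from a pre-defined "playlist" of behaviors based on their perceived
# state. All names and scoring rules are hypothetical, not the actual systems.
from dataclasses import dataclass, field
from typing import Callable, Dict


@dataclass
class Perception:
    """One agent's local view of the world (grossly simplified)."""
    distance_to_own_flag: float
    distance_to_enemy_flag: float
    threats_nearby: int  # perceived hostile UAVs within sensor range


@dataclass
class SharedAwareness:
    """Team-level picture built from every agent's broadcasts."""
    reports: Dict[str, Perception] = field(default_factory=dict)

    def update(self, agent_id: str, perception: Perception) -> None:
        self.reports[agent_id] = perception

    def total_threats(self) -> int:
        return sum(p.threats_nearby for p in self.reports.values())


# The "playlist": each entry scores how appropriate a behavior is given the
# agent's own perception and the team's shared picture; the highest score wins.
Behavior = Callable[[Perception, SharedAwareness], float]

def defend_flag(p: Perception, team: SharedAwareness) -> float:
    return (1.0 + team.total_threats()) / (1.0 + p.distance_to_own_flag)

def attack_flag(p: Perception, team: SharedAwareness) -> float:
    return 1.0 / (1.0 + p.distance_to_enemy_flag) - 0.2 * p.threats_nearby

def evade(p: Perception, team: SharedAwareness) -> float:
    return 0.4 * p.threats_nearby

PLAYLIST: Dict[str, Behavior] = {
    "defend_flag": defend_flag,
    "attack_flag": attack_flag,
    "evade": evade,
}


def select_behavior(agent_id: str, p: Perception, team: SharedAwareness) -> str:
    """Broadcast local perception, then pick the behavior that best balances
    the agent's own state against the team's aggregate picture."""
    team.update(agent_id, p)
    return max(PLAYLIST, key=lambda name: PLAYLIST[name](p, team))


if __name__ == "__main__":
    team = SharedAwareness()
    # An agent near its own flag with threats nearby chooses to evade, while a
    # teammate near the enemy flag with a clear path chooses to attack.
    print(select_behavior("uav_1", Perception(10.0, 40.0, 2), team))  # evade
    print(select_behavior("uav_2", Perception(60.0, 5.0, 0), team))   # attack_flag
```

The hand-tuned scoring functions are only placeholders; in a real system they are where the balance between individual and group goals described in the bullet list above would live.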

Phase II

Contract Number: FA8649-22-P-0732
Start Date: 3/10/22    Completed: 6/12/23
Phase II year
2022
Phase II Amount
$749,964
The proposed STTR Phase II submitted by Visual Semantics, Inc. (dba Third Insight) and its non-profit partner, the Institute for Human and Machine Cognition (IHMC), based in Pensacola, FL, seeks to integrate Explainable AI (XAI) and human-machine teaming with the multi-agent, swarm-based command and control (C2) developed under ongoing DARPA and USAF SBIR Phase II projects. Third Insight’s Genesis™ autonomy software is a hardware-agnostic “software plug-in” that provides COTS and Blue UAS unmanned aerial and ground vehicles (UXVs) with 2D/3D perception, intelligent planning, and reasoning capabilities. Genesis is a patented neuro-symbolic reasoning engine that fuses pre-existing knowledge (semantic graphs) with robust perception (deep learning) to give UXVs real-time visual scene understanding and decision-making capabilities (see the illustrative sketch below).

Our partners at IHMC are performers on two DARPA projects (CREATE and ASIST) that produced highly scalable and robust frameworks for teaming and collaborative tasking of autonomous UAVs. Their well-known Capture-the-Flag games (both real and simulated) were the focus of our Phase I effort. Phase II will include additional teaming simulations and Search & Rescue (SAR)-focused field tests.

Third Insight and IHMC engaged technical SMEs from the AFRL Sensors Directorate during Phase I, and subsequently presented Phase I results (and Phase II plans) to AFSOC and SOCOM customers. Major Devin Beckwith, A5/AISUM Lead, and our AFRL TPOC have each signed the attached MOU. AFRL will additionally budget FY22 contract funds in support of Phase II SAR field tests and demonstrations. The Defense Innovation Unit (DIU) and two state entities (TX-SAR and MD-SAR) also support the Phase II impact on both offensive and defensive swarm applications. USAF Lt Col Alex “Stoiky” Goldberg, Joint Technology Acquisition Innovation Officer with DIU and the TPOC for two of Third Insight’s SBIR Phase IIs, has provided a letter of support to highlight this project’s urgency.

SOCOM has a national defense-related mission in the area of Artificial Intelligence for Small Unit Maneuvers (AISUM), and we believe technology development under the subject STTR topic may eventually contribute to solving a mission need. The main goals of our involvement in this project are to:
(1) develop multi-agent (swarm) techniques for collaborative tasking and command and control (C2);
(2) deploy Explainable AI (XAI) and C2 software components that interoperate with open-source architectures to enable decentralized and federated decision-making on board autonomous vehicles; and
(3) support AFRL-based testing and hand-off of our technology to AISUM-related Programs of Record (e.g., EOTECH and SOF Warrior).

Together, these capabilities and efforts will enable inter-agent collaboration and self-tasking among platforms whose differing capabilities (SWaP, sensing, perception, and reasoning) require real-time coordination and teamwork to address the
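To make the neuro-symbolic and XAI ideas above concrete, the following is a minimal Python sketch of how a perception output (a label plus confidence, standing in for a deep-learning detector) might be fused with a small table of symbolic knowledge to produce both a tasking decision and a human-readable rationale. It is not the Genesis API or any AFRL/SOCOM system; the labels, rules, threshold, and function names are hypothetical.

```python
# Minimal sketch of a neuro-symbolic decision with an explainable rationale.
# The KNOWLEDGE table stands in for a semantic graph; Detection stands in for
# a deep-learning perception output. All names and rules are hypothetical.
from dataclasses import dataclass
from typing import Dict, Tuple


@dataclass
class Detection:
    """Output of a perception model: what was seen and how confident it is."""
    label: str
    confidence: float  # 0.0 .. 1.0


# Tiny stand-in for a semantic graph: label -> (implied concept, suggested behavior).
KNOWLEDGE: Dict[str, Tuple[str, str]] = {
    "person":         ("possible_survivor", "loiter_and_report"),
    "smoke_plume":    ("possible_fire",     "map_perimeter"),
    "vehicle_convoy": ("possible_threat",   "track_at_standoff"),
}

CONFIDENCE_FLOOR = 0.6  # below this, ask a teammate to re-observe instead of acting


def decide(detection: Detection) -> Tuple[str, str]:
    """Return (behavior, explanation) for one detection.

    The deep-learning side supplies the label and confidence; the symbolic side
    supplies what that label implies and which behavior the implication warrants.
    The explanation string is the XAI hook: it states why the behavior was chosen.
    """
    if detection.confidence < CONFIDENCE_FLOOR:
        return (
            "request_reobservation",
            f"Saw '{detection.label}' at {detection.confidence:.2f} confidence, below the "
            f"{CONFIDENCE_FLOOR:.2f} floor, so a teammate is asked to confirm before acting.",
        )

    rule = KNOWLEDGE.get(detection.label)
    if rule is None:
        return (
            "continue_search",
            f"'{detection.label}' has no tasking rule in the knowledge graph; continuing search pattern.",
        )

    concept, behavior = rule
    return (
        behavior,
        f"Detected '{detection.label}' ({detection.confidence:.2f}); the knowledge graph maps it to "
        f"'{concept}', which tasks this agent with '{behavior}'.",
    )


if __name__ == "__main__":
    for det in (Detection("person", 0.91), Detection("smoke_plume", 0.45), Detection("bicycle", 0.88)):
        behavior, why = decide(det)
        print(f"{behavior}: {why}")
```

In a decentralized swarm, each vehicle could run this kind of decision locally and share only the chosen behavior and its rationale with teammates and operators, which is one way the federated decision-making and trust goals in (2) could fit together.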