The Navy needs automated ways to understand the air, land, surface, and subsurface surroundings of its unmanned systems from sensor data. ASRI's goal for this research effort is to expand the capability of the Navy's unmanned aerial systems (UAS) to simultaneously monitor, detect, track, and classify relevant targets in the land, surface, and subsurface domains. ASRI's solution will improve image understanding with multimodal fusion and artificial intelligence (AI)/machine learning (ML) algorithms that extract more information from lower-quality images. To accomplish this, ASRI proposes to build upon SIMITAR (System for Improved Multi-INT Target Acquisition and Recognition), ASRI's low size, weight, and power (SWaP), scalable system for onboard exploitation of airborne Multi-INT sensors. Under this effort, SIMITAR will be adapted for the Navy to add infrared (IR) sensing, improve hyperspectral imaging (HSI) processing, and incorporate AI/ML techniques that extract more information from lower-quality images. The system will be tested in littoral as well as land environments.
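To make the multi-INT fusion idea concrete, the sketch below shows one common approach: score-level fusion, in which each modality reports a detection confidence for the same candidate target and a weighted average yields a fused confidence. This is a minimal illustrative example only; the modality names, weights, and fusion rule are assumptions for illustration, not SIMITAR's actual design.

```python
# Hypothetical score-level multi-INT fusion sketch. Each modality
# (EO, IR, HSI -- names are illustrative) reports a detection
# confidence in [0, 1] for the same candidate target.

def fuse_confidences(scores: dict, weights: dict) -> float:
    """Weighted average of per-modality detection confidences.

    Modalities absent from `scores` (e.g. a degraded or dropped
    sensor feed) are excluded, and the remaining weights are
    renormalized so the fused score stays in [0, 1].
    """
    active = {m: s for m, s in scores.items() if m in weights}
    total_w = sum(weights[m] for m in active)
    if total_w == 0:
        return 0.0
    return sum(weights[m] * s for m, s in active.items()) / total_w

# Example: IR weighted higher (e.g. night operations), EO feed weak.
weights = {"EO": 0.2, "IR": 0.5, "HSI": 0.3}
scores = {"EO": 0.35, "IR": 0.90, "HSI": 0.75}
print(round(fuse_confidences(scores, weights), 3))  # -> 0.745
```

Renormalizing over the modalities actually present is what lets a fusion scheme like this degrade gracefully when one sensor's imagery is too poor to use, rather than dragging the fused confidence toward zero.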
Benefit: ASRI's solution will enable Navy UAS platforms such as the Blackjack/Integrator to autonomously detect, track, and classify potential threats with high accuracy in land, sea, and littoral environments. Capturing live multi-modal feeds from the onboard sensor payload, the system will apply multi-INT sensor fusion and AI/ML processing to analyze the scene in real time and identify threats with high confidence. The work performed under this contract to improve threat detection and classification in lower-quality imagery will expand the range of conditions in which the system can reliably operate and increase situational awareness. The solution leverages prior SBIR investment to reduce risk.
Keywords: Neural networks, Target Detection, Unmanned Aerial Systems (UAS), super-resolution, Machine Learning (ML), Artificial Intelligence (AI), ATR, data analysis