SBIR-STTR Award

Flexible and Robust Miniature Guidance & Navigation System
Award last edited on: 1/17/2018

Sponsored Program
SBIR
Awarding Agency
DOD : DARPA
Total Award Amount
$1,649,497
Award Phase
2
Solicitation Topic Code
SB151-006
Principal Investigator
Suresh K Kannan

Company Information

NodeIn

12 Deerfield Terrace
Burlington, CT 06013
Phone: (860) 288-2543
Website: www.nodein.com
Location: Single
Congr. District: 05
County: Hartford

Phase I

Contract Number: D15PC00147
Start Date: 8/11/2015    Completed: 6/21/2016
Phase I year
2015
Phase I Amount
$149,958
Our Flexible Optical Guidance and Navigation System (FOGNS) navigates reliably in widely varying visual conditions using a suite of small-size, low-weight, low-power, low-cost sensors. Visual odometry, optical flow, and low-cost inertial sensing data are fused into a single navigation solution with an accurate measure of its uncertainty. Sophisticated yet computationally feasible algorithms extract maximum information from these low-quality but inexpensive optical and inertial sensors to reduce navigation uncertainty. As information from one sensor degrades due to lighting or other factors, complementary sensors that are more effective in the new conditions are already contributing, keeping navigation uncertainty low; sensors drop in and out as sensing conditions change (plug-and-sense). The proposed algorithms provide a minimal set of knobs that tune the navigation solution for either high-speed flight or precise obstacle-relative navigation, tightly coupling perception and control for highly agile collision avoidance. A similar set of knobs is used to transition between fast topological mapping and highly accurate dense local maps when flying through tight corridors.
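
As a rough illustration of the plug-and-sense idea, the sketch below combines independent sensor estimates in information form, so a degraded sensor simply stops contributing for that cycle. This is a minimal sketch under assumed conditions (each sensor reports a full state estimate with a covariance); the names SensorReading and fuse are hypothetical, and the abstract does not disclose the actual fusion algorithm.

import numpy as np
from dataclasses import dataclass

@dataclass
class SensorReading:
    estimate: np.ndarray    # sensor's state estimate (e.g., body velocity)
    covariance: np.ndarray  # sensor's reported uncertainty
    valid: bool             # False when conditions have degraded the sensor

def fuse(prior_mean, prior_cov, readings):
    """Information-form fusion: every valid sensor adds information;
    an invalid sensor contributes nothing, so it 'unplugs' cleanly."""
    info = np.linalg.inv(prior_cov)
    vec = info @ prior_mean
    for r in readings:
        if not r.valid:
            continue  # sensor has dropped out for this cycle
        R_inv = np.linalg.inv(r.covariance)
        info += R_inv
        vec += R_inv @ r.estimate
    fused_cov = np.linalg.inv(info)
    return fused_cov @ vec, fused_cov

Under this scheme the fused covariance shrinks as more sensors contribute, which matches the abstract's claim that complementary sensors keep navigation uncertainty low even as individual sensors degrade.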

Phase II

Contract Number: D16PC00104
Start Date: 6/22/2016    Completed: 6/21/2018
Phase II year
2016
Phase II Amount
$1,499,539
The Flexible and Robust Guidance and Navigation in this proposal uses vision-based sensors and an innovative adaptive nonlinear optimization-based sensor fusion method. Phase II will test navigation performance by collecting data sets in various visual environments and evaluating them using standardized benchmarks. Results will also be compared to standardized datasets. An innovative hybrid dense/topological map is created to generate obstacle avoidance trajectories. A vision-based reactive collision avoidance algorithm is also included. Finally, a planning algorithm is used to construct sequences of maneuvers needed to navigate a large environment for a single UAV and then expanded to include complex coordinated missions between multiple UAVs. The decoupling of sensors and fusion allows new sensors to be incorporated easily using plug-and-perceive approach. An optional LIDAR is used to provide 3D point cloud data. Multiple flight tests are planned during the program to demonstrate the systematic maturation of the systems technology readiness level.