SBIR-STTR Award

Vision-Based Obstacle Avoidance Using Active Scene Segmentation
Award last edited on: 5/27/2008

Sponsored Program
SBIR
Awarding Agency
DOD : Navy
Total Award Amount
$1,012,833
Award Phase
2
Solicitation Topic Code
N04-178
Principal Investigator
Eric J Corban

Company Information

Guided Systems Technologies Inc (AKA: GST)

630 Red Oak Road
Stockbridge, GA 30281
   (770) 898-9100
   corban@mindspring.com
   www.guidedsys.com
Location: Single
Congr. District: 13
County: Henry

Phase I

Contract Number: N00014-05-M-0002
Start Date: 10/14/2004    Completed: 7/14/2005
Phase I year
2005
Phase I Amount
$70,000
Successful operation of autonomous flight systems in uncertain environments is currently limited by the lack of practical obstacle avoidance systems. In the proposed effort, we shall exploit recent and ongoing advances in the fields of image processing, estimation, real-time path planning, and guidance/control to demonstrate successful flight through a cluttered and uncertain 3-D urban environment using simple imaging sensors and appropriate custom-developed processing hardware. In particular, we propose the innovative use of variational methods to dynamically segment scenes, leading to a fast and natural approach to estimating the location of unknown 3-D obstacles. During Phase I, the feasibility of real-time algorithm execution will be established on current-generation processors, and robustness to transient sensor data, distortion, and obscuration will be evaluated. A systems engineering analysis will be used to fully characterize system performance as it relates to a variety of design parameters and constraints. In Phase II, efficient means to produce the level of image understanding required for successful operations will be further developed, the algorithms will be fully realized in hardware and software, and the prototype system will be evaluated in UAV flight operations over urban terrain.
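
As an illustration of the variational segmentation idea mentioned above (not the proposer's actual algorithm, which is not described in the abstract), the following Python sketch splits a grayscale frame into two piecewise-constant regions in the spirit of the Chan-Vese energy, alternating between updating the two region means and reassigning pixels to the nearer mean. The curvature/smoothness term and the mapping from image segments to 3-D obstacle estimates are omitted; all names and parameters are illustrative assumptions.

    # Illustrative sketch only: two-phase piecewise-constant segmentation in the
    # spirit of variational (Chan-Vese style) methods, smoothness term omitted.
    import numpy as np

    def segment_two_phase(frame, iterations=20):
        """Alternate between updating the two region means and reassigning each
        pixel to the nearer mean, which descends the data term of the
        piecewise-constant segmentation energy."""
        labels = frame > frame.mean()          # initial partition
        for _ in range(iterations):
            c1 = frame[labels].mean() if labels.any() else 0.0
            c2 = frame[~labels].mean() if (~labels).any() else 0.0
            new_labels = (frame - c1) ** 2 < (frame - c2) ** 2
            if np.array_equal(new_labels, labels):
                break                          # converged
            labels = new_labels
        return labels                          # boolean mask: candidate obstacle region

    # Example: segment a synthetic frame containing a bright "obstacle" patch.
    frame = np.zeros((120, 160))
    frame[40:80, 60:100] = 1.0
    frame += 0.05 * np.random.randn(*frame.shape)
    mask = segment_two_phase(frame)
    print("segmented obstacle pixels:", int(mask.sum()))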

Phase II

Contract Number: N00014-07-C-0150
Start Date: 3/12/2007    Completed: 3/12/2008
Phase II year
2007
Phase II Amount
$942,833
A program is proposed to develop, flight-demonstrate, and transition a robust capability to autonomously detect and avoid obstacles and features typically encountered in three-dimensional flight by relatively small fixed- or rotary-wing unmanned air vehicles in low-altitude urban operations. There are two basic parts to the problem: (1) continually detecting and modeling the observable 3-D obstacle field in real time, and (2) autonomously guiding the vehicle to accomplish a given set of mission objectives while ensuring there are no collisions with obstacles. The key innovation of the proposed approach is development of a technique that accomplishes the stated objective using only a single 2-D image stream (available on essentially every small unmanned air vehicle in the inventory today). Demonstrated Phase I algorithms will immediately move into practical implementation and evaluation on an existing low-cost flight test vehicle. Meanwhile, hardware and software design specific to integration on the Puma small unmanned air vehicle will be completed. Option 1 will demonstrate improvements in Puma operational effectiveness using the developed technology in simulation and will complete fabrication and development of Puma-specific hardware and software. A second option will provide for technology demonstration on a Puma flight system in collaboration with AeroVironment, Inc.
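
For the second part of the problem, guiding the vehicle toward mission waypoints without colliding with the estimated obstacle field, a common generic technique is artificial potential-field guidance. The Python sketch below illustrates that generic approach only; it is not the proposer's method, and the function name, gains, and obstacle representation are assumptions chosen for the example.

    # Illustrative sketch only: potential-field guidance toward a goal with
    # repulsion from estimated 3-D obstacles (generic technique, not the
    # proposer's algorithm; all names and gains are assumptions).
    import numpy as np

    def guidance_command(position, goal, obstacles, k_att=1.0, k_rep=2.0,
                         influence=10.0, v_max=5.0):
        """Return a velocity command (m/s) that attracts the vehicle toward the
        goal and repels it from any obstacle closer than `influence` meters."""
        position, goal = np.asarray(position, float), np.asarray(goal, float)
        cmd = k_att * (goal - position)                  # attractive term
        for center, radius in obstacles:                 # (center xyz, radius) pairs
            offset = position - np.asarray(center, float)
            dist = np.linalg.norm(offset) - radius       # distance to obstacle surface
            if 1e-6 < dist < influence:
                # repulsion grows as the vehicle nears the obstacle surface
                direction = offset / np.linalg.norm(offset)
                cmd += k_rep * (1.0 / dist - 1.0 / influence) / dist**2 * direction
            elif dist <= 1e-6:
                # inside the obstacle's safety radius: push straight out
                cmd += v_max * offset / (np.linalg.norm(offset) + 1e-6)
        speed = np.linalg.norm(cmd)
        return cmd if speed <= v_max else cmd * (v_max / speed)   # saturate

    # Example: fly toward a waypoint while skirting one building-sized obstacle.
    cmd = guidance_command(position=[50, 0, 30], goal=[200, 0, 30],
                           obstacles=[([60, 12, 30], 8.0)])
    print("velocity command:", np.round(cmd, 2))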

Keywords:
Unmanned Air Vehicle, Obstacle Detection, Obstacle Avoidance Guidance, Control, Image Processing, Vision, Autonomous Flight