SBIR-STTR Award

Bio-Inspired Autonomous Vision
Award last edited on: 2/23/2007

Sponsored Program
SBIR
Awarding Agency
DOD : DARPA
Total Award Amount
$848,450
Award Phase
2
Solicitation Topic Code
SB043-039
Principal Investigator
John Merchant

Company Information

RPU Technology Inc

173 Dedham Avenue
Needham, MA 02492
   (781) 444-9426
   merchant.j@comcast.net
   www.rpuinc.com
Location: Single
Congr. District: 04
County: Norfolk

Phase I

Contract Number: ----------
Start Date: ----    Completed: ----
Phase I year
2005
Phase I Amount
$98,684
Vision sensors are an essential component of autonomous systems of all types (UAVs, UGVs, ATR, etc.), which are now receiving rapidly increasing attention. However, physical vision systems that substitute for human vision in autonomous systems have far less capability for the visual recognition tasks that must be performed. This research and development exploits the fact that, to a large extent, the superior performance of human vision is due to the entirely different type, and much smaller quantity, of visual information it derives and uses so effectively. Whereas physical image sensors derive information by high-density Nyquist sampling, over more than 99.99% of the visual field human vision uses visual information derived by low-density variance sampling. The initial bio-inspired development is a practical demonstration of three autonomous-system recognition tasks that can be performed much more effectively using variance information instead of Nyquist information. This information is easily derived by variance sub-sampling the output of any conventional image sensor. A subsequent bio-inspired development would implement this (very simple) sub-sampling operation directly on the focal plane of the image sensor, just as is done in the human retina. The result would be a small, inexpensive, very high-performance robotic eyeball for autonomous systems.
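The abstract does not specify how variance sub-sampling is computed; a minimal sketch, assuming it means replacing each block of sensor pixels with the variance of that block (block size chosen arbitrarily here), might look like this in Python with NumPy:

```python
import numpy as np

def variance_subsample(image, block=8):
    """Hypothetical variance sub-sampling: replace each block x block
    patch of the sensor output with its sample variance, yielding a
    much lower-density representation than the Nyquist-sampled input."""
    h, w = image.shape
    h -= h % block  # crop so the image tiles evenly into blocks
    w -= w % block
    # Reshape into (row-block, row-in-block, col-block, col-in-block)
    blocks = image[:h, :w].reshape(h // block, block, w // block, block)
    # Variance over the within-block axes gives one value per block
    return blocks.var(axis=(1, 3))

# Example: a 64x64 checkerboard reduces to an 8x8 variance map,
# a 64-fold reduction in the number of samples.
img = (np.indices((64, 64)).sum(axis=0) % 2).astype(float)
vmap = variance_subsample(img, block=8)
print(vmap.shape)  # (8, 8)
```

The block size controls the trade-off between data reduction and spatial detail; nothing in the award text fixes that parameter.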

Phase II

Contract Number: ----------
Start Date: ----    Completed: ----
Phase II year
2006
Phase II Amount
$749,766
A bio-inspired collision-avoidance system. Vision sensors are an essential component of autonomous systems of all types (UAVs, UGVs, ATR, etc.), which are now receiving rapidly increasing attention. However, physical vision systems that substitute for human vision in autonomous systems have far less capability for the visual recognition tasks that must be performed. This research and development exploits the fact that, to a large extent, the superior performance of human vision is due to the entirely different type, and much smaller quantity, of visual information it derives and uses so effectively. Whereas physical image sensors derive information by high-density Nyquist sampling, over more than 99.99% of the visual field human vision uses visual information derived by low-density variance sampling. The initial bio-inspired development is a practical demonstration of three autonomous-system recognition tasks that can be performed much more effectively using variance information instead of Nyquist information. This information is easily derived by variance sub-sampling the output of any conventional image sensor. A subsequent bio-inspired development would implement this (very simple) sub-sampling operation directly on the focal plane of the image sensor, just as is done in the human retina. The result would be a small, inexpensive, very high-performance robotic eyeball for autonomous systems.

Keywords:
VISION, SAMPLING, VARIANCE, RECOGNITION, FOVEA, AUTONOMOUS, EYE, NYQUIST