SBIR-STTR Award

Epipolar-Plane Imaging for Robot 3D Vision
Award last edited on: 12/23/2023

Sponsored Program
SBIR
Awarding Agency
NSF
Total Award Amount
$1,224,257
Award Phase
2
Solicitation Topic Code
R
Principal Investigator
Henry (Harlyn) Baker

Company Information

EPIImaging LLC

414 Paco Drive
Los Altos, CA 94024
Phone: (650) 949-1052
Website: www.epiimaging.com
Location: Single
Congr. District: 18
County: Santa Clara

Phase I

Contract Number: 2015152
Start Date: 5/15/2020    Completed: 4/30/2021
Phase I year
2020
Phase I Amount
$224,850
The broader impact/commercial potential of this Small Business Innovation Research (SBIR) Phase I project will advance the development of detection systems for autonomous vehicles. The proposed technology takes advantage of trends in the price, performance, and quality of imagers and processors driven by the proliferation of these devices in smartphones. The technology also provides key capabilities such as integrated color information, detection over extended depths, scene segmentation and tracking, and better performance in inclement weather or under poor visibility. This leads to improvements in three-dimensional (3D) vision and object modeling that will have significant commercial impact in accelerating the development and deployment of autonomous and semi-autonomous vehicle navigation and assistance, yielding higher commuting efficiency, reduced traffic fatalities, reduced traffic congestion, and reduced pollution.

This Small Business Innovation Research (SBIR) Phase I project will establish the technical capabilities and advantages of passive, image-based, multi-camera Epipolar-Plane Imaging (EPI) analysis for autonomous vehicle (AV) ranging. The research objective is to advance the development of EPI analysis and compare it to Light Detection and Ranging (LiDAR) systems. The research will extend an existing EPI-based module with new hardware and software to achieve accuracy and precision comparable to LiDAR at distances of 200 m and beyond, feature discernment superior to LiDAR, higher levels of semantics in the presented range information, and operation in inclement weather and under discrete obscuration.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
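The ranging principle behind EPI analysis is compact: with a calibrated linear camera array, a scene point traces a straight line across the stacked epipolar-plane image, and the line's slope (disparity per camera step) fixes the point's depth as Z = f * b / d. A minimal sketch follows, assuming a rectified array with focal length f in pixels and inter-camera baseline b in meters; the function name and the numbers in the example are illustrative assumptions, not drawn from the project.

    # Hypothetical sketch: converting an EPI line slope to metric depth
    # for a rectified linear camera array (Z = f * b / d). All names and
    # values here are illustrative.
    def depth_from_epi_slope(disparity_px: float,
                             focal_length_px: float,
                             baseline_m: float) -> float:
        """EPI line slope (pixels of disparity per adjacent camera pair)
        converted to metric depth."""
        if disparity_px <= 0:
            return float("inf")  # point at (or beyond) infinity
        return focal_length_px * baseline_m / disparity_px

    # Example: a 2000 px focal length, a 5 cm inter-camera baseline, and
    # a 0.5 px-per-camera EPI slope place the point at 200 m, the range
    # regime the Phase I work targets.
    print(depth_from_epi_slope(0.5, 2000.0, 0.05))  # -> 200.0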

Phase II

Contract Number: 2242216
Start Date: 9/15/2023    Completed: 8/31/2025
Phase II year
2023
Phase II Amount
$999,407
The broader/commercial impact of this Small Business Innovation Research (SBIR) Phase II project seeks to improve robotic interactions with humans. Robots are already involved in large sectors of society, including logistics, manufacturing, autonomous navigation, video communication, remote supervision of complex mechanical maintenance/repair tasks, support in battlefields and disasters, and interactions in various training, educational, and interventional scenarios including telemedicine. This technology may offer more effective automation in the workplace through higher-quality 3D sensing, greater precision in visualization, and increased worker quality of life. The technology addresses the precision and reliability of passive 3D scene measurements.

This Small Business Innovation Research (SBIR) Phase II project addresses the acquisition of reliable and precise three-dimensional representations of a scene from passively acquired image data, for use in navigation, grasping, manipulation, and other operations of autonomous systems in unrestricted three-dimensional spaces. This has been a long-standing challenge in the computer vision field, with many efforts providing adequate solutions under certain conditions but lacking applicability across a breadth of applications. Other approaches typically deliver inaccurate results where there are, for example, repeated structures in the view, thin features, a large range in depth, or structures aligned with aspects of the capture geometry. Because they rely on matching features across images, current technologies fail when distinct features have similar appearance. This technology removes the uncertainty of that process through low-cost over-sampling, using a specific set of additional perspectives to replace the "matching" with deterministic linear filtering. Increasing the reliability and precision of 3D scene measurements will open new opportunities for robotic interactions with the world. Success in this project will advance the underlying light-field technology to broader application areas where human-in-the-loop operations using augmented reality/virtual reality (AR/VR) or mixed reality (such as remote collaboration and distance interaction) depend on accurate and responsive visualization and scene modeling, reducing the influence of vestibular and proprioceptive mismatch that can cause disruptive effects such as nausea.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
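The "deterministic linear filtering" claim can be made concrete. Because each scene point traces a straight line through a densely sampled EPI, its slope (and hence its depth) can be estimated entirely with linear filters (gradients plus smoothing) rather than a correspondence search. The sketch below uses a standard structure-tensor-style least-squares estimate; it illustrates the general EPI technique under assumed numpy/scipy availability, not EPIImaging's actual pipeline.

    # Illustrative sketch of matching-free EPI slope estimation using
    # only linear filters (a standard structure-tensor-style approach;
    # not the company's proprietary method).
    import numpy as np
    from scipy.ndimage import gaussian_filter, sobel

    def epi_disparity(epi: np.ndarray, sigma: float = 2.0) -> np.ndarray:
        """Per-pixel EPI line slope (disparity in pixels per camera step).

        epi: 2D float array, rows = camera index s, cols = image column x.
        Along an iso-intensity line x = x0 + d*s the gradients satisfy
        gx*d + gs = 0, so d = -gs/gx; smoothing the gradient products
        yields the least-squares estimate d = -<gx*gs> / <gx*gx>.
        """
        epi = epi.astype(float)
        gx = sobel(epi, axis=1)                # gradient along image axis
        gs = sobel(epi, axis=0)                # gradient along camera axis
        jxx = gaussian_filter(gx * gx, sigma)  # linear smoothing only,
        jxs = gaussian_filter(gx * gs, sigma)  # no correspondence search
        return -jxs / (jxx + 1e-9)             # guard textureless regions

Every operation above is a fixed convolution or a pointwise ratio, which is the sense in which the abstract's feature "matching" is replaced by deterministic filtering over the over-sampled set of perspectives.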