SBIR-STTR Award

AI-Driven Orientation and Mobility System for the Blind
Award last edited on: 7/26/2022

Sponsored Program
STTR
Awarding Agency
NSF
Total Award Amount
$1,224,998
Award Phase
2
Solicitation Topic Code
DH
Principal Investigator
Cagri Zaman

Company Information

Virtual Collaboration Research Inc

1035 Cambridge Street
Cambridge, MA 02141

Research Institution

Massachusetts Institute of Technology

Phase I

Contract Number: 1843464
Start Date: 2/1/2019    Completed: 1/31/2020
Phase I year
2019
Phase I Amount
$225,000
The broader impact/commercial potential of this Small Business Technology Transfer Phase I project lies in its two intended contributions: it will advance computer vision for navigation, and it will create unprecedented opportunity and independence for individuals with visual impairment. An estimated 253 million people around the world live with blindness or visual impairment and have difficulty navigating independently in new spaces. This lack of independence directly leads to disadvantages in obtaining an education and joining the workforce. This project, NavigAid, aims to create a breakthrough spatial intelligence aid, offered as a mobile application, that solves spatial tasks ranging from locating objects to identifying navigation paths to generating rich descriptions of the surroundings. Solving this problem will unlock an estimated direct economic benefit of $26B annually for people with visual impairment around the world by decreasing the need for human assistance. The main technological innovation of this project, Ally Networks, is a novel neural network architecture that can learn more robust representations than existing models can. These robust representations reduce the networks' errors and make them more dependable in situations that demand high reliability. Ally Networks thus represent a potential breakthrough for the field of computer-vision navigation.

This Small Business Technology Transfer Phase I project will introduce a novel spatial intelligence system, NavigAid, to assist individuals with visual impairment in crucial navigation tasks. NavigAid will generate contextually relevant, task-oriented spatial information from smartphone cameras. With NavigAid, users will be able to navigate independently in unfamiliar, complex environments, thus achieving unprecedented mobility.
NavigAid will advance assistive technologies by providing unprecedented services, such as locating objects and generating functionally relevant natural language descriptions of complex environments. The core innovation in this project is Ally Networks, a novel neural network architecture that learns robust spatial semantics rather than 2-dimensional representations. This unique multimodal learning strategy is a high-risk endeavor with broad impact if successful. Large-scale multimodal learning is difficult, and the technique's success would revolutionize the state of the art: neural network vision will transform from systems that fail in inscrutable ways to systems that never fail under circumstances in which human vision would not also fail. Key objectives of this project are to 1) develop and validate Ally Networks on benchmarks, 2) develop spatial problem solvers that address the most pressing needs of users with visual impairment, and 3) develop a test suite to evaluate the spatial problem solvers. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Phase II

Contract Number: 2026027
Start Date: 12/15/2020    Completed: 11/30/2022
Phase II year
2020
Phase II Amount
$999,998
The broader impact/commercial potential of this Small Business Technology Transfer Phase II project is to advance computer vision for navigation to support independence for individuals with visual impairment. Roughly 8 million individuals in the US live with blindness or visual impairment, which severely reduces a person's ability to independently navigate in new spaces and consequently limits participation in social and economic life. This lack of independence directly leads to disadvantages in completing formal education and joining the workforce. This project aims to create a breakthrough orientation and mobility system, delivered as an application that narrates environments and provides the ability to record and retrace routes in indoor environments. The core technological innovation is a multimodal neural network architecture capable of learning robust spatial representations and operating efficiently on mobile devices.

The proposed project will introduce a novel artificial intelligence (AI)-driven Orientation and Mobility System to assist individuals with visual impairment in crucial navigation tasks. The system will generate contextually relevant, task-oriented spatial information from smartphone cameras. The research tasks include: (1) development of optimization and data augmentation techniques for improved runtime performance on a broad range of mobile devices; (2) development of verbal narration of environments as well as route creation and tracking; and (3) co-design and testing with potential users. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.