SBIR-STTR Award

Virtual Interfaces using Multi-Protocol, Augmented-Reality Activation-Based Control Transfer
Award last edited on: 3/4/23

Sponsored Program
STTR
Awarding Agency
NSF
Total Award Amount
$256,000
Award Phase
1
Solicitation Topic Code
HC
Principal Investigator
Dominick Lee

Company Information

Gyropalm LLC

14429 Catalina Street
San Leandro, CA 94577
(510) 320-3128
N/A
www.gyropalm.com

Research Institution

Purdue University

Phase I

Contract Number: 2151524
Start Date: 9/1/22    Completed: 8/31/23
Phase I year
2022
Phase I Amount
$256,000
The broader impact/commercial potential of this Small Business Technology Transfer (STTR) Phase I project is to provide virtual interfaces that can be dynamically customized to address the challenge of user adoption in augmented reality (AR) products. This project overcomes previous limitations of AR, such as the lack of interoperability, user acceptance, scalability, and application-aware user intent recognition. While the AR market is projected to grow, wearable collaboration has unexplored potential. Users empowered by this project may be able to collaborate easily with skilled experts as well as robotic interfaces in the same medium. Through contextually aware dynamic controls, manufacturing businesses can lower the risk of human error while increasing fulfillment capabilities, and employees working from home can stay connected and remain engaged with minimal compromise. The AR innovation proposed in this project has applications across multiple industries, including manufacturing, training, and emergency simulations.

This Small Business Technology Transfer (STTR) Phase I project aims to create a framework for scalable experimentation with virtual interfaces that empower users to search, tag, and store meaningful data using gesture interactions in a secure and private manner. This project seeks to provide context-based, hands-free interaction with wireless devices while minimizing the need to set up sensors (such as optical components or microphones) in controlled environments and while providing visual feedback and accessibility to the end user. A wrist-worn wearable captures pre-trained gesture data, and a pair of augmented reality (AR) glasses identifies and analyzes an object within the context of those gestures to control a mechatronic device, such as a robot, to perform occupational tasks. This dynamic process uses factors such as the user's gaze area, physical location, and active computer application. This improvement on gesture control may allow a separate visual unit, such as the AR glasses, to wirelessly receive and display gesture intents and to perform contextually aware, meaningful interactions, clearly distinguishing among multiple devices or digitally tagged objects associated with a specific use case.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
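The interaction loop the abstract describes, in which a wearable-recognized gesture is paired with the wearer's current context (gazed object, physical location, active application) before a device command is issued, can be illustrated with a short sketch. The Python code below is not from the award or from Gyropalm; every class, field, and rule name in it (GestureEvent, Context, IntentDispatcher, and the example "pinch" rule) is a hypothetical stand-in for the kind of dispatch logic described above.

    # Hypothetical sketch of context-aware gesture-to-command dispatch.
    # All names are illustrative; none come from the award text.
    from dataclasses import dataclass
    from typing import Callable, Dict, Tuple

    @dataclass(frozen=True)
    class GestureEvent:
        """A pre-trained gesture recognized by the wrist-worn wearable."""
        name: str          # e.g. "pinch", "rotate_cw"
        confidence: float  # classifier confidence, 0.0-1.0

    @dataclass(frozen=True)
    class Context:
        """Context observed by the AR glasses when the gesture arrives."""
        gazed_object: str  # digitally tagged object in the user's gaze area
        location: str      # coarse physical location, e.g. "assembly_cell_3"
        application: str   # active computer application, e.g. "robot_console"

    class IntentDispatcher:
        """Maps (gesture, gazed object) pairs to device commands.

        The AR glasses would receive gesture intents wirelessly from the
        wearable, pair each one with the current context, and forward the
        resolved command to the matching mechatronic device.
        """
        def __init__(self, min_confidence: float = 0.8) -> None:
            self.min_confidence = min_confidence
            self._rules: Dict[Tuple[str, str], Callable[[Context], str]] = {}

        def register(self, gesture: str, gazed_object: str,
                     handler: Callable[[Context], str]) -> None:
            self._rules[(gesture, gazed_object)] = handler

        def dispatch(self, event: GestureEvent, ctx: Context) -> str:
            if event.confidence < self.min_confidence:
                return "ignored: low-confidence gesture"
            handler = self._rules.get((event.name, ctx.gazed_object))
            if handler is None:
                return "ignored: no rule for this gesture/object pair"
            return handler(ctx)

    # Example: a "pinch" gesture while gazing at a tagged robot gripper
    # closes that gripper; the same gesture on any other object is ignored.
    dispatcher = IntentDispatcher()
    dispatcher.register(
        "pinch", "robot_gripper",
        lambda ctx: f"close gripper at {ctx.location} via {ctx.application}",
    )

    event = GestureEvent(name="pinch", confidence=0.93)
    ctx = Context(gazed_object="robot_gripper",
                  location="assembly_cell_3",
                  application="robot_console")
    print(dispatcher.dispatch(event, ctx))

In this sketch, the same "pinch" gesture maps to different commands, or to none at all, depending on which tagged object the user is gazing at, which is one way to realize the clear distinction among multiple devices or digitally tagged objects that the abstract calls for.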

Phase II

Contract Number: ----------
Start Date: 00/00/00    Completed: 00/00/00
Phase II year
----
Phase II Amount
----