SBIR-STTR Award

Hokkien Low Density Language Capability
Award last edited on: 4/27/2024

Sponsored Program
SBIR
Awarding Agency
DOD : SOCOM
Total Award Amount
$174,965
Award Phase
1
Solicitation Topic Code
SOCOM232-002
Principal Investigator
Sam Fok

Company Information

Femtosense Inc

903 Sneath Lane Suite 123
San Bruno, CA 94066
(314) 341-4030
N/A
www.femtosense.ai/
Location: Single
Congr. District: 15
County: San Mateo

Phase I

Contract Number: 2023
Start Date: ----    Completed: 8/29/2023
Phase I year
2023
Phase I Amount
$174,965
Femtosense is a tech startup deploying sparse AI using novel techniques that are far more affordable and easier to implement than traditional AI computational technologies. Femtosense's invention, the Sparse Processing Unit (SPU-001), is a TRL 7 processing chip poised to disrupt the $50B AI hardware market. It offers best-in-class SWaP-C (Size, Weight, Power, and Cost) trade-offs and, thanks to first-class support for sparse neural networks, can enable applications other accelerators cannot. Running sparse neural networks, as opposed to dense ones, yields a 10x reduction in required memory and a 100x reduction in power, with minimal or negligible loss in accuracy.

Femtosense designs its own bespoke sparse neural networks for the most challenging, high-value applications, many of which offer critical value to soldiers, officers, and other personnel throughout the DoD. Femtosense also offers tools that let engineers and researchers train, optimize, and deploy sparse versions of their own networks, open-source networks, or third-party networks to the SPU through common ML frameworks such as TensorFlow Lite and PyTorch.

Femtosense has commercialized this technology and is bringing it to market. SPU-001 has been silicon-validated, taped out in TSMC's 22nm ULL process. Femtosense is conducting evaluations with more than 20 companies to enable problem-solving applications in hearing aids, TWS earbuds/headsets, always-listening remote controls, TVs and displays, smartphones, and IoT sensors. Femtosense is currently developing SPU-002 to extend the capabilities of SPU-001 to the computer vision and natural language processing domains. SPU-002 will be optimized for transformer layers, the backbone of state-of-the-art large language models (LLMs), making it the best choice for SWaP-constrained language workloads such as live translation.
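To illustrate where the memory claim comes from, the sketch below compares dense weight storage against a compressed sparse row (CSR) style layout at 90% sparsity. This is a generic illustration, not Femtosense's actual on-chip format: value-only storage at 90% sparsity gives the headline 10x reduction, while a textbook CSR encoding gives roughly half that once index overhead is counted.

```python
# Illustrative sketch: storage footprint of a dense weight matrix vs. a
# CSR-style sparse encoding. Counts are in storage slots (values plus
# index overhead), not exact bytes, and the format is hypothetical.

def dense_slots(rows: int, cols: int) -> int:
    """Dense storage keeps every weight, zero or not."""
    return rows * cols

def csr_slots(rows: int, cols: int, sparsity: float) -> int:
    """CSR keeps only nonzero values, plus one column index per value
    and one row pointer per row."""
    nnz = round(rows * cols * (1.0 - sparsity))
    return nnz + nnz + (rows + 1)  # values + column indices + row pointers

rows, cols, sparsity = 1024, 1024, 0.90  # 90% of weights pruned to zero
dense = dense_slots(rows, cols)
sparse = csr_slots(rows, cols, sparsity)
print(f"dense: {dense} slots, sparse: {sparse} slots, "
      f"ratio: {dense / sparse:.1f}x")
```

Hardware with first-class sparse support (as the abstract describes for SPU-001) can also skip the multiply-accumulate work for the pruned weights entirely, which is where the power savings beyond the raw memory reduction come from.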

Phase II

Contract Number: H9240523P0014
Start Date: 4/1/2024    Completed: 00/00/00
Phase II year
----
Phase II Amount
----