SBIR-STTR Award

Real Time 3-D Modeling and Immersive Visualization for Enhanced Soldier Situation Awareness
Award last edited on: 7/11/2014

Sponsored Program
STTR
Awarding Agency
DOD : Army
Total Award Amount
$599,130
Award Phase
2
Solicitation Topic Code
A12a-T003
Principal Investigator
David Larose

Company Information

Carnegie Robotics LLC

4501 Hatfield Street
Pittsburgh, PA 15201
   (412) 251-0321
   info@carnegierobotics.com
   www.carnegierobotics.com

Research Institution

----------

Phase I

Contract Number: ----------
Start Date: ----    Completed: ----
Phase I year
2012
Phase I Amount
$99,575
We propose a rapid mapping and 3-D visualization system especially suited for tunnels, urban “canyons”, the insides of buildings, and other environments where GPS may be poor or unavailable. The system includes mobile Sensor Nodes (mobile robots in our Phase I) that wirelessly supply compressed 3D range data and color imagery to a central Fusion Node. The Fusion Node runs 3D reconstruction and model-building algorithms, fusing data from one or more Sensor Nodes to generate an integrated model of the battlespace. Texture-mapped 3D imagery, drawn from this model, is delivered to unit leaders and soldiers via Dismount Nodes (hand-held devices or smartphones), allowing overhead and immersive display of the accumulated model. This capability enables a range of activities including mission planning, coordination between troops, and tactical rehearsals. During engagement, the system helps increase situational awareness by allowing soldiers to “look through walls” and, if GPS is available, monitor their fellow team members’ locations.
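The node architecture described above can be illustrated with a minimal sketch. This is purely hypothetical: the award listing does not publish the system's actual interfaces, so all class and method names here (SensorPacket, FusionNode, ingest, render_for_dismount) are invented for illustration, and the real Fusion Node would register each packet into a common frame via GPS-denied SLAM rather than simply appending points.

```python
# Hypothetical sketch of the Sensor Node -> Fusion Node -> Dismount Node
# data flow described in the abstract. Names are illustrative only.
from dataclasses import dataclass, field


@dataclass
class SensorPacket:
    """Range data plus color imagery from one Sensor Node (decompressed)."""
    node_id: str
    points: list  # (x, y, z) range returns
    colors: list  # per-point (r, g, b) from the color camera


@dataclass
class FusionNode:
    """Accumulates packets from one or more Sensor Nodes into one model."""
    model_points: list = field(default_factory=list)
    model_colors: list = field(default_factory=list)

    def ingest(self, packet: SensorPacket) -> None:
        # A real Fusion Node would run 3D reconstruction and dataset
        # registration here; this sketch just accumulates the data.
        self.model_points.extend(packet.points)
        self.model_colors.extend(packet.colors)

    def render_for_dismount(self) -> dict:
        # The view a hand-held Dismount Node would draw (overhead or
        # immersive) from the accumulated battlespace model.
        return {"points": self.model_points, "colors": self.model_colors}


fusion = FusionNode()
fusion.ingest(SensorPacket("robot-1", [(0.0, 1.0, 2.0)], [(128, 64, 32)]))
view = fusion.render_for_dismount()
```

The point of the sketch is the topology, not the algorithms: many Sensor Nodes feed one Fusion Node, and many Dismount Nodes read from the single fused model.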

Keywords:
Situational Awareness, Colorization, GPS-Denied Mapping, Sensor Data Fusion, Dataset Registration, 3D Mapping, 3D SLAM, Immersive Visualization

Phase II

Contract Number: ----------
Start Date: ----    Completed: ----
Phase II year
2014
Phase II Amount
$499,555
We propose a rapid mapping and 3-D visualization system especially suited for the insides of buildings, tunnels, urban “canyons”, and other environments where GPS may be poor or unavailable. The system, which was fully demonstrated in our Phase I effort, includes mobile Sensor Nodes that wirelessly supply compressed 3D range data and color imagery to a central Fusion Node. The Fusion Node runs 3D reconstruction and model-building algorithms, fusing data from one or more Sensor Nodes to generate an integrated model of the battlespace. Texture-mapped 3D imagery, drawn from this model, is delivered to unit leaders and soldiers via Dismount Nodes (hand-held devices or smartphones), allowing overhead and immersive display of the accumulated model. This capability enables a range of activities including mission planning, coordination between troops, and tactical rehearsals. During engagement, the system helps increase situational awareness by allowing soldiers to “look through walls” and, if GPS is available, monitor their fellow team members’ locations. Our immersive system is based upon existing, proven technology: the robot, GPS-denied pose system, and colorized laser sensor are Carnegie Robotics products. The 3D colorized modeling and dataset registration software comes from our Research Institution partner, Carnegie Mellon University’s NREC.

Keywords:
Situational Awareness, Colorization, GPS-Denied Mapping, Sensor Data Fusion, Dataset Registration, 3D Mapping, 3D SLAM, Immersive Visualization