We propose a rapid mapping and 3D visualization system especially suited to the interiors of buildings, tunnels, urban "canyons", and other environments where GPS is poor or unavailable. The system, which was fully demonstrated in our Phase I effort, includes mobile Sensor Nodes that wirelessly supply compressed 3D range data and color imagery to a central Fusion Node. The Fusion Node runs 3D reconstruction and model-building algorithms, fusing data from one or more Sensor Nodes to generate an integrated model of the battlespace. Texture-mapped 3D imagery drawn from this model is delivered to the unit leader and soldiers via Dismount Nodes (hand-held devices or smartphones), allowing both overhead and immersive display of the accumulated model. This capability enables a range of activities including mission planning, coordination between troops, and tactical rehearsals. During an engagement, the system increases situational awareness by allowing soldiers to "look through walls" and, if GPS is available, to monitor their fellow team members' locations. Our immersive system is based on existing, proven technology: the robot, the GPS-denied pose system, and the colorized laser sensor are Carnegie Robotics products, and the 3D colorized modeling and dataset registration software comes from our Research Institution partner, Carnegie Mellon University's NREC.
Keywords: Situational Awareness, Colorization, GPS-Denied Mapping, Sensor Data Fusion, Dataset Registration, 3D Mapping, 3D SLAM, Immersive Visualization