This project tests the feasibility of an enhanced location-based services system that helps people with visual impairments navigate and understand a physical environment through an innovative, real-time, wearable assistive technology. The system is an environment-description tool that delivers critical information at the point of need, providing instant descriptive and navigational data about a location. It extends traditional GPS, augmented reality, and wireless technologies to create a tightly coupled relationship among the user, the location, and contextually relevant environmental information. Such data includes key building information: entrances and exits, number of floors, stairs, and the location of pertinent features such as elevators, doors, restrooms, reception areas, and telephones.

Through lightweight wearable computing technologies, environmental data intelligently finds and adapts itself to the user, rather than the user having to find and adapt to information in obscure locations. A user is thus immediately empowered by accessing interactive mediapoints virtually and spatially positioned throughout the physical environment, yielding a user-centric methodology for environment description and navigation. The technology is based on a unique, real-time 3D spatial data network that delivers intelligent mediapoints derived from online databases, sensors, and communications over a multi-user wireless network.
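To make the mediapoint concept concrete, the following is a minimal Python sketch, assuming each mediapoint stores a 3D position plus a descriptive payload and feature tags, and that the wearable client queries for mediapoints within a radius of the user's current position. All names here (MediaPoint, nearby_mediapoints, the coordinate frame) are illustrative assumptions, not part of the project's actual implementation.

```python
from dataclasses import dataclass, field
from math import dist


@dataclass
class MediaPoint:
    """Hypothetical model: a virtual point anchored in 3D space carrying environmental data."""
    name: str
    position: tuple[float, float, float]  # (x, y, z) in meters, building-local frame (assumed)
    description: str                      # text to be rendered as speech for the user
    features: list[str] = field(default_factory=list)  # e.g. "elevator", "restroom"


def nearby_mediapoints(points, user_pos, radius_m=10.0):
    """Return mediapoints within radius_m of the user, nearest first.

    Sketches the "data finds the user" behavior described above: the client
    asks for everything relevant to its current position instead of the
    user having to hunt for information.
    """
    hits = [p for p in points if dist(p.position, user_pos) <= radius_m]
    return sorted(hits, key=lambda p: dist(p.position, user_pos))


if __name__ == "__main__":
    # Toy in-memory network; a real system would draw these from the
    # online databases and sensors the abstract describes.
    network = [
        MediaPoint("Main entrance", (0.0, 0.0, 0.0),
                   "Double doors; push bar to exit.", ["entrance", "exit"]),
        MediaPoint("Lobby elevator", (12.0, 3.0, 0.0),
                   "Elevator bank serving floors 1 through 4.", ["elevator"]),
        MediaPoint("Reception", (8.0, -2.0, 0.0),
                   "Reception desk with a staffed telephone.", ["reception", "telephone"]),
    ]
    for mp in nearby_mediapoints(network, user_pos=(10.0, 0.0, 0.0)):
        print(f"{mp.name}: {mp.description}")
```

In a deployed system, the in-memory list would presumably be replaced by the online databases and sensor feeds named above, with results pushed to the wearable over the multi-user wireless network rather than polled locally.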