Mathematical models of self-orientation predict what a person would feel or perceive in response to a given set of sensory inputs. We propose to develop a user-friendly mathematical model of human spatial orientation perception for applications in piloted aerospace vehicles. Specifically, the model will build upon current state-of-the-art models and will be incorporated into a software platform that supports 1) easy entry of motion cue data by non-expert users, possibly from limited datasets, 2) intuitive graphical aids for visualizing both aircraft and head orientation and the predicted orientation percepts, and 3) fundamental advances in the types of sensory cues included in the model (e.g., somatosensory, tactile, or visual attitude indicator cues) and their integration. Such a model could be used offline to aid the analysis of commercial and military aviation mishaps, especially those in which the pilot cannot give a firsthand account of the accident, or in real time as part of a new countermeasure system for spatial disorientation (SD).
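To illustrate the kind of prediction such models make (this is a generic textbook sketch, not the proposed model), the semicircular canals are often approximated as a first-order high-pass filter on angular velocity: during a sustained constant-rate turn, the perceived rotation rate washes out toward zero, which is one mechanism behind somatogyral illusions. The time constant (about 5.7 s) and the discrete formulation below are assumptions drawn from common vestibular modeling conventions:

```python
def perceived_rate(omega_actual, dt=0.01, tau=5.7):
    """Minimal high-pass model of semicircular canal dynamics.

    Sketch under assumed parameters: tau is a commonly cited canal
    time constant (~5-6 s); omega_actual is a list of actual angular
    velocities (deg/s) sampled every dt seconds, assumed to start
    from rest so the first sample is treated as a step input.
    Returns the predicted perceived angular velocity at each sample.
    """
    perceived = []
    p = 0.0      # current perceived rate
    prev = 0.0   # previous actual rate (rest before t = 0)
    for w in omega_actual:
        # discrete high-pass update: dp/dt = dw/dt - p / tau
        p += (w - prev) - (dt / tau) * p
        prev = w
        perceived.append(p)
    return perceived

# Sustained 60 deg/s yaw for 60 s: the model predicts the pilot
# initially perceives the full rate, then the percept decays to
# nearly zero even though the aircraft is still turning.
omega = [60.0] * int(60.0 / 0.01)
p = perceived_rate(omega)
print(round(p[0], 1), round(p[-1], 3))
```

A full perception model of the sort proposed would combine several such sensor pathways (otoliths, somatosensory and visual cues) with a multisensory integration stage, but even this one-channel sketch shows why a pilot in a prolonged turn can lose the sensation of turning entirely.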