NextGen will undoubtedly include unmanned aircraft systems (UAS), as legislated under the Federal Aviation Administration Modernization and Reform Act of 2012. The FAA is currently developing the regulatory framework for safely integrating small UAS (sUAS) into routine National Airspace System (NAS) operations. The introduction of UAS into the NAS offers advantages over manned aircraft for applications that are hazardous to human pilots, long in duration, or that demand greater precision or rapid response. Startup UAS companies have proposed using UAS for remote sensing, disaster response, delivery of goods, agricultural support, and many other beneficial applications. One significant aspect of an efficient NAS is the development of autonomous capabilities for UAS and of the technologies that support the safe implementation of UAS autonomy. AI proposes to support NASA's UAS autonomy effort by developing an imaging-sensor-based command and control system for autonomous UAS operations that takes advantage of the 3-axis accelerometers found in the smart devices prevalent in the consumer electronics market. The paradigm shift from human-piloted systems to autonomously operated aircraft will require successful development of both sensor technology and image processing techniques that allow a systematic translation of the human-to-machine interface. While the majority of current UAS operators are trained UAS pilots, the supply of these specialized pilots may not meet the demand anticipated from future commercial UAS companies. Additionally, more intuitive piloting controls are needed to enable widespread adoption of this technology and the subsequent evolution toward more autonomous operation. The work described in this proposal will provide the foundation for development of a UAS with intuitive flight controls.
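
As an illustrative sketch of the kind of intuitive control mapping described above, the example below converts a smart device's 3-axis accelerometer reading into clamped roll/pitch attitude commands. The read_accelerometer() stub, the tilt formulas, and the command limit are assumptions introduced for illustration only; they are not elements of the proposed system's actual interface.

```python
import math

# Assumed command limit on roll/pitch, in degrees (illustrative value only).
MAX_ATTITUDE_DEG = 25.0

def read_accelerometer():
    """Stub for a device-specific accelerometer read, returning (ax, ay, az) in g.
    A real implementation would query the smart device's sensor API."""
    return 0.10, -0.05, 0.99  # placeholder: device held nearly level

def tilt_to_attitude_command(ax, ay, az, max_deg=MAX_ATTITUDE_DEG):
    """Convert the sensed gravity vector into clamped roll/pitch commands (degrees)."""
    roll = math.degrees(math.atan2(ay, az))                    # tilt about the x-axis
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))  # tilt about the y-axis
    clamp = lambda v: max(-max_deg, min(max_deg, v))
    return clamp(roll), clamp(pitch)

if __name__ == "__main__":
    ax, ay, az = read_accelerometer()
    roll_cmd, pitch_cmd = tilt_to_attitude_command(ax, ay, az)
    print(f"roll command: {roll_cmd:.1f} deg, pitch command: {pitch_cmd:.1f} deg")
```

In this sketch, tilting the device acts as a direct, intuitive stand-in for stick inputs: the gravity vector sensed by the accelerometer is mapped to bounded attitude commands, which a flight controller could then track.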