The purpose of this project is to develop an eye- and voice-operated system to perform computer point-and-click operations in a moving vehicle. The eye- and voice-driven protocols will replace operations currently performed by hand with a mouse or trackball, which are unwieldy in a moving vehicle, where it is difficult to maintain steady hand movements. To point at and click on an icon, the operator simply looks at the icon and speaks a key command word such as "click." Different command words designate alternative click types, such as left click, right click, double click, and drag-and-drop (see the sketch at the end of this summary). Ultimately, more advanced speech recognition will also replace the keyboard for data and text entry. The combined eyetracker and speech recognition system will permit full and efficient control of a computer console without typing or manually manipulating a mouse or trackball.

Related application areas include:
Human Computer Interaction: Command and Control, Situation Awareness, Office Automation, Usability Analysis, Aid for People with Disabilities.
Psychological Research: Fatigue Monitoring, Task/Scan Analysis, Advertisement Analysis.
Physiological Analysis: Visual Response Testing, Reading Diagnostics.
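The command-word protocol can be illustrated with a short sketch in which gaze supplies the pointing coordinates and speech supplies the discrete selection event. This is a minimal illustration only: get_gaze_position() and listen_for_command() are hypothetical placeholders for the eyetracker and speech recognition interfaces, and the command vocabulary shown is assumed rather than taken from the project.

```python
# Illustrative sketch of a voice-command dispatch loop for gaze-based clicking.
# get_gaze_position() and listen_for_command() are hypothetical stand-ins for
# the eyetracker and speech recognition interfaces; they are not part of any
# API described in this project summary.

from dataclasses import dataclass


@dataclass
class GazePoint:
    x: int  # screen coordinate, pixels
    y: int


def get_gaze_position() -> GazePoint:
    """Placeholder: return the operator's current point of regard."""
    return GazePoint(x=640, y=400)


def listen_for_command() -> str:
    """Placeholder: return the next recognized command word."""
    return "click"


# Command words map to pointer actions performed at the gaze location
# (vocabulary shown here is assumed for illustration).
COMMANDS = {
    "click": "left_click",
    "right click": "right_click",
    "double click": "double_click",
    "drag": "begin_drag",
    "drop": "end_drag",
}


def dispatch(command: str) -> None:
    """Perform the pointer action named by a command word at the gaze point."""
    action = COMMANDS.get(command)
    if action is None:
        return  # unrecognized word: do nothing rather than misfire
    gaze = get_gaze_position()
    # A real system would hand (action, gaze.x, gaze.y) to the window system.
    print(f"{action} at ({gaze.x}, {gaze.y})")


if __name__ == "__main__":
    dispatch(listen_for_command())
```

In such a scheme the pointer action fires only when a command word is recognized, so the cursor can follow the eye continuously without by itself triggering clicks.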