We propose to build an autonomous multi-UAV/UGV network that can be controlled entirely through broad, customizable operational commands issued from wearable devices. The UAS requires no direct control inputs (direction, throttle, etc.) and does not necessarily require direct commands from an operator to any single vehicle. We will develop a command-based human-UAS interfacing protocol that can be used with computers, portable devices, and a wide range of wearables (point to a spot on a map, gesture "over there", gesture "open payload hook", gesture "patrol squad perimeter", etc.). No one has to be, or even is, directly linked to controlling a robot: the system takes high-level commands and decides how to execute them. The exception is that users can assume direct control of an unmanned element if they desire.

The system uses a dynamic, segmented control hierarchy to govern which system elements (human and computing) have control authority over vehicle flight and subsystems. Control authority can be dynamically transferred to different elements in real time:

o Maintains a strict control inheritance to safeguard flight and operations.
o Sub-system control can be delegated to elements without assigning them actual flight control (camera control, payload-actuator control, etc.).

Keywords: Gesture Control, Flight Autonomy, Drone, Wearable, Hand Gesture, UAV Control, UAV Team, UAS
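One way to picture the segmented control hierarchy is a per-subsystem authority table with strict inheritance back to the flight-authority holder. The sketch below is illustrative only; every name in it (ControlHierarchy, delegate, assume_direct_control, the element labels) is an assumption for this example, not part of the proposed system's actual interface.

```python
FLIGHT = "flight"

class ControlHierarchy:
    """Tracks which element (human or computing) holds authority over
    each subsystem of one vehicle; flight authority is guarded separately."""

    def __init__(self, autonomy_element: str):
        # By default, the onboard autonomy stack holds flight authority.
        self._autonomy = autonomy_element
        self._authority = {FLIGHT: autonomy_element}

    def holder(self, subsystem: str) -> str:
        # Strict control inheritance: subsystems with no explicit holder
        # inherit from whoever currently holds flight authority.
        return self._authority.get(subsystem, self._authority[FLIGHT])

    def delegate(self, subsystem: str, element: str) -> None:
        """Delegate a sub-system (camera, payload actuator) to an element
        without granting it flight control."""
        if subsystem == FLIGHT:
            raise ValueError("flight authority requires assume_direct_control")
        self._authority[subsystem] = element

    def assume_direct_control(self, element: str) -> None:
        """A user explicitly takes direct flight control of this element."""
        self._authority[FLIGHT] = element

    def release(self, subsystem: str) -> None:
        """Return a subsystem to the inheritance chain (or flight authority
        back to the autonomy stack)."""
        if subsystem == FLIGHT:
            self._authority[FLIGHT] = self._autonomy
        else:
            self._authority.pop(subsystem, None)


# Example: delegate the camera to a second operator without flight control,
# then have the first operator assume direct flight control.
h = ControlHierarchy("autonomy_stack")
h.delegate("camera", "operator_2")
print(h.holder("camera"))   # operator_2
print(h.holder(FLIGHT))     # autonomy_stack
h.assume_direct_control("operator_1")
print(h.holder(FLIGHT))     # operator_1
print(h.holder("payload"))  # operator_1 (inherited from flight authority)
```

The key design point the sketch captures is the separation of sub-system delegation from flight authority: transferring camera or payload control never changes who may command the vehicle's flight, which is the safeguard the strict inheritance rule provides.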