The RINGS project (Resonant Inductive Near-field Generation Systems) was a DARPA-funded effort to demonstrate electromagnetic formation flight and wireless power transfer in microgravity. Integration inconsistencies in both hardware and software prevented the experiment from achieving its objectives during the planned test sessions. A follow-on project supported by NASA ARC carried out assessment, diagnostics, corrections, and ground testing of RINGS to determine why the science sessions failed and to evaluate whether the errors could be corrected for future missions. The assessment concluded that RINGS can be used successfully in future science sessions provided a new metrology system is available to navigate RINGS in real time aboard the ISS. The proposed study supports the implementation, integration, and ground testing of vision-based navigation for RINGS, using the Smartphone Video Guidance Sensor (SVGS) with SPHERES (Synchronized Position Hold Engage and Reorient Experimental Satellites). SVGS was developed at NASA MSFC for application on CubeSats and small satellites to enable autonomous rendezvous and capture and formation flying. SPHERES are free-flying robots that have been used for numerous experiments aboard the ISS. Their metrology system is based on ultrasonic beacons and does not operate correctly with large flyers because of multi-path signal reflections. The main objective of this study is the integration of SVGS, as a vision-based position and attitude sensor, with the SPHERES GN&C environment. Successful integration will be demonstrated through 3-DOF vision-based guidance, navigation, and motion control experiments on a flat floor using the RINGS ground units available at Florida Tech. Performance will be assessed with a vision-based metrology system that fuses data from high-resolution cameras. A path forward for deployment on the ISS will be developed in coordination with NASA ARC.
Potential NASA Commercial Applications: (1) The proposed effort will deliver a smartphone-based positioning/metrology system for navigation and positioning-control applications in space robotics. (2) Orientation and navigation in CubeSat and small-satellite missions: CubeSats capable of automatic docking and maneuvering can perform inspection tasks, and CubeSats with vision-based navigation can carry out close-up science missions. (3) Other applications: orbital debris mitigation, CubeSat or small-satellite formation flying, spacecraft docking, and space robotic systems.
Potential NON-NASA Commercial Applications: (1) The proposed Phase I effort will deliver a positioning/metrology system well suited for navigation and positioning-control applications in robotics where vision-based feedback is desirable, such as automated docking or inspection tasks. (2) The proposed vision-based GN&C sensor would also be well suited for positioning, navigation, and visual inspection tasks in CubeSats.
Technology Taxonomy Mapping: (NASA's technology taxonomy has been developed by the SBIR-STTR program to disseminate awareness of proposed and awarded R/R&D in the agency. It is a listing of over 100 technologies, sorted into broad categories, of interest to NASA.)
Command & Control
Navigation & Guidance
Relative Navigation (Interception, Docking, Formation Flying; see also Control & Monitoring; Planetary Navigation, Tracking, & Telemetry)
Robotics (see also Control & Monitoring; Sensors)
Teleoperation