Picture this likely scene from a future servicing mission: two spacecraft steadily draw closer to each other in geosynchronous Earth orbit, slicing through the darkness of space at speeds approaching 7,000 mph. One is tumbling uncooperatively, its motion erratic and unpredictable. The other, a fully robotic rescue "servicer" controlled by humans on the ground, follows in hot pursuit, ready to help.
Everything hinges on the servicer's ability to accurately locate, track and dock with the ailing satellite, despite the constantly changing distance between itself and its target. However, the time delays inherent in human communication, calculation and commanding prevent ground controllers from directing the servicer quickly and precisely enough to execute this mission on their own.
To succeed, the servicer needs an autonomous rendezvous and docking (AR&D) system - a collection of cameras, sensors, computers, algorithms and avionics that work together to independently track the uncontrolled satellite at different ranges. Once the target is visually locked in, the AR&D system can safely "guide and drive" the servicer through precise rendezvous and docking maneuvers with its target satellite.

Open- and closed-loop system testing in the Goddard robotic Satellite Servicing Center simulates how a servicer in motion (blue Fanuc robot arm with Argon attached, far left) would track the motion of a non-cooperative, tumbling satellite (gold satellite mockup mounted on motion-based Rotopod platform, far right). Credit: NASA/Chris Gunn
The Satellite Servicing Capabilities Office at NASA's Goddard Space Flight Center in Greenbelt, Md., teamed with several partners on the Argon tests.
Enter Argon, a NASA-developed, ground-based technology development campaign that is helping to mature the individual sensors, algorithms, and system technologies that a satellite would need to perform AR&D on a robotic servicing mission.
Argon is one of the Satellite Servicing Capabilities Office's efforts to rapidly test, advance, and integrate the technologies essential to making on-orbit robotic servicing a reality.
In addition to advancing satellite servicing technologies, Argon is playing a role in maturing the Vision Navigation Sensor that will fly on NASA's Orion Multi-Purpose Crew Vehicle. Orion is planned to be the primary crew vehicle for future missions beyond low Earth orbit.
Argon is a ground-based technology demo and, as such, was not designed to fly in space. However, this test bed will help mature individual AR&D technologies, components, and algorithms to make them flight ready.
Argon integrates essential AR&D components and unique algorithms into a system that autonomously images, visually captures and tracks dynamic and static targets.
Demos at various ranges test the components' capabilities and ensure that the system smoothly transitions between each simulated servicing-mission phase.
The Argon module houses a collection of AR&D instruments: cameras, sensors, computers, algorithms and avionics integrated into a single enclosure. After Argon is mounted on either a dynamic or static platform (dolly, robot arm with six degrees of freedom, moving vehicle, etc.), it is aimed at diverse targets to collect data.
Flash lidar: A laser transmitter that flashes pulses of light onto the target (in Argon's case, a mock satellite) to gauge its relative distance.
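Flash lidar ranging rests on a simple time-of-flight relationship: the pulse travels to the target and back at the speed of light, so the range is half the round-trip distance. The sketch below is an illustrative simplification with made-up timing values, not Argon flight code:

```python
# Simplified time-of-flight ranging, as a flash lidar might perform it.
# NOT Argon flight software -- an illustrative sketch with assumed values.

C = 299_792_458.0  # speed of light, m/s

def range_from_tof(round_trip_seconds: float) -> float:
    """Convert a round-trip pulse time into a one-way range in meters."""
    return C * round_trip_seconds / 2.0

# A pulse returning after ~0.667 microseconds came from roughly 100 m away.
print(round(range_from_tof(6.671e-7)))  # -> 100
```

A real flash lidar performs this conversion simultaneously for every pixel in its detector array, producing the full-field range image described below.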
Vision Navigation Sensor: Known as VNS, this active laser sensor receives and records the flash lidar light that bounces off the target and produces full-field range (distance) and intensity images used for navigation. A similar unit flew during the 2011 STORRM demonstration on STS-134. Argon is maturing this technology for use on the Orion Multi-Purpose Crew Vehicle.
Long-range and short-range optical cameras: The long-range camera acts as a telescope and provides a narrow field of view, while the short-range camera delivers a wide field of view. Together, they deliver reliable optical image data to the SpaceCube processor. Argon uses two of the three optical cameras that flew on the Relative Navigation System (RNS) demonstration on STS-125, the Hubble Space Telescope Servicing Mission 4.
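The narrow/wide trade-off between the two cameras follows from basic pinhole-camera geometry: a longer focal length yields a narrower field of view. The sketch below illustrates this with assumed sensor and lens values that are not taken from the actual RNS hardware:

```python
# Field-of-view trade-off for an idealized pinhole camera.
# Sensor width and focal lengths are ASSUMED illustrative values,
# not the specifications of Argon's actual cameras.
import math

def field_of_view_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal field of view of an idealized pinhole camera, in degrees."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

SENSOR_MM = 10.0  # hypothetical sensor width

# A long focal length (telescope-like) gives a narrow view;
# a short focal length gives a wide view.
print(round(field_of_view_deg(SENSOR_MM, 200.0), 1))  # narrow, long-range
print(round(field_of_view_deg(SENSOR_MM, 8.0), 1))    # wide, short-range
```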
Situational Awareness Camera: A complement to the long- and short-range cameras, this device delivers an image similar to what a human eye would see. Ground operators use this view as a "sanity check" to confirm the SpaceCube processor's analysis.
SpaceCube: A processor that delivers 15 to 25 times the processing power of typical flight hardware. SpaceCube receives the image and sensor data from Argon's optical cameras and VNS and matches it against pre-loaded software models of the target. It then runs algorithms that command Argon to keep following the target's motion.
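The "measure, match, command" cycle SpaceCube performs is a closed feedback loop. The sketch below is not SpaceCube flight software; it only illustrates the pattern with a hypothetical one-dimensional target and simple proportional control:

```python
# Minimal closed-loop tracking sketch. NOT SpaceCube flight software --
# an illustration of the "measure offset, command correction" cycle,
# using a hypothetical 1-D target position and proportional control.

def track(target_positions, gain=0.5):
    """Each cycle: measure the offset between the current pointing
    direction and the target, then command a partial correction."""
    pointing = 0.0
    history = []
    for target in target_positions:
        offset = target - pointing     # stand-in for processed imagery
        pointing += gain * offset      # commanded slew toward the target
        history.append(pointing)
    return history

# The tracker converges toward a stationary target over successive cycles.
trail = track([1.0, 1.0, 1.0, 1.0, 1.0])
print([round(p, 3) for p in trail])  # -> [0.5, 0.75, 0.875, 0.938, 0.969]
```

In the real system the "offset" comes from matching camera and VNS imagery against the pre-loaded target models, and the commanded correction drives the servicer's pointing and approach maneuvers.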