K10 robot

The K10 is a rover used to explore planetary surfaces.

K10 rovers are prototype remote-controlled robots intended for planetary exploration and terrestrial surveying. Lighter and more mobile than their Martian counterparts, they weigh about 175 pounds, reach top speeds of about 5 miles per hour, and can carry and operate up to 30 pounds of scientific instruments. They allow humans to stay out of the vacuum of space while surveying and exploring unexplored planetary surfaces such as that of Mars, which could make them valuable to the humans who may eventually inhabit Mars.
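
The imperial figures above convert to SI units roughly as follows; this is a minimal sketch using the standard pound and mile-per-hour conversion factors, with the spec values (175 lb mass, 5 mph top speed, 30 lb payload) taken directly from the text.

```python
# Convert the K10's published specs to SI units.
# Spec figures come from the article text; conversion factors are standard.
LB_TO_KG = 0.45359237   # exact definition of the avoirdupois pound
MPH_TO_MS = 0.44704     # exact definition of miles per hour in m/s

mass_kg = 175 * LB_TO_KG        # rover mass
top_speed_ms = 5 * MPH_TO_MS    # top speed
payload_kg = 30 * LB_TO_KG      # instrument payload

print(f"mass: {mass_kg:.1f} kg")          # ~79.4 kg
print(f"top speed: {top_speed_ms:.2f} m/s")  # ~2.24 m/s
print(f"payload: {payload_kg:.1f} kg")    # ~13.6 kg
```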

Mechanical Description
Each third-generation K10 has four-wheel drive, all-wheel steering, and a passive averaging suspension, which helps reduce the motion induced by travel over uneven ground. The K10 has mounting points on its front, back, and bottom that allow antennas, sensors, and other scientific instruments to be attached. The K10 controller runs on a Linux laptop and communicates via 802.11g or Tropos mesh wireless.

These rovers are very different from the robotic exploration rovers now active on Mars: the K10s were manufactured with robotic follow-up in mind. Robotic follow-up and robotic exploration impose different mechanical constraints. Robots built for follow-up complete field work begun by humans, while exploration rovers are built to survey planetary surfaces untouched by humans.

Imagers
Each K10 rover has two science imagers: a panoramic imager and a microscopic imager. Both provide contextual and targeted color imaging of sunlit locations. The panoramic imager (Pan-Cam) is a consumer-grade Canon PowerShot G9 digital camera. The microscopic imager (MI) is the same camera model, but on a fixed mount pointed at the ground.

3D Scanning Lidar
The K10 carries Optech's Intelligent Laser Ranging and Imaging System (ILRIS-3D) on its central mast, approximately 1 m above the ground. The ILRIS-3D is used mainly for terrestrial survey, completing a 3D scan in about 20 minutes with an average accuracy of around 10 mm at 100 m range.
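
The stated accuracy can also be expressed as an angular error at the quoted range. This is a minimal sketch using only the figures from the text (10 mm at 100 m); it is an illustrative calculation, not a manufacturer specification.

```python
import math

# Express the ILRIS-3D's stated ~10 mm accuracy at 100 m range as an
# angular error. Both figures are taken from the article text.
accuracy_m = 0.010   # ~10 mm average accuracy
range_m = 100.0      # at 100 m range

angular_error_rad = math.atan2(accuracy_m, range_m)
angular_error_mrad = angular_error_rad * 1000

print(f"angular error: {angular_error_mrad:.2f} mrad")  # ~0.10 mrad
```

At small angles this is simply the ratio of accuracy to range, so the error scales roughly linearly with distance to the target.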

Ground Penetrating Radar
The K10's ground penetrating radar (GPR) is the Mala X3M, a pulse repetition GPR that can map subsurfaces up to 4 m in depth.

XRF
The Niton XL3T is an x-ray fluorescence (XRF) spectrometer, used for the non-destructive chemical analysis of rocks, minerals, and sediments.

History
The K10 was officially developed by the Intelligent Robotics Group (IRG) at NASA's Ames Research Center, Moffett Field, Calif. IRG used specially designed parts and off-the-shelf components when developing the K10. IRG was funded for this project by NASA's Exploration Technology Development Program (ETDP), which develops and matures technologies to meet the demands of NASA's lunar exploration mission objectives.

The K10 had two major published test runs. One took place in 2010 in Haughton Crater, Canada (one of the most lunar-like environments on Earth); the other took place in 2013 at Ames Research Center, where the K10 was created. The 2010 research, by simulating a test mission controlled remotely from nearby, concluded that robotic follow-up is feasible for future missions on the Moon or Mars. The 2013 experiment was a 100-minute real-time teleoperation of a K10 rover from the ISS, commanded by astronaut Luca Parmitano. Considered a breakthrough in surface telerobotics, this experiment demonstrated the potential of executing a low-risk terrestrial survey mission from deep space or orbit. The test was also the first time NASA's open-source Robot Application Programming Interface Delegate (RAPID) robot messaging system was used to control a robot from space.

In addition, the test suggests a potential future mission in which astronauts aboard NASA's Orion spacecraft travel to the Earth-Moon L2 Lagrange point, 65,000 km above the far side of the Moon. From such a location, astronauts could operate a robot remotely to perform surface science work, such as deploying a radio telescope.