The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Articles | Volume XLII-2/W4
https://doi.org/10.5194/isprs-archives-XLII-2-W4-129-2017
10 May 2017

SIMULATION OF THE «COSMONAUT-ROBOT» SYSTEM INTERACTION ON THE LUNAR SURFACE BASED ON METHODS OF MACHINE VISION AND COMPUTER GRAPHICS

B. I. Kryuchkov, V. M. Usov, V. A. Chertopolokhov, A. L. Ronzhin, and A. A. Karpov

Keywords: Lunar Exploration, Extravehicular Activity (EVA), Human-Robot Interaction (HRI), Mobile Robot Control, "Follow Me" Mode, Gesture Interface, Object Tracking, Gesture Recognition, Motion Capture

Abstract. Extravehicular activity (EVA) on the lunar surface, which will be necessary for the future exploration of the Moon, involves extensive use of robots. One of the factors of safe EVA is proper interaction between cosmonauts and robots in an extreme environment. This requires a simple and natural human-machine interface, e.g. a multimodal contactless interface based on recognition of the cosmonaut's gestures and poses. When travelling in the "Follow Me" (master/slave) mode, the robot uses onboard tools to track the cosmonaut's position and movements and builds its itinerary from these data. Interaction in the "cosmonaut-robot" system on the lunar surface differs significantly from interaction on the Earth's surface. For example, a person dressed in a space suit has limited fine motor skills. In addition, EVA is quite tiring for cosmonauts, and a tired person performs movements less accurately and makes mistakes more often. All of this imposes new requirements on the usability of a human-machine interface designed for EVA. To improve the reliability and stability of human-robot communication, it is necessary to provide options for duplicating commands at the stages of task execution and gesture recognition. New tools and techniques for space missions must first be examined in laboratory conditions and then in field tests (proof tests at the site of application). The article analyzes methods for detecting and tracking the cosmonaut's movements and recognizing gestures during EVA that can be used in the design of the human-machine interface. A scenario for testing these methods by constructing a virtual environment simulating EVA on the lunar surface is proposed. The simulation involves visualization of the environment and modeling of the robot's "vision" used to track a moving cosmonaut dressed in a spacesuit.
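
To make the "Follow Me" (master/slave) behaviour described in the abstract concrete, the sketch below shows a minimal follower loop in Python: the robot receives successive cosmonaut positions (standing in for the output of its onboard vision/tracking system, which is simulated here rather than implemented) and accumulates an itinerary of waypoints while keeping a safe stand-off distance. All names, parameters, and the track data are hypothetical illustrations, not the authors' implementation.

```python
import math
from dataclasses import dataclass


@dataclass
class Pose:
    """Planar position of an agent on the simulated lunar surface."""
    x: float
    y: float


def follow_me_step(robot: Pose, target: Pose,
                   standoff: float = 3.0, max_step: float = 0.5) -> Pose:
    """One control step of a simplified "Follow Me" mode.

    The robot moves toward the tracked cosmonaut position but never
    closer than `standoff` metres, advancing at most `max_step` metres
    per step. In a real system `target` would come from the robot's
    onboard vision; here it is supplied directly by the simulation.
    """
    dx, dy = target.x - robot.x, target.y - robot.y
    dist = math.hypot(dx, dy)
    if dist <= standoff:
        return robot                       # already inside the stand-off zone
    step = min(max_step, dist - standoff)  # do not overshoot the stand-off
    return Pose(robot.x + step * dx / dist, robot.y + step * dy / dist)


if __name__ == "__main__":
    # Hypothetical cosmonaut track, e.g. produced by a simulated vision system.
    cosmonaut_track = [Pose(0.5 * t, 0.2 * t) for t in range(40)]

    robot = Pose(-5.0, 0.0)
    itinerary = [robot]                    # waypoints the robot actually follows
    for target in cosmonaut_track:
        robot = follow_me_step(robot, target)
        itinerary.append(robot)

    print(f"final robot position: ({robot.x:.2f}, {robot.y:.2f})")
```

In a virtual-environment test of the kind proposed in the article, the `cosmonaut_track` input would be replaced by the output of the simulated robot "vision" tracking the suited cosmonaut, while the follower logic itself stays the same.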