The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLII-2/W4, 129–133, 2017

10 May 2017


B. I. Kryuchkov1, V. M. Usov1, V. A. Chertopolokhov2, A. L. Ronzhin3, and A. A. Karpov3
  • 1Yu. Gagarin Research & Training Cosmonauts Center, Star City Moscow Region, Russia
  • 2Lomonosov Moscow State University, Mathematical-Mechanical Faculty, Moscow, Russia
  • 3St. Petersburg Institute for Informatics and Automation of the Russian Academy of Sciences, St. Petersburg, Russia

Keywords: Lunar Exploration, Extravehicular Activity (EVA), Human-Robot Interaction (HRI), Mobile Robot Control, "Follow Me" Mode, Gesture Interface, Object Tracking, Gesture Recognition, Motion Capture

Abstract. Extravehicular activity (EVA) on the lunar surface, necessary for the future exploration of the Moon, will involve extensive use of robots. One factor in safe EVA is proper interaction between cosmonauts and robots in extreme environments. This requires a simple and natural human-machine interface, e.g. a multimodal contactless interface based on recognition of the cosmonaut's gestures and poses. When travelling in the "Follow Me" (master/slave) mode, a robot uses onboard sensors to track the cosmonaut's position and movements, and builds its itinerary from these data. Interaction in the "cosmonaut-robot" system on the lunar surface differs significantly from that on the Earth's surface. For example, a person dressed in a space suit has limited fine motor skills. In addition, EVA is quite tiring for cosmonauts, and a tired person performs movements less accurately and makes mistakes more often. All this leads to new requirements for the usability of the human-machine interface designed for EVA. To improve the reliability and stability of human-robot communication, it is necessary to provide options for duplicating commands at each task stage and for gesture recognition. New tools and techniques for space missions must first be examined in laboratory conditions, and then in field tests (proof tests at the site of application). The article analyzes methods of detecting and tracking the cosmonaut's movements and recognizing gestures during EVA, which can be used in the design of a human-machine interface. A scenario for testing these methods by constructing a virtual environment simulating EVA on the lunar surface is proposed. The simulation involves visualization of the environment and modeling the use of the robot's "vision" to track a moving cosmonaut dressed in a spacesuit.
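To illustrate the "Follow Me" (master/slave) behaviour described above, the sketch below shows one simple way a robot could turn the tracked position of a cosmonaut into a motion command while keeping a safe standoff distance. This is a minimal illustrative example, not the control scheme used by the authors; the function name and parameters (`standoff`, `max_speed`) are hypothetical.

```python
import math

def follow_me_step(robot_xy, target_xy, standoff=2.0, max_speed=1.0):
    """Compute a 2D velocity command that drives the robot toward the
    tracked target (the cosmonaut) while holding a safe standoff distance.

    robot_xy, target_xy: (x, y) positions in metres.
    standoff: distance (m) at which the robot stops following.
    max_speed: speed cap (m/s). All values are illustrative assumptions.
    """
    dx = target_xy[0] - robot_xy[0]
    dy = target_xy[1] - robot_xy[1]
    dist = math.hypot(dx, dy)
    if dist <= standoff:
        # Close enough: hold position rather than crowding the cosmonaut.
        return (0.0, 0.0)
    # Slow down proportionally as the robot approaches the standoff ring.
    speed = min(max_speed, dist - standoff)
    return (speed * dx / dist, speed * dy / dist)
```

In a real system the target position would come from the robot's onboard tracking of the cosmonaut (e.g. vision-based pose estimation), updated every control cycle.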