The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLII-2/W4, 47–54, 2017

10 May 2017


M. N. Favorskaya1 and A. V. Pyataeva2
  • 1Reshetnev Siberian State Aerospace University, Institute of Informatics and Telecommunications, 31, Krasnoyarsky Rabochy av., Krasnoyarsk, 660037 Russian Federation
  • 2Siberian Federal University, Institute of Space and Informatics Technologies, 26, Kirensky st., Krasnoyarsk, 660074 Russian Federation

Keywords: Dynamic textures, Convolutional neural networks, Recognition, Categorization

Abstract. A Dynamic Texture (DT) can be considered an extension of a static texture that additionally comprises motion features. DT is a widespread but weakly studied type of texture employed in many computer vision tasks. The proposed DT recognition method includes a preliminary categorization based on four proposed categories: natural particles with periodic movement, natural translucent/transparent non-rigid blobs with randomly changing movement, man-made opaque rigid objects with periodic movement, and man-made opaque rigid objects with stationary or chaotic movement. This formulation permits constructing separate spatial and temporal Convolutional Neural Networks (CNNs) for each category. The inputs of the CNNs are a pair of successive frames (taken at intervals of 1, 2, 3, or 4 frames, depending on the category), while the outputs store sets of binary features in the form of histograms. In the test stage, the concatenated histograms are compared with the histograms of the classes using the Kullback-Leibler distance. The experiments demonstrate the efficiency of the designed CNNs, providing recognition rates of 97.46–98.32% for sequences with a single DT type on the DynTex database.
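The final classification step described in the abstract, comparing concatenated binary-feature histograms against per-class reference histograms with the Kullback-Leibler distance, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names `kl_distance` and `classify`, the smoothing constant `eps`, and the toy class histograms are all assumptions for the sketch.

```python
import numpy as np

def kl_distance(p, q, eps=1e-10):
    # Kullback-Leibler divergence D(p || q) between two histograms.
    # A small eps avoids log(0) and division by zero for empty bins.
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def classify(test_hists, class_hists):
    # Concatenate the spatial and temporal histograms of the test
    # sequence, then pick the class whose concatenated reference
    # histograms are nearest in KL distance.
    test = np.concatenate(test_hists)
    return min(class_hists,
               key=lambda c: kl_distance(test, np.concatenate(class_hists[c])))

# Hypothetical reference histograms for two DT classes (toy values).
classes = {
    "smoke": [[4, 1, 0], [0, 3, 1]],
    "water": [[0, 1, 4], [3, 0, 1]],
}
label = classify([[4, 1, 0], [0, 3, 1]], classes)
```

In this toy setup the test histograms coincide with the "smoke" references, so the KL distance to that class is near zero and `classify` returns "smoke".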