The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLIII-B3-2020, 1219–1227, 2020
https://doi.org/10.5194/isprs-archives-XLIII-B3-2020-1219-2020
21 Aug 2020

GENERATIVE ADVERSARIAL NETWORKS AS A NOVEL APPROACH FOR TECTONIC FAULT AND FRACTURE EXTRACTION IN HIGH-RESOLUTION SATELLITE AND AIRBORNE OPTICAL IMAGES

B. Jafrasteh1,2, I. Manighetti1, and J. Zerubia2
  • 1Géoazur, Université Côte d’Azur, Observatoire de la Côte d’Azur, IRD, CNRS, Sophia Antipolis, France
  • 2INRIA, Université Côte d’Azur, Sophia Antipolis, France

Keywords: Remote sensing, Deep learning, Curvilinear feature extraction, Image processing, Generative adversarial networks, High resolution, Tectonic fault and fractures, Fault mapping

Abstract. We develop a novel method based on Deep Convolutional Networks (DCNs) to automate the identification and mapping of fracture and fault traces in optical images. The method pits two DCNs against each other in a two-player game: the first network, called the Generator, learns to segment images so that its segmentations resemble the ground truth; the second network, called the Discriminator, measures the differences between the ground-truth image and each segmented image and feeds its score back to the Generator, which uses this feedback to progressively improve its segmentation. Because both networks are conditioned on the ground-truth images, the method is called a Conditional Generative Adversarial Network (CGAN). We propose a new loss function for both the Generator and the Discriminator networks to improve their accuracy. Using two criteria and a manually annotated optical image, we compare the generalization performance of the proposed method to that of a classical DCN architecture, U-Net. The comparison demonstrates the suitability of the proposed CGAN architecture; further work is, however, needed to improve its efficiency.
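The two-player game described in the abstract can be sketched with the standard conditional-GAN objective. This is an illustrative sketch only: the discriminator scores below are hypothetical, and the paper proposes a modified loss function rather than the plain binary cross-entropy terms shown here.

```python
import numpy as np

# D(y, x) is assumed to output the probability that segmentation y is the
# ground truth for image x. Here we only model the resulting scores.

def bce(p, target, eps=1e-7):
    """Binary cross-entropy between predicted probabilities p and labels."""
    p = np.clip(p, eps, 1.0 - eps)
    return -np.mean(target * np.log(p) + (1 - target) * np.log(1 - p))

def discriminator_loss(d_real, d_fake):
    """The Discriminator is rewarded for scoring ground-truth pairs as 1
    and Generator-produced pairs as 0."""
    return bce(d_real, np.ones_like(d_real)) + bce(d_fake, np.zeros_like(d_fake))

def generator_loss(d_fake):
    """The Generator is rewarded when the Discriminator mistakes its
    segmentations for ground truth."""
    return bce(d_fake, np.ones_like(d_fake))

# Hypothetical Discriminator scores for a batch of four image/segmentation pairs.
d_real = np.array([0.90, 0.80, 0.95, 0.85])  # scores on (image, ground truth)
d_fake = np.array([0.20, 0.10, 0.30, 0.15])  # scores on (image, G's output)

print(discriminator_loss(d_real, d_fake))  # low: D separates real from fake well
print(generator_loss(d_fake))              # high: G is not yet fooling D
```

As training alternates between the two losses, the Generator's segmentations drive `d_fake` upward, which is exactly the score feedback the abstract describes.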