The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLVIII-4/W1-2022, 51–58, 2022
https://doi.org/10.5194/isprs-archives-XLVIII-4-W1-2022-51-2022
05 Aug 2022

LAYING THE FOUNDATION FOR AN ARTIFICIAL NEURAL NETWORK FOR PHOTOGRAMMETRIC RIVERINE BATHYMETRY

E. Belcore and V. Di Pietra
  • Politecnico di Torino, DIATI, Department of Environment, Land and Infrastructure Engineering. Corso Duca degli Abruzzi, 24, 10129 Torino, Italy

Keywords: UAS, Riverine bathymetry, submerged topography, artificial intelligence, artificial neural network, channel morphology, SfM, Python, Digital Twin Earth

Abstract. This work tests the effectiveness of artificial intelligence for correcting water refraction in shallow inland waters, using very high-resolution images collected by Unmanned Aerial Systems (UAS) and processed through a fully FOSS workflow. The tests focus on synthetic information extracted from the visible component of the electromagnetic spectrum. An artificial neural network is trained on data from three morphologically similar alpine rivers. The RGB bands, the SfM depth, and seven radiometric indices are calculated and stacked into an 11-band raster (the input dataset). The depths, which constitute the dependent variable of the regression, are calculated as the difference between the Up component of the bathymetric cross-sections and the water-surface elevations. The dataset is then scaled. The observations from one of the analyzed case studies are held out as an unseen dataset to test the generalization capability of the model; the remaining observations are split into training (80%) and test (20%) datasets. The generated NN is a 3-layer MLP with one hidden layer and the Rectified Linear Unit (ReLU) and sigmoid activation functions. The weights are initialized to small Gaussian random values, and L1 and L2 kernel regularizers are added to reduce overfitting. Weights are updated with the Adam search technique, and the mean squared error is the loss function. The importance and significance of the 11 variables are assessed. The model scores an r-squared of 0.70 on the test dataset and 0.77 on the training dataset, with an MAE of 0.06 and an RMSE of 0.08; similar results are obtained on the unseen dataset. Despite the good metrics, the model shows some difficulty generalizing to shallow depths.
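The architecture described in the abstract can be sketched in plain NumPy: an MLP with 11 input features (RGB + SfM depth + 7 radiometric indices), one ReLU hidden layer, a sigmoid output for the scaled depths, small-Gaussian weight initialization, L1/L2 weight penalties, Adam updates, and an MSE loss. The hidden size, learning rate, regularization strengths, and the synthetic data below are illustrative assumptions, not the authors' values.

```python
import numpy as np

rng = np.random.default_rng(42)

def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic stand-in for scaled 11-band pixels and scaled depths in (0, 1).
n, d, h = 256, 11, 16
X = rng.random((n, d))
y = sigmoid(X @ rng.normal(size=(d, 1)))

# Small Gaussian initialization of the weights, as in the abstract.
params = {
    "W1": rng.normal(scale=0.05, size=(d, h)), "b1": np.zeros(h),
    "W2": rng.normal(scale=0.05, size=(h, 1)), "b2": np.zeros(1),
}
l1, l2, lr = 1e-5, 1e-4, 1e-2          # assumed regularization / step sizes
adam_m = {k: np.zeros_like(p) for k, p in params.items()}
adam_v = {k: np.zeros_like(p) for k, p in params.items()}
beta1, beta2, eps = 0.9, 0.999, 1e-8

def forward(X):
    z1 = X @ params["W1"] + params["b1"]
    a1 = relu(z1)
    return z1, a1, sigmoid(a1 @ params["W2"] + params["b2"])

def loss(yhat):
    mse = np.mean((yhat - y) ** 2)
    reg = sum(l1 * np.abs(W).sum() + l2 * (W ** 2).sum()
              for W in (params["W1"], params["W2"]))
    return mse + reg

losses = []
for t in range(1, 301):
    z1, a1, yhat = forward(X)
    losses.append(loss(yhat))
    # Backprop through MSE loss, sigmoid output, and ReLU hidden layer.
    dz2 = (2.0 / n) * (yhat - y) * yhat * (1.0 - yhat)
    grads = {
        "W2": a1.T @ dz2 + l1 * np.sign(params["W2"]) + 2 * l2 * params["W2"],
        "b2": dz2.sum(axis=0),
    }
    dz1 = (dz2 @ params["W2"].T) * (z1 > 0)
    grads["W1"] = X.T @ dz1 + l1 * np.sign(params["W1"]) + 2 * l2 * params["W1"]
    grads["b1"] = dz1.sum(axis=0)
    # Adam update with bias-corrected first and second moments.
    for k in params:
        adam_m[k] = beta1 * adam_m[k] + (1 - beta1) * grads[k]
        adam_v[k] = beta2 * adam_v[k] + (1 - beta2) * grads[k] ** 2
        m_hat = adam_m[k] / (1 - beta1 ** t)
        v_hat = adam_v[k] / (1 - beta2 ** t)
        params[k] -= lr * m_hat / (np.sqrt(v_hat) + eps)

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

In practice such a model would typically be built with a high-level library (e.g. Keras), where the Gaussian initialization, combined L1/L2 kernel regularizer, Adam optimizer, and MSE loss are standard options; the sketch above only makes the training loop explicit.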