Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLII-3, 729-734, 2018
https://doi.org/10.5194/isprs-archives-XLII-3-729-2018
© Author(s) 2018. This work is distributed under
the Creative Commons Attribution 4.0 License.

30 Apr 2018

SENTINEL-1 AND SENTINEL-2 DATA FUSION FOR WETLANDS MAPPING: BALIKDAMI, TURKEY

G. Kaplan1 and U. Avdan2
  • 1Anadolu University, Remote Sensing and GIS programme, 26555 Eskisehir, Turkey
  • 2Anadolu University, Earth and Space Sciences Institute, 26555 Eskisehir, Turkey

Keywords: Image Fusion, Sentinel-1, Sentinel-2, Wetlands, Object-Based Classification

Abstract. Wetlands provide a number of environmental and socio-economic benefits, such as storing floodwaters, improving water quality, providing habitat for wildlife, supporting biodiversity, and offering aesthetic value. Remote sensing has proven to be a useful and frequently applied technology for monitoring and mapping wetlands. Combining optical and microwave satellite data can help with mapping and monitoring the biophysical characteristics of wetlands and wetland vegetation. Moreover, fusing radar and optical remote sensing data can increase wetland classification accuracy.
In this paper, data from the fine-spatial-resolution optical satellite Sentinel-2 and the synthetic aperture radar (SAR) satellite Sentinel-1 were fused for mapping wetlands. Both the Sentinel-1 and Sentinel-2 images were pre-processed; vegetation indices were then calculated from the Sentinel-2 bands and added to the fusion data set. For the classification of the fused data, three different classification approaches were used and compared.
The results showed a significant improvement in wetland classification when both multispectral and microwave data were used. The red edge bands and the vegetation indices included in the data set also markedly improved the discrimination between wetlands and other vegetated areas. The statistical results of the optical and radar data fusion showed high wetland mapping accuracy, with an overall classification accuracy of approximately 90 % for the object-based classification method. For future research, we recommend the use of multi-temporal images, the collection of terrain data, and a comparison of the applied method with traditional image fusion techniques.
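
The following is a minimal Python sketch of the kind of fusion workflow the abstract describes: computing a vegetation index from Sentinel-2 bands, layer-stacking it with Sentinel-1 backscatter, and running a supervised classification on the fused stack. The file names, band selection, training data, and the random forest classifier are illustrative assumptions, not the authors' exact processing chain (the paper compares three approaches, including an object-based one).

```python
# Sketch of optical/SAR fusion for wetland classification (assumptions noted below).
import numpy as np
import rasterio
from sklearn.ensemble import RandomForestClassifier

def read_band(path):
    """Read a single-band GeoTIFF as a float32 array."""
    with rasterio.open(path) as src:
        return src.read(1).astype("float32")

# Hypothetical, already co-registered inputs resampled to a common 10 m grid.
red = read_band("S2_B04.tif")   # Sentinel-2 red
nir = read_band("S2_B08.tif")   # Sentinel-2 NIR
re1 = read_band("S2_B05.tif")   # Sentinel-2 red edge 1 (resampled)
vv  = read_band("S1_VV.tif")    # Sentinel-1 VV backscatter (terrain corrected)
vh  = read_band("S1_VH.tif")    # Sentinel-1 VH backscatter (terrain corrected)

# One example vegetation index (NDVI) computed from the Sentinel-2 bands.
ndvi = (nir - red) / (nir + red + 1e-6)

# Layer-stack optical bands, the index, and the SAR backscatter into a fused cube.
fused = np.stack([red, nir, re1, ndvi, vv, vh], axis=0)   # (bands, rows, cols)
X = fused.reshape(fused.shape[0], -1).T                   # (pixels, bands)

# Pixel-based supervised classification as a stand-in for one of the compared
# approaches; training indices/labels would come from digitized reference polygons.
train_idx = np.load("train_indices.npy")   # hypothetical training pixel indices
train_lab = np.load("train_labels.npy")    # hypothetical class labels
clf = RandomForestClassifier(n_estimators=200, n_jobs=-1)
clf.fit(X[train_idx], train_lab)
class_map = clf.predict(X).reshape(fused.shape[1:])       # classified wetland map
```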