Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLI-B7, 493-496, 2016
21 Jun 2016
S. Havivi1, I. Schvartzman2, S. Maman1, A. Marinoni3, P. Gamba3, S. R. Rotman2, and D. G. Blumberg1
1Ben-Gurion University of the Negev, The Department of Geography and Environmental Development, 8410501 Beer Sheva, Israel
2Ben-Gurion University of the Negev, The Electrical Engineering Department, 8410501 Beer Sheva, Israel
3University of Pavia, The Department of Electronics, Via A. Ferrata 1, 27100 Pavia, Italy
Keywords: Change detection, Coherence, Covariance Equalization, Natural hazards, Multi-dimensional data fusion

Abstract. Satellite images are widely used across the risk cycle to understand exposure, refine hazard maps, and quickly provide assessments after a natural or man-made disaster. Although different types of satellite images exist (e.g. optical, radar), they have not been combined for risk assessment. The complementary characteristics of different remote sensing data types can be extremely valuable for monitoring and evaluating the impacts of disaster events, extracting additional information and making it available in emergency situations. To support this approach, two change detection methods, each suited to a different sensor's data, were used: Coherence Change Detection (CCD) for SAR data and Covariance Equalization (CE) for multispectral imagery. CCD identifies the stability of an area and shows where changes have occurred, revealing subtle changes with an accuracy of several millimetres to centimetres. The CE method overcomes the differences in atmospheric effects between two multispectral images taken at different times, so that areas that have undergone a major change can be detected. To achieve our goals, we focused on the urban areas affected by the tsunami event of March 11, 2011 in Sendai, Japan, which struck the surrounding area, coastline, and inland. High resolution TerraSAR-X (TSX) and Landsat 7 images covering the research area were acquired for the periods before and after the event, and all were pre-processed and processed according to the requirements of each sensor. The results of the optical and SAR algorithms were combined by resampling the multispectral data to the SAR resolution using spatial linear interpolation, and a score representing the damage level was assigned in both products. In the results of both algorithms, high damage levels appear in the areas closest to the sea and shoreline.
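As an illustrative sketch only (not the authors' implementation, whose processing details are not given in this abstract), the coherence estimate underlying CCD is commonly computed as the magnitude of the normalized complex cross-correlation between two co-registered SAR acquisitions over a small moving window; low coherence flags change. The function and window size below are assumptions for demonstration:

```python
import numpy as np

def box_mean(a, win):
    """Mean over a win x win sliding window (valid region only)."""
    out = np.zeros((a.shape[0] - win + 1, a.shape[1] - win + 1), dtype=a.dtype)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = a[i:i + win, j:j + win].mean()
    return out

def coherence(s1, s2, win=5):
    """Interferometric coherence magnitude between two co-registered
    complex (SLC) SAR images; values near 1 mean a stable scene,
    values near 0 indicate change (e.g. tsunami damage)."""
    num = box_mean(s1 * np.conj(s2), win)          # complex cross-correlation
    p1 = box_mean(np.abs(s1) ** 2, win)            # power of image 1
    p2 = box_mean(np.abs(s2) ** 2, win)            # power of image 2
    return np.abs(num) / np.sqrt(p1 * p2 + 1e-12)  # epsilon avoids div by zero
```

An unchanged pixel pair yields coherence close to 1; decorrelated (changed) areas drop toward 0, which is why CCD can flag subtle surface disturbance.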
Our approach, combining SAR and multispectral images, leads to more reliable information and provides a complete scene for the emergency response following an event.
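The fusion step described above (upsampling the multispectral change product to the SAR grid by spatial linear interpolation, then assigning a combined damage score) can be sketched as follows. This is a minimal illustration under stated assumptions: the function names, the bilinear scheme, and the equal weighting of the two scores are hypothetical, as the abstract does not specify how the per-product scores are merged:

```python
import numpy as np

def resample_bilinear(img, out_shape):
    """Bilinearly resample a 2-D array (e.g. a coarse optical change map)
    to out_shape (e.g. the finer SAR grid)."""
    h, w = img.shape
    H, W = out_shape
    ys = np.linspace(0, h - 1, H)                  # fractional source rows
    xs = np.linspace(0, w - 1, W)                  # fractional source cols
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    a = img[np.ix_(y0, x0)]; b = img[np.ix_(y0, x1)]
    c = img[np.ix_(y1, x0)]; d = img[np.ix_(y1, x1)]
    return (a * (1 - wy) * (1 - wx) + b * (1 - wy) * wx
            + c * wy * (1 - wx) + d * wy * wx)

def fuse_damage(sar_score, opt_score_coarse, w_sar=0.5):
    """Combine per-pixel damage scores from the SAR (CCD) and optical (CE)
    products on the SAR grid; the weight w_sar is an assumption."""
    opt = resample_bilinear(opt_score_coarse, sar_score.shape)
    return w_sar * sar_score + (1 - w_sar) * opt
```

In practice the two products would first be geocoded to a common reference; the weighted sum here simply stands in for whatever score-combination rule the paper applies.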

Citation: Havivi, S., Schvartzman, I., Maman, S., Marinoni, A., Gamba, P., Rotman, S. R., and Blumberg, D. G.: UTILIZING SAR AND MULTISPECTRAL INTEGRATED DATA FOR EMERGENCY RESPONSE, Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLI-B7, 493-496, doi:10.5194/isprs-archives-XLI-B7-493-2016, 2016.
