The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume XXXVIII-1/C22
https://doi.org/10.5194/isprsarchives-XXXVIII-1-C22-247-2011
06 Sep 2012

A UAV BASED CLOSE-RANGE RAPID AERIAL MONITORING SYSTEM FOR EMERGENCY RESPONSES

K. Choi and I. Lee

Keywords: UAV, Multi-sensor, Rapid Mapping, Real-time Georeferencing

Abstract. As the occurrence and scale of disasters and accidents have increased due to global warming, terrorist attacks, and other causes, the demand for rapid responses to such emergencies has also grown. Effective management requires responses tailored to each individual site, and the decisions behind them must be based on spatial changes in the target area, detected immediately or in real time. Because emergency areas are usually inaccessible, aerial monitoring without human operators is an appropriate means, and a UAV is therefore a strong candidate platform. In addition, because a UAV system can operate at a lower altitude, its sensory data usually have higher resolution than data from other systems. If the transmission and processing of these data can be performed in real time, a UAV rapid mapping system can detect spatial changes in the target area with high spatial and temporal resolution. We therefore aim to develop a UAV-based rapid aerial mapping system whose key features are effective acquisition of sensory data and real-time transmission and processing of those data. In this paper, we introduce the general concept of our system, including its main features and intermediate results, and explain our real-time sensory data georeferencing algorithm, which is the core for prompt generation of spatial information from the sensory data.
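The real-time georeferencing algorithm itself is described in the body of the paper. As an illustration only, the following minimal sketch shows a generic direct-georeferencing step of the kind such a system relies on: projecting a single image point onto the ground using a GPS/INS-derived position and attitude. It assumes a calibrated camera, principal-point-reduced image coordinates, and intersection with a flat terrain plane; the function names and the flat-terrain simplification are assumptions for this sketch and are not taken from the paper.

import numpy as np

def rotation_matrix(roll, pitch, yaw):
    # Rotation from the camera/body frame to the mapping frame (angles in radians).
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def georeference_pixel(img_xy, focal_mm, gps_xyz, rpy_rad, ground_z=0.0):
    # Project one image point onto a horizontal plane at height ground_z.
    #   img_xy   : image coordinates (mm, reduced to the principal point)
    #   focal_mm : camera focal length (mm)
    #   gps_xyz  : perspective centre from GPS/INS (mapping frame)
    #   rpy_rad  : roll, pitch, yaw from the INS (radians)
    R = rotation_matrix(*rpy_rad)
    ray_cam = np.array([img_xy[0], img_xy[1], -focal_mm])  # viewing ray in the camera frame
    ray_map = R @ ray_cam                                   # same ray in the mapping frame
    centre = np.asarray(gps_xyz, dtype=float)
    scale = (ground_z - centre[2]) / ray_map[2]             # intersect the plane Z = ground_z
    return centre + scale * ray_map

# Example: a nadir-looking 35 mm camera 100 m above flat ground
# print(georeference_pixel((1.2, -0.8), 35.0, (0.0, 0.0, 100.0), (0.0, 0.0, 0.0)))

In a real system this transformation would also account for the boresight misalignment and lever arm between the camera and the GPS/INS, and the flat plane would be replaced by a terrain model.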