Volume XLII-2/W6
Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLII-2/W6, 379-384, 2017
https://doi.org/10.5194/isprs-archives-XLII-2-W6-379-2017
© Author(s) 2017. This work is distributed under
the Creative Commons Attribution 4.0 License.

24 Aug 2017

PUSHBROOM HYPERSPECTRAL IMAGING FROM AN UNMANNED AIRCRAFT SYSTEM (UAS) – GEOMETRIC PROCESSING WORKFLOW AND ACCURACY ASSESSMENT

D. Turner1, A. Lucieer1, M. McCabe2, S. Parkes2, and I. Clarke1
  • 1University of Tasmania, School of Land and Food, Hobart, Tasmania, Australia
  • 2King Abdullah University of Science and Technology (KAUST), Thuwal/Jeddah 23955-6900, Kingdom of Saudi Arabia

Keywords: UAS, Push broom, Hyperspectral, Geometric accuracy, PARGE

Abstract. In this study, we assess two push broom hyperspectral sensors as carried by small (10–15 kg) multi-rotor Unmanned Aircraft Systems (UAS). We used a Headwall Photonics micro-Hyperspec push broom sensor with 324 spectral bands (4–5 nm FWHM) and a Headwall Photonics nano-Hyperspec sensor with 270 spectral bands (6 nm FWHM), both operating in the VNIR spectral range (400–1000 nm). A gimbal was used to stabilise the sensors against the aircraft flight dynamics. For the micro-Hyperspec, a tightly coupled dual frequency Global Navigation Satellite System (GNSS) receiver, an Inertial Measurement Unit (IMU), and a Machine Vision Camera (MVC) were used for attitude and position determination. For the nano-Hyperspec, a navigation grade GNSS system and IMU provided position and attitude data.

This study presents the geometric results of one flight over a grass oval on which a dense Ground Control Point (GCP) network was deployed. The aim was to ascertain the geometric accuracy achievable with each system. Using the PARGE software package (ReSe – Remote Sensing Applications), we ortho-rectify the push broom hyperspectral image strips and then quantify the accuracy of the ortho-rectification using the GCPs as check points.
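The check-point assessment amounts to comparing surveyed GCP coordinates against the coordinates of the same points read from the ortho-rectified strip and reducing the residuals to a planimetric RMSE. A minimal sketch of that computation is shown below; the coordinate values are hypothetical placeholders, not the paper's data, and only the RMSE formulation itself is being illustrated.

```python
import math

# Hypothetical check points: surveyed (E, N) coordinates vs. the (E, N)
# coordinates of the same marks identified in the ortho-rectified
# hyperspectral strip (all values in metres; illustrative only).
surveyed = [(527310.42, 5252100.18), (527355.90, 5252140.55), (527400.12, 5252090.77)]
rectified = [(527310.95, 5252099.60), (527355.31, 5252141.20), (527400.70, 5252090.10)]

def horizontal_rmse(truth, measured):
    """Root-mean-square of the 2-D (planimetric) check-point residuals."""
    sq = [(e1 - e2) ** 2 + (n1 - n2) ** 2
          for (e1, n1), (e2, n2) in zip(truth, measured)]
    return math.sqrt(sum(sq) / len(sq))

print(round(horizontal_rmse(surveyed, rectified), 3))  # → 0.851
```

Per-axis RMSE (easting and northing separately) follows the same pattern and is often reported alongside the combined figure.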

The orientation (roll, pitch, and yaw) of the sensor is measured by the IMU. Alternatively, imagery from an MVC running at 15 Hz, combined with accurate camera position data, can be processed with Structure from Motion (SfM) software to obtain an estimated camera orientation. In this study, we assess which of these data sources yields a flight strip with the highest geometric accuracy.
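Whichever source supplies the roll/pitch/yaw record, each push broom scan line must be rotated from the sensor body frame into the mapping frame before its ground footprint can be projected. The sketch below shows one common way to build that rotation matrix; the Z–Y–X (yaw, pitch, roll) axis convention is an assumption for illustration and not necessarily the convention used by PARGE or the IMU in the study.

```python
import math

def rotation_matrix(roll, pitch, yaw):
    """Body-to-mapping-frame rotation R = Rz(yaw) @ Ry(pitch) @ Rx(roll).
    Angles in radians. The Z-Y-X Euler convention here is an assumed
    example; real workflows must match the IMU/software convention."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    Rx = [[1, 0, 0], [0, cr, -sr], [0, sr, cr]]   # roll about x
    Ry = [[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]]   # pitch about y
    Rz = [[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]]   # yaw about z

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3))
                 for j in range(3)] for i in range(3)]

    return matmul(Rz, matmul(Ry, Rx))
```

With zero angles the matrix reduces to the identity; a 90° yaw rotates the body x-axis onto the mapping y-axis, which is a quick sanity check on the convention.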