ALIGNMENT OF HYPERSPECTRAL IMAGERY AND FULL-WAVEFORM LIDAR DATA FOR VISUALISATION AND CLASSIFICATION PURPOSES
- 1 Plymouth Marine Laboratory (PML), Remote Sensing Group, Prospect Place, Plymouth, PL1 3DH, UK
- 2 Centre for Digital Entertainment (CDE), University of Bath, Claverton Down Road, Bath, BA2 7AY, UK
Keywords: Integration, Hyperspectral Imagery, Full-waveform LiDAR, Voxelisation, Visualisation, Tree coverage maps
Abstract. The overarching aim of this paper is to enhance the visualisation and classification of airborne remote sensing data for remote forest surveys. A new open source tool is presented for aligning hyperspectral and full-waveform LiDAR data. The tool produces coloured polygon representations of the scanned areas and aligned metrics from both datasets. Using data provided by NERC ARSF, tree coverage maps are generated and projected onto the polygons. The 3D polygon meshes show well-separated structures and are suitable for direct rendering on commodity 3D-accelerated hardware, allowing smooth visualisation. The intensity profile of each waveform sample is accumulated into a discrete 3D density volume, building a 3D representation of the scanned area. The volume is then polygonised using the Marching Cubes algorithm, and three user-defined bands from the hyperspectral images are projected onto the resulting mesh as RGB colours. Regarding the classification of full-waveform LiDAR data, previous work relied on extracting point clouds, whereas this paper introduces a new approach that derives information from the 3D volume representation together with the hyperspectral data. We generate aligned metrics at multiple resolutions, including the standard deviation of the hyperspectral bands and the width of the reflected waveform derived from the volume. Tree coverage maps are then generated using a Bayesian probabilistic model and, owing to the combination of the two datasets, higher classification accuracy is expected.
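The voxelisation step described above (accumulating waveform sample intensities into a discrete 3D density volume) can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: the function name `voxelise_waveforms`, the flat `(N, 3)` sample layout, and the uniform voxel size are all assumptions for the example.

```python
import numpy as np

def voxelise_waveforms(samples, intensities, origin, voxel_size, shape):
    """Accumulate waveform sample intensities into a 3D density volume.

    samples:     (N, 3) world coordinates of waveform samples (assumed layout)
    intensities: (N,)   recorded amplitude of each sample
    origin:      (3,)   world coordinate of the volume's corner voxel
    voxel_size:  scalar edge length of a cubic voxel (assumed uniform)
    shape:       (nx, ny, nz) dimensions of the output volume
    """
    volume = np.zeros(shape, dtype=np.float32)
    # Map world coordinates to integer voxel indices.
    idx = np.floor((np.asarray(samples) - origin) / voxel_size).astype(int)
    # Discard samples falling outside the volume bounds.
    valid = np.all((idx >= 0) & (idx < np.array(shape)), axis=1)
    # np.add.at accumulates correctly even when several samples
    # land in the same voxel (unlike plain fancy-indexed assignment).
    np.add.at(volume, tuple(idx[valid].T), intensities[valid])
    return volume
```

The resulting density volume can then be polygonised with an off-the-shelf Marching Cubes implementation, e.g. `skimage.measure.marching_cubes(volume, level=iso)`, where the iso-level threshold separates occupied from empty space.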
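The Bayesian tree-coverage step could, for instance, take the form of a naive Bayes classifier over the aligned metrics (hyperspectral band standard deviation, waveform width). The abstract does not specify the model's likelihoods or priors, so the Gaussian class-conditional densities, the two-class "tree"/"ground" setup, and the parameter dictionaries below are illustrative assumptions only.

```python
import numpy as np

def posterior_tree(metrics, mu, sigma, prior_tree=0.5):
    """P(tree | metrics) under an assumed naive Gaussian-likelihood Bayes model.

    metrics: (d,) aligned metric values for one cell
    mu, sigma: dicts keyed by 'tree' and 'ground', each mapping to (d,)
               per-metric means / standard deviations (hypothetical parameters)
    """
    def log_lik(cls):
        z = (metrics - mu[cls]) / sigma[cls]
        # Sum of per-metric Gaussian log-densities (naive independence assumption).
        return -0.5 * np.sum(z ** 2 + np.log(2.0 * np.pi * sigma[cls] ** 2))

    lt = log_lik('tree') + np.log(prior_tree)
    lg = log_lik('ground') + np.log(1.0 - prior_tree)
    # Normalise in log space for numerical stability.
    m = max(lt, lg)
    return np.exp(lt - m) / (np.exp(lt - m) + np.exp(lg - m))
```

Thresholding the posterior (e.g. at 0.5) for every cell of the aligned grid yields a tree coverage map that can be projected onto the polygon mesh.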