Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XXXIX-B1, 71-76, 2012
© Author(s) 2012. This work is distributed
under the Creative Commons Attribution 3.0 License.
23 Jul 2012
FUSING PASSIVE AND ACTIVE SENSED IMAGES TO GAIN INFRARED-TEXTURED 3D MODELS
M. Weinmann1, L. Hoegner2, J. Leitloff1, U. Stilla2, S. Hinz1, and B. Jutzi1
1Institute of Photogrammetry and Remote Sensing, Karlsruhe Institute of Technology (KIT), Germany
2Institute of Photogrammetry and Cartography, Technische Universität München (TUM), Germany
Keywords: Point Cloud, Imagery, Sequences, Multisensor, LIDAR, Thermal, Infrared, Close Range

Abstract. Obtaining a 3D description of man-made and natural environments is a basic task in Computer Vision, Photogrammetry and Remote Sensing. New active sensors provide the possibility of capturing range information in image form with a single measurement. This image-based active ranging allows for capturing dynamic scenes, e.g. with moving pedestrians or moving vehicles. The currently available range imaging devices usually operate within the close-infrared domain and capture range as well as active and passive intensity images. Depending on the application, a 3D description with additional spectral information such as thermal-infrared data can be helpful and offers new opportunities for detecting and interpreting human subjects and interactions. Combining thermal-infrared data with range information is therefore promising. In this paper, an approach for mapping thermal-infrared data onto range data is proposed. First, a camera calibration is carried out for the range imaging system (PMD[vision] CamCube 2.0) and the thermal-infrared system (InfraTec VarioCAM hr). Subsequently, a registration of close-infrared and thermal-infrared intensity images derived from the different sensor devices is performed. In this context, wavelength-independent properties are selected in order to derive point correspondences between the different spectral domains. Finally, the thermal-infrared images are enhanced with information derived from data acquired with the range imaging device, and the enhanced IR texture is projected onto the respective 3D point cloud data to gain appropriate infrared-textured 3D models. The feasibility of the proposed methodology is demonstrated for an experimental setup which is well-suited for investigating the proposed possibilities. Hence, the presented work is a first step towards the development of methods for combined thermal-infrared and range representation.
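The final step of the pipeline, projecting a registered thermal-infrared texture onto the 3D point cloud, can be illustrated with a minimal pinhole-projection sketch. This is not the authors' implementation; the function name, the simple nearest-pixel sampling, and the assumption of already-calibrated intrinsics `K` and extrinsics `R`, `t` (from the camera calibration and registration steps described above) are illustrative choices.

```python
import numpy as np

def texture_point_cloud(points, thermal_img, K, R, t):
    """Sample thermal-infrared intensities for each 3D point.

    points      : (N, 3) point cloud in world coordinates
    thermal_img : (H, W) thermal-infrared intensity image
    K           : (3, 3) intrinsic matrix of the thermal camera
    R, t        : world-to-camera rotation (3, 3) and translation (3,)
    Returns an (N,) array of intensities; NaN where a point falls
    behind the camera or outside the image.
    """
    cam = points @ R.T + t                         # world -> camera frame
    intensities = np.full(len(points), np.nan)
    in_front = cam[:, 2] > 0                       # discard points behind camera
    uvw = cam[in_front] @ K.T                      # pinhole projection
    uv = np.round(uvw[:, :2] / uvw[:, 2:3]).astype(int)  # nearest pixel
    h, w = thermal_img.shape
    valid = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    idx = np.flatnonzero(in_front)[valid]
    intensities[idx] = thermal_img[uv[valid, 1], uv[valid, 0]]
    return intensities
```

In practice the sampling would use the enhanced IR texture after registration, and sub-pixel interpolation rather than nearest-pixel lookup, but the geometric core (transform to the thermal camera frame, project, sample) is as above.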
Citation: Weinmann, M., Hoegner, L., Leitloff, J., Stilla, U., Hinz, S., and Jutzi, B.: FUSING PASSIVE AND ACTIVE SENSED IMAGES TO GAIN INFRARED-TEXTURED 3D MODELS, Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XXXIX-B1, 71-76, doi:10.5194/isprsarchives-XXXIX-B1-71-2012, 2012.