Volume XL-3/W2
Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XL-3/W2, 53-58, 2015
https://doi.org/10.5194/isprsarchives-XL-3-W2-53-2015
© Author(s) 2015. This work is distributed under
the Creative Commons Attribution 3.0 License.

10 Mar 2015

EXTRACTING SEMANTICALLY ANNOTATED 3D BUILDING MODELS WITH TEXTURES FROM OBLIQUE AERIAL IMAGERY

D. Frommholz, M. Linkiewicz, H. Meissner, D. Dahlke, and A. Poznanska
  • DLR Optical Sensor Systems, Berlin-Adlershof, Germany

Keywords: Aerial, Camera, City, Extraction, Virtual Reality, Point Cloud, Texture

Abstract. This paper proposes a method for the reconstruction of city buildings with automatically derived textures that can be directly used for façade element classification. Oblique and nadir aerial imagery recorded by a multi-head camera system is transformed into dense 3D point clouds and evaluated statistically in order to extract the hull of the structures. For the resulting wall, roof and ground surfaces, high-resolution polygonal texture patches are calculated and compactly arranged in a texture atlas without resampling. The façade textures are subsequently analyzed by a commercial software package to detect possible windows, whose contours are projected into the original oriented source images and sparsely ray-cast to obtain their 3D world coordinates. After the windows have been reintegrated into the previously extracted hull, the final building models are stored as semantically annotated CityGML "LOD-2.5" objects.
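The ray-casting step mentioned in the abstract can be illustrated with a minimal sketch: a window contour pixel is back-projected through an oriented camera into a world-space ray, which is then intersected with the façade plane to recover the 3D coordinate. All numbers and helper names below are hypothetical and not taken from the paper; the camera model is the standard pinhole model with intrinsics K, world-to-camera rotation R, and projection center C.

```python
import numpy as np

def pixel_ray(K, R, C, uv):
    """Back-project pixel uv into a world-space ray (origin, unit direction).
    K: 3x3 intrinsic matrix, R: world-to-camera rotation, C: camera center."""
    d_cam = np.linalg.inv(K) @ np.array([uv[0], uv[1], 1.0])
    d_world = R.T @ d_cam                      # rotate direction into world frame
    return C, d_world / np.linalg.norm(d_world)

def intersect_plane(origin, direction, n, d):
    """Intersect the ray with the plane n.x + d = 0; None if parallel/behind."""
    denom = n @ direction
    if abs(denom) < 1e-9:
        return None
    t = -(n @ origin + d) / denom
    return origin + t * direction if t > 0 else None

# Hypothetical setup: focal length 1000 px, principal point (500, 500),
# camera 10 m above a ground plane z = 0, looking straight down.
K = np.array([[1000.0, 0.0, 500.0],
              [0.0, 1000.0, 500.0],
              [0.0, 0.0, 1.0]])
R = np.diag([1.0, -1.0, -1.0])                 # 180-degree rotation about x
C = np.array([0.0, 0.0, 10.0])
origin, direction = pixel_ray(K, R, C, (500.0, 500.0))
P = intersect_plane(origin, direction, np.array([0.0, 0.0, 1.0]), 0.0)
print(P)  # -> [0. 0. 0.]
```

In the paper's pipeline the plane would be a fitted wall plane rather than the ground, and only a sparse subset of contour pixels is cast to keep the cost low; the geometry of each cast is otherwise the same as above.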