Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XL-3/W2, 53-58, 2015
http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-3-W2/53/2015/
doi:10.5194/isprsarchives-XL-3-W2-53-2015
© Author(s) 2015. This work is distributed
under the Creative Commons Attribution 3.0 License.
 
Published: 10 Mar 2015
EXTRACTING SEMANTICALLY ANNOTATED 3D BUILDING MODELS WITH TEXTURES FROM OBLIQUE AERIAL IMAGERY
D. Frommholz, M. Linkiewicz, H. Meissner, D. Dahlke, and A. Poznanska
DLR Optical Sensor Systems, Berlin-Adlershof, Germany
Keywords: Aerial, Camera, City, Extraction, Virtual Reality, Point Cloud, Texture

Abstract. This paper proposes a method for the reconstruction of city buildings with automatically derived textures that can be used directly for façade element classification. Oblique and nadir aerial imagery recorded by a multi-head camera system is transformed into dense 3D point clouds and evaluated statistically in order to extract the hull of the structures. For the resulting wall, roof and ground surfaces, high-resolution polygonal texture patches are calculated and compactly arranged in a texture atlas without resampling. The façade textures are subsequently analyzed by a commercial software package to detect possible windows, whose contours are projected into the original oriented source images and sparsely ray-cast to obtain their 3D world coordinates. With the windows reintegrated into the previously extracted hull, the final building models are stored as semantically annotated CityGML "LOD-2.5" objects.
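To illustrate the ray-casting step described in the abstract, the sketch below intersects the viewing ray through a detected window corner with the reconstructed wall plane. This is only a minimal illustration assuming a standard pinhole camera model; the function and parameter names are hypothetical and do not come from the paper or its implementation.

```python
# Hypothetical sketch: cast a ray from an oriented aerial image through a
# detected window-corner pixel and intersect it with an extracted wall plane
# to recover the corner's 3D world coordinates (assumed pinhole model).
import numpy as np

def pixel_to_world_on_plane(pixel, K, R, C, plane_point, plane_normal):
    """Return the 3D intersection of the pixel's viewing ray with a plane.

    pixel        -- (u, v) image coordinates of a detected window corner
    K            -- 3x3 camera calibration matrix
    R            -- 3x3 rotation from world to camera coordinates
    C            -- projection centre of the camera in world coordinates
    plane_point  -- any 3D point on the reconstructed wall plane
    plane_normal -- unit normal vector of that plane
    """
    # Viewing-ray direction in world coordinates (inverse pinhole projection)
    uv1 = np.array([pixel[0], pixel[1], 1.0])
    ray_dir = R.T @ np.linalg.inv(K) @ uv1
    ray_dir /= np.linalg.norm(ray_dir)

    # Ray-plane intersection: find t so that C + t * ray_dir lies on the plane
    denom = ray_dir @ plane_normal
    if abs(denom) < 1e-9:
        return None  # ray is (nearly) parallel to the facade plane
    t = ((np.asarray(plane_point) - np.asarray(C)) @ plane_normal) / denom
    if t <= 0:
        return None  # intersection lies behind the camera
    return np.asarray(C) + t * ray_dir
```

Applying this to the projected window contour vertices yields the 3D outlines that can then be reintegrated into the building hull before export, broadly in the spirit of the pipeline summarized above.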


Citation: Frommholz, D., Linkiewicz, M., Meissner, H., Dahlke, D., and Poznanska, A.: EXTRACTING SEMANTICALLY ANNOTATED 3D BUILDING MODELS WITH TEXTURES FROM OBLIQUE AERIAL IMAGERY, Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XL-3/W2, 53-58, doi:10.5194/isprsarchives-XL-3-W2-53-2015, 2015.
