3D CHANGE DETECTION OF POINT CLOUDS BASED ON DENSITY ADAPTIVE LOCAL EUCLIDEAN DISTANCE

ABSTRACT: With the development of sensors and multi-view stereo matching technology, image-based dense matching point cloud data offers higher geometric accuracy and richer spectral information, and such data is therefore widely used in change detection research. Due to the inconsistent position and attitude of image acquisition when generating the two phases of point clouds, as well as the seasonal variation of vegetation, 3D change detection is often subject to false detection. To improve the accuracy of 3D change detection of point clouds over large areas, a 3D change detection method based on density adaptive local Euclidean distance is proposed. The method consists of three steps: (1) calculating the local Euclidean distance from each point in the second-phase point clouds to its k nearest neighboring points in the first-phase point clouds; (2) improving the local Euclidean distance using the local density and performing 3D change detection according to a given threshold; (3) clustering the change detection results using Euclidean clustering and eliminating false detection areas according to a given area threshold. The experiments show that changed regions can be extracted more reliably by the proposed method.


INTRODUCTION
Change detection is a technique that obtains change information about objects in a region by processing and comparing multi-temporal data covering the same area, and analyzes that change information qualitatively and quantitatively (Sirmacek and Unsalan, 2009). As one of the main research directions of remote sensing, change detection is not only important in land management, disaster assessment, and environmental monitoring (Pang et al., 2014), but also helpful for the construction and updating of digital cities and smart cities (Sui et al., 2018); it has therefore attracted wide attention from researchers in recent years.
According to the number of data dimensions, change detection is mainly divided into two aspects: 2D change detection and 3D change detection (Qin et al., 2016). 2D change detection uses aerial images and satellite images as data sources. Although 2D change detection methods have been developed for many years, they are still limited by the perspective distortion, grayscale nonlinearity, and shadowing of images (Yang, 2019). 3D change detection is mainly based on Light Detection and Ranging (LiDAR) point clouds and image dense matching point clouds as data sources. Because 3D point clouds contain elevation information, they can reflect the geometric properties of ground objects, and geometric changes of ground objects (new construction, demolition, etc.) are intense and less likely to be misjudged, so 3D change detection can obtain better information on changes in ground objects (de Gélis et al., 2021). With the development of sensor technology and unmanned aerial vehicles (UAV) in recent years, the accuracy and quality of point clouds have been increasing while their cost has dropped significantly, so researchers have started to conduct studies on 3D change detection (Xu et al., 2021).
*Corresponding author: Yunsheng Zhang (zhangys@csu.edu.cn)
According to the detection unit, 3D change detection can be mainly divided into two types of methods: point-by-point and object-based (Qin et al., 2016). Point-by-point 3D change detection usually uses point-wise height differences, Euclidean distances, or graph-cut methods to extract candidate change regions, after which geometric structure, texture, or color information is used for post-refinement (Yang et al., 2021). Chaabouni et al. (2010) extracted change areas by differencing two-phase digital elevation models (DEM), and then used morphological opening and closing to optimize the detected areas. Teo et al. (2013) extracted vegetation and buildings from digital surface models (DSM) based on surface roughness, and then used geometric analysis to detect building changes and determine the type of change. Du et al. (2016) fused the height difference feature of LiDAR point clouds with the color feature of aerial images and extracted change areas using the graph-cut algorithm. Pang et al. (2018) generated a DSM from dense matching point clouds, and then calculated the nDSM and dDSM as features to extract changed areas using the grab-cut algorithm. Point-by-point 3D change detection methods place high requirements on the quality and alignment of the point clouds, and they have limitations in areas affected by seasonal vegetation change.
Object-based 3D change detection methods usually classify the data of different phases and then compare the classification results of the point clouds to extract change areas. Matikainen et al. (2010) extracted buildings using a decision tree based on airborne laser scanning data and images, and then compared the extracted buildings with existing building maps to identify changed buildings. Pang et al. (2014a) used point clouds to generate a DSM, smoothed the buildings in the region based on connectivity analysis, and finally obtained the changed areas from the building extraction results. Qin et al. (2015) performed object segmentation with elevation constraints on DSM-assisted images generated from multi-view dense matching point clouds. They then used a support vector machine (SVM) to classify the segmentation results and merged segmented objects of the same category. On this basis an initial change index was calculated and updated according to the mutual coverage of the merged objects across phases, and the changed areas were finally extracted by a double-threshold judgment on the change index.
Compared with point-by-point 3D change detection, object-based 3D change detection has lower requirements on the quality and alignment of the point clouds, but its detection accuracy is affected by the accuracy of classification and segmentation.
In this paper, we propose a 3D change detection method for point clouds based on density adaptive local Euclidean distance. First, for each point in the second-phase point clouds, the k nearest neighboring points are searched in the first-phase point clouds, and the local Euclidean distance and local density are calculated. Second, candidate change areas are extracted based on a given threshold. Finally, pseudo-change areas are eliminated from the candidate change areas using Euclidean clustering and top-surface analysis. The main workflow of the method is shown in Figure 1.

Figure 1. Workflow of the proposed method

METHOD

ICP Point Cloud Registration
Currently, both LiDAR point cloud data and image-based dense matching point cloud data are recorded in a local coordinate system, so only fine registration is needed, without an initial alignment. The ICP (Iterative Closest Point) algorithm, proposed by Besl and McKay in 1992 and also known as the iterative closest point algorithm, is a fine registration algorithm widely used for point cloud registration (Besl and McKay, 1992b). The ICP algorithm iteratively updates the corresponding points between the two point clouds and estimates the rigid-body transformation that minimizes the distance between them. Suppose there are initial point clouds P and target point clouds Q. According to certain constraints, we find the nearest point pairs (p_i, q_i) and use least squares to calculate the best matching parameters R and t, so that the error function of equation (1) is minimized:

E(R, t) = (1/N) Σ_{i=1}^{N} || q_i − (R p_i + t) ||²   (1)

where R is the rotation matrix, t is the translation vector, p_i is a point in point clouds P, and q_i is the nearest neighboring point of p_i in point clouds Q.

The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Volume XLIII-B2-2022, XXIV ISPRS Congress (2022 edition), 6-11 June 2022, Nice, France
The ICP algorithm has the following steps: (1) Randomly select a subset of points M from point clouds P; (2) Search for the nearest point in point clouds Q for each point of M as its corresponding point; (3) Remove point pairs whose distance is greater than the average distance, and take the remaining pairs as the final correspondences; (4) Calculate the coordinate transformation matrix (by unit quaternion or SVD) from the corresponding points; (5) Construct and calculate the error evaluation function; (6) Determine whether the value of the error evaluation function meets the required accuracy: if it does, stop the iteration; otherwise, return to step (1) and update the target point clouds.
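The steps above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: it uses all points instead of a random subset, brute-force nearest-neighbour correspondence, and the SVD solution for the rigid transform of equation (1).

```python
import numpy as np

def best_rigid_transform(P, Q):
    """Least-squares rigid transform (R, t) mapping P onto Q via SVD."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # correction term to avoid reflections
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

def icp(P, Q, iters=50, tol=1e-8):
    """Minimal ICP loop: correspond, solve, transform, check convergence."""
    P = P.copy()
    prev_err = np.inf
    for _ in range(iters):
        # step (2): nearest point in Q for every point of P (brute force)
        d2 = ((P[:, None, :] - Q[None, :, :]) ** 2).sum(-1)
        corr = Q[d2.argmin(axis=1)]
        # steps (4)-(5): solve for R, t and evaluate the error function (1)
        R, t = best_rigid_transform(P, corr)
        P = P @ R.T + t
        err = np.mean(((P - corr) ** 2).sum(-1))
        if abs(prev_err - err) < tol:          # step (6): convergence test
            break
        prev_err = err
    return P, err
```

For a cloud displaced by a small rotation and translation, the loop recovers the alignment in a few iterations; real pipelines would additionally subsample P and reject distant pairs as in steps (1) and (3).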

Local Euclidean Distance
At present, point-by-point 3D change detection methods usually consider only the distance between the point under consideration and a single nearest neighboring point as the change measure. When there are noise points in the point clouds, detection accuracy is then difficult to guarantee. To reduce the interference of noise points, the proposed method uses the average Euclidean distance to the k nearest neighboring points as the change measure. There are special cases when searching for the k nearest neighboring points in 3D: when the detected point is located on the edge of a changed building, the search results tend to lie on the wall (as shown in Figure 2(a)), which makes the calculated distance smaller than the true distance (as shown in Figure 2(b)). To solve this problem, a k nearest neighbor search in the horizontal direction is introduced, but this search mode does not apply to all regions, such as sloped regions (as shown in Figure 2(c)): there the distance between the neighboring points and the detected point is often greater than in the real situation (as shown in Figure 2(d)), which causes false detection. Therefore, it is necessary to determine the state of the point and decide whether to search for neighboring points in the 3D direction or in the horizontal direction: if the angle between the normal of the point to be detected and the vertical direction is less than a given angle threshold, the k nearest neighboring points are searched in the horizontal direction; otherwise they are searched in the 3D direction.
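The direction-dependent neighbor search can be sketched as below. This is an illustrative reading of the rule, assuming the angle test is applied to the point's normal vector; the kd-trees would be built once per cloud in practice rather than per query.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_adaptive(p, normal, cloud, k=15, angle_thresh_deg=20.0):
    """Search the k nearest neighbours of p in `cloud`: in the horizontal
    plane when p's normal is near-vertical (roof/ground-like points),
    otherwise in full 3D (wall/slope-like points)."""
    cosang = abs(normal[2]) / np.linalg.norm(normal)
    angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    if angle < angle_thresh_deg:
        # horizontal search: ignore elevation to avoid picking wall points
        _, idx = cKDTree(cloud[:, :2]).query(p[:2], k=k)
    else:
        _, idx = cKDTree(cloud).query(p, k=k)
    return cloud[np.atleast_1d(idx)]
```

For a roof-like point above a demolished building, the horizontal search returns ground points directly below it, so the computed distance reflects the full height change rather than the distance to a nearby wall remnant.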
When searching for the k nearest neighboring points in the horizontal direction, it is possible to find neighboring points at several different elevations due to the influence of vegetation and complex buildings, which can also lead to false detection in some areas. Therefore, different cases need to be handled: (1) Euclidean clustering is performed on the k nearest neighboring points obtained from the search, and if the number of clusters is greater than 1, the search is considered to have found points at multiple elevations; (2) if the number of clusters is 2 and the angle between the normal vector and the vertical direction is less than the given angle threshold, the k points are considered to lie on the edge of an artificial object, and the cluster farthest from the point to be detected is selected; (3) if the number of clusters is greater than 2, the area is considered to be vegetation, and the cluster closest to the point to be detected is selected. The local Euclidean distance can then be calculated by equation (2):

d = (1/k) Σ_{i=1}^{k} sqrt( (x_i − x)² + (y_i − y)² + (z_i − z)² )   (2)

where d is the local Euclidean distance, k is the number of nearest neighboring points, p1_i is a neighboring point of p2 in the first-phase point clouds, p2 is the point under consideration in the second-phase point clouds, (x_i, y_i, z_i) are the coordinates of p1_i, and (x, y, z) are the coordinates of p2.
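Equation (2) is simply the mean of the point-to-neighbour distances, for example:

```python
import numpy as np

def local_euclidean_distance(p2, neighbors):
    """Equation (2): mean Euclidean distance from the second-phase point p2
    to its k nearest first-phase neighbours."""
    return float(np.mean(np.linalg.norm(neighbors - p2, axis=1)))
```

Averaging over k neighbours is what gives the measure its robustness: a single noisy first-phase point shifts the result by only 1/k of its error.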

Density Adaptive Local Euclidean Distance
When the densities of the two-phase point clouds are not the same or there are registration errors, the distance between a point in the second-phase point clouds and its neighboring points in the first-phase point clouds is often greater than the distance to the true neighboring points, so the local Euclidean distance is improved in this paper using the local density. Calculating the local density requires the center of the k nearest neighboring points, which can be calculated by equation (3):

c = (1/k) Σ_{i=1}^{k} p1_i   (3)

After obtaining the center c of the k nearest neighboring points in the first-phase point clouds, its own k nearest neighboring points are searched in the first-phase point clouds, and the local density can then be calculated by equation (4):

ρ = k / (π r²)   (4)

where ρ is the local density and r is the maximum distance from the neighboring points to the center.
After obtaining the local density, we can use it to improve the local Euclidean distance by equation (5):

d' = d − 1/√ρ   (5)

where d' is the density adaptive local Euclidean distance. If the density adaptive local Euclidean distance of a point is greater than the given threshold, it is judged as a candidate change point; otherwise it is considered an unchanged point.
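A compact sketch of this correction is given below. Note the assumptions: the original equations (4)-(5) are garbled in this copy, so the planar density ρ = k/(π r²) and the spacing correction d' = d − 1/√ρ are one plausible reading consistent with the symbols defined in the text, not necessarily the authors' exact formulas; the density is also computed here from the same neighbour set rather than from a second search around the centroid.

```python
import numpy as np

def density_adaptive_distance(neighbors, d_local):
    """Density adaptive local Euclidean distance (sketch).
    ASSUMED formulas: rho = k / (pi * r^2), with r the max distance to the
    neighbour centroid, and d' = d - 1/sqrt(rho), i.e. the local distance
    minus the expected point spacing at density rho."""
    center = neighbors.mean(axis=0)                       # eq. (3): centroid
    r = np.linalg.norm(neighbors - center, axis=1).max()  # max distance to centre
    rho = len(neighbors) / (np.pi * r ** 2)               # assumed eq. (4)
    return d_local - 1.0 / np.sqrt(rho)                   # assumed eq. (5)
```

The intuition carries regardless of the exact formula: the sparser the first-phase neighbourhood, the larger the distance inflation to subtract before thresholding.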

Remove False Detection Area
Because images may be occluded or the image overlap may be insufficient, some building walls are deformed in the image dense matching point clouds, and these areas are often mistakenly detected as changed. To solve this problem, the points in the candidate change areas need to be re-judged: (1) Search for the k nearest neighboring points of the candidate change point in the horizontal direction, and apply Euclidean clustering to these k neighboring points; (2) if there are multiple clusters and the number of clusters is less than 4, search for a point whose normal makes an angle with the vertical direction smaller than the given angle threshold; if such a point exists, take the highest point of its cluster as the top point of the current point, and judge the state of the current point by the change state of this top point; (3) if there are multiple clusters and the number of clusters is greater than 3, find the point with the largest elevation among the k nearest neighboring points; if that point has changed, the current point remains changed, and if it has not changed, the current point is set to unchanged.
Due to the limitation of image resolution and the dense matching algorithm, some vegetation areas in the point clouds also show distortion. To reduce the false detection caused by vegetation distortion, the candidate change points are Euclidean clustered and the area of each cluster is estimated. The area of a cluster can be estimated by equation (6):

S = N · d̄²   (6)

where S is the estimated area of the cluster, N is the number of points in the cluster, and d̄ is the average point spacing. If the estimated area is smaller than the given area threshold, the cluster is considered a pseudo-change region and is removed from the candidate change points.
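The clustering and area filter can be sketched as follows. The region-growing clustering is a generic stand-in for Euclidean clustering (not the paper's implementation), and reading d̄ as the average point spacing in equation (6) is an assumption made explicit in the comments.

```python
import numpy as np
from scipy.spatial import cKDTree

def euclidean_clusters(points, radius):
    """Label points by simple Euclidean clustering: region growing where two
    points share a cluster if linked by steps shorter than `radius`."""
    tree = cKDTree(points)
    labels = np.full(len(points), -1, dtype=int)
    current = 0
    for seed in range(len(points)):
        if labels[seed] != -1:
            continue
        stack = [seed]
        labels[seed] = current
        while stack:
            i = stack.pop()
            for j in tree.query_ball_point(points[i], radius):
                if labels[j] == -1:
                    labels[j] = current
                    stack.append(j)
        current += 1
    return labels

def remove_small_clusters(points, labels, mean_spacing, area_thresh):
    """Equation (6), assuming d_bar = average point spacing: S = N * d_bar**2.
    Keep only clusters whose estimated area reaches the area threshold."""
    keep = np.zeros(len(points), dtype=bool)
    for c in np.unique(labels):
        mask = labels == c
        if mask.sum() * mean_spacing ** 2 >= area_thresh:
            keep |= mask
    return points[keep]
```

With the paper's parameters (about 12 pts/m², area threshold 12 m²), a cluster needs on the order of 140 points to survive the filter, which is what suppresses small vegetation artifacts.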

Experimental data
To verify the effectiveness of the proposed method, we conducted experiments with image dense matching point cloud data acquired in June 2020 and September 2020; the data contain three experimental areas, as shown in Figure 3. The details are as follows: (1) The first experimental area (area 1) is shown in the first column of Figure 3; the first-phase point clouds contain 2,725,260 points with a density of about 12 pts/m², as shown in Figure 3(a), and the second-phase point clouds contain 2,767,663 points with a density of about 12 pts/m², as shown in Figure 3(d).
(2) The second experimental area (area 2) is shown in the second column of Figure 3; the first-phase point clouds contain 2,808,883 points with a density of about 12 pts/m², as shown in Figure 3(b), and the second-phase point clouds contain 2,729,196 points with a density of about 12 pts/m², as shown in Figure 3.
To quantitatively evaluate the proposed method, we manually selected the changed areas in the three experimental areas as references. Due to the presence of vegetation, soil, and vehicles in the experimental areas, and the limitations of resolution, alignment accuracy, point cloud accuracy, and subjective factors, we only selected areas with significant changes and excluded points with a density adaptive distance of less than 0.5 m in the changed areas.
The quantitative evaluation metrics used in this paper are precision, recall, and F1-score:

precision = TP / (TP + FP)
recall = TP / (TP + FN)
F1 = 2 · precision · recall / (precision + recall)

where TP is the number of points in correctly detected changed regions, FP is the number of points in pseudo-changed regions, and FN is the number of points in changed regions mistakenly detected as unchanged.
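These standard definitions translate directly to code:

```python
def prf_scores(tp, fp, fn):
    """Precision, recall and F1-score from change-detection point counts:
    tp = correctly detected changed points, fp = pseudo-change points,
    fn = changed points missed as unchanged."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1
```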

Experimental results and discussion
3D change detection was carried out for areas 1, 2 and 3 using the proposed method, with the relevant parameters set as follows: the number of searched nearest neighboring points was 15; the threshold of the angle between the normal vector and the vertical (plumb) direction was 20°; the threshold of the density adaptive local Euclidean distance was 1.2 m; and the area threshold was 12 m².
The 3D change detection results of the proposed method are shown in Figure 5. Overall, the proposed method not only detected the large changed buildings fairly completely, but was also able to detect some small changed artificial objects, and the precision, recall and F1-score of all three areas reach more than 80%. Since the three experimental areas differ in ground object type and distribution, and their change detection results differ accordingly, each area is further analyzed in four aspects: large buildings, small artificial objects, bare ground and vegetation. For area 1, the proposed method completely detected the large changed building in the middle of the area, but only part of the small artificial objects in the lower left corner were detected. After inspecting the original data, we found that the area of these objects was too small, which led to their false detection as unchanged. There are also some irregularly shaped areas in the detection results; comparison with the original data shows that most of them come from vegetation and bare soil whose local Euclidean distance is larger than 1.2 m, largely the result of natural and human factors.
For area 2, the proposed method not only detected the large building in the upper right, but also detected most of the small artificial objects. In the lower left there are some irregularly shaped areas; as in area 1, these are mainly changed bare soil, also caused by natural and human factors.
For area 3, the proposed method completely detected the large changed building in the second-phase point clouds, but part of it is missing in the change detection result of the first-phase point clouds. Comparison with the original point clouds shows that this part is caused by the recessed wall of the building: the ground points under the building were reconstructed, so the local Euclidean distance with respect to the second-phase point clouds is small, and this part was therefore mistakenly detected as unchanged. Most of the artificial objects were detected, and the parts with large soil variation were also detected.
To objectively verify the effectiveness of the proposed method, we also used the traditional Euclidean distance segmentation change detection method to detect changed areas in the three experimental areas. Figure 6 shows the change detection results using traditional Euclidean distance segmentation, and the corresponding accuracy assessment is shown in Table 2.
Compared with the proposed method, the accuracy of the traditional Euclidean distance segmentation method is lower, both for the change detection of the first-phase point clouds and for that of the second-phase point clouds. Moreover, because the traditional Euclidean distance segmentation method has the neighbor search defect shown in Figure 2(a), its accuracy for the first-phase point clouds change detection is extremely low.
For both the proposed method and the traditional Euclidean distance segmentation method, the overall accuracy of the second-phase change detection results is clearly lower than that of the first-phase results. The main reason is that the three experimental areas contain a large number of demolished artificial objects, so the second-phase change detection results contain many ground points; since the number of these ground points is much smaller than the number of points on the corresponding walls, the changed areas are misjudged because of their small estimated area.
The effectiveness of the proposed change detection method depends on the value of the density adaptive local Euclidean distance threshold. Figure 7 shows how precision, recall and F1-score vary with the distance threshold when the area threshold is fixed. For all three experimental areas, precision decreases with an increasing distance threshold, recall increases with it, and the F1-score first increases and then decreases, so a maximum F1-score can be obtained.
In the experimental data of this paper, the recommended density adaptive local Euclidean distance threshold is 1.2 m and the recommended area threshold is 12 m². In practical applications, however, there is usually no ground truth to determine the best thresholds. We believe that the values of the distance and area thresholds depend on the quality of the point clouds and on the objects of interest, so no single best fixed values can be defined. For point clouds of good quality and small registration error, detecting more small changed artificial objects requires smaller threshold values, but smaller values may leave vegetation, bare soil and other objects in the results; conversely, if only large changed buildings and large artificial objects are of interest, larger threshold values are needed, but larger values may lose detail in the detection results. If the point cloud quality is poor and the registration error is large, the thresholds must be set large enough, because small values will lead to a large number of false detections.

CONCLUSIONS
In this paper, we proposed a density adaptive local Euclidean distance method for detecting changed areas in 3D point clouds. The method uses normal vectors to judge the detected points in order to select the best neighboring points, so that the calculated local Euclidean distance better reflects the real change. Considering that the different densities of the two point clouds make the calculated local Euclidean distance slightly larger than the real one, the local Euclidean distance is improved by the local density. Furthermore, the top-point judgment effectively reduces false detections caused by building deformation in the point clouds, and Euclidean clustering is used to remove small objects. The experimental results show that, with reasonable threshold settings, the proposed method achieves good change detection accuracy.
The proposed method also has some shortcomings; for example, the change detection accuracy depends strongly on the distance and area thresholds, and the detection results may be poor if these values do not match the change regions of interest. In the future, we will improve the proposed method, for example by adding color information and geometric features.