ACCURACY ANALYSIS OF REAL-TIME OBJECT POSITIONING WITHOUT GCP FOR IMAGES FROM UAV OBLIQUE ULTRA-LONG FOCAL SMALL VIEW FIELD WHISKBROOM CAMERA SYSTEM

Using images from a UAV oblique ultra-long focal small view field whiskbroom camera (ULF-SVF-WC) system for object positioning and mapping is more difficult than conventional aerial photogrammetry because of the particularity of the oblique ULF-SVF-WC imaging mode. Therefore, the precision and accuracy of its object positioning also differ considerably from those of conventional UAV photography. In this paper, we analyse the accuracy of real-time object positioning without ground control points (GCPs) for images from a UAV oblique ULF-SVF-WC system. Firstly, we study the imaging principles and characteristics of the oblique ULF-SVF-WC system. Secondly, we establish the coordinate transformation from object point to image point and construct a strict imaging model for the oblique ULF-SVF-WC image, which is used for real-time single-ray back-projection positioning assisted by a DEM. Thirdly, we quantitatively analyse the distribution and variation of the single-ray back-projection errors of oblique ULF-SVF-WC images in theory, based on the error propagation law and simulation data. Finally, we conduct an experiment with real oblique ULF-SVF-WC flight images to analyse the actual positioning accuracy. The experimental results show that the influence of systematic error on positioning generally conforms to the distribution and variation of the theoretical precision, i.e. the accuracy in the scanning direction is much lower than that in the flight direction, while the accuracy of actual single-ray back-projection positioning is evidently lower than that of the theoretical analysis and there are obvious systematic errors in the positioning residuals. This indicates that the 6 exterior orientation elements calculated from the POS data contain an obvious systematic error whose influence is greater than the random error, and this error should be eliminated in real-time single-ray back-projection for object positioning.


INTRODUCTION
The UAV oblique ultra-long focal small view field whiskbroom camera (ULF-SVF-WC) system can obtain large-scale observation information far away from the nadir point, together with high-resolution images (owing to the long focal length), which makes up for the shortcomings of conventional low-altitude UAV photography; it has been widely used in many fields, such as environmental monitoring and the surveying and mapping of inaccessible areas. However, with the large oblique whiskbroom imaging mode, the collected images usually cover a small area, show large deformation and have very low overlaps, which makes object positioning far more difficult than in conventional UAV aerial photography. Previous studies mainly focused on target detection and interpretation. Now, with the development of photogrammetry, researchers have paid more attention to how to obtain the 3D object-space coordinates of ground objects from this kind of image. For positioning with POS data and a DEM, the 3D coordinates of ground objects can theoretically be obtained in real time without triangulation, and this positioning method has advantages in adaptability and time cost for areas where it is difficult to measure ground control points, such as border dispute areas or areas with a harsh natural environment (Iyengar and Lange, 2006).
In the past, airborne area-array whiskbroom sensors were mainly used for rapid ground observation, military reconnaissance and similar tasks; examples include the A3 Edge camera of VisionMap, the DB-110 camera of Goodrich, the CA-295 camera of ROI and the Global Eagle camera of Raytheon in the United States (Iyengar and Lange, 2006; Olson, 2002). The performance parameters of several typical airborne area-array whiskbroom sensors are shown in Tab. 1 (in the table, "VL" denotes a visible-light camera lens and "IL" an infrared-light camera lens) (Iyengar et al., 2006; Parsonage, 2008; ROI company, 2002).
For airborne area-array whiskbroom sensors, former research mainly focused on design and manufacture, while there was little research on geometric processing, especially object positioning. Mo Delin et al. studied geometric processing technology for long focal oblique linear-array cameras (Mo Delin et al., 2018); Zhang Zhenghao et al. studied the geometric processing of area-array whiskbroom images obtained by the A3 aerial camera (Zhang Zhenghao et al., 2019); Zhu Shulong et al. studied the solution of exterior orientation elements for large oblique area-array CCD images (Zhu Shulong et al., 1999); Wang Junhua et al. analysed the high-altitude whiskbroom imaging process, imaging model, image deformation and its calculation for CCD cameras (Wang Junhua et al., 2007). However, the above research mainly concerned whiskbroom camera systems with a small whiskbroom angle and a short focal length, while there is little research on the oblique ULF-SVF-WC system, especially on real-time object positioning methods and accuracy analysis. Due to its complex imaging mode, image positioning and accuracy analysis for the oblique ULF-SVF-WC system are more difficult than for both nominally vertical aerial photography and conventional short-focal UAV aerial photography, and remain an unsolved problem. Therefore, the object positioning accuracy of images from the oblique ULF-SVF-WC system is studied in this paper. We discuss the imaging characteristics and imaging model of the oblique ULF-SVF-WC system, analyse the accuracy of single-image real-time positioning theoretically according to the error propagation law, and use a set of data to verify the theoretical analysis experimentally. The article is divided into four parts: the first part introduces the imaging characteristics of the oblique ULF-SVF-WC system and the construction of its strict imaging model; the second part presents the principle of real-time single-image object positioning for this kind of camera system and the theoretical accuracy analysis according to the error propagation law; the third part describes the experiment with real flight data used to verify the theoretical accuracy analysis; the fourth part summarizes the theoretical and practical accuracy of real-time single-ray back-projection for the oblique ULF-SVF-WC system.

Imaging Principle
In our research, the UAV is equipped with an ULF-SVF-WC system and a POS device. In the ULF-SVF-WC system, the camera swings around an axis and scans to one side, perpendicular to the flight direction, during imaging, while the POS data are recorded. The imaging mode is shown in Fig. 1. In a single whiskbroom cycle, the camera swings back after scanning and shooting in one direction, and the images form a stepped strip.
After a whiskbroom strip is imaged, the lens quickly swings back for the next whiskbroom cycle (the camera also images during the swing back), and the final images form an image area with small overlaps.
The images taken by the oblique ULF-SVF-WC system have the following characteristics: each image composing the observation area is very small; geometric distortion is large (due to the large inclination); the overlap is low; and the base-height ratio between image pairs is small. The parameters of the sensor system are as follows: the CCD pixel size is 5.5 μm; the format of a single image is 1920 × 1080 pixels, and the image size is 10.56 mm × 5.94 mm; the focal length is f = 1000 mm, the maximum whiskbroom angle can reach about 66°, and the flight altitude is about 4 km; the image overlap in the scanning direction is about 20% and in the flight direction (the direction perpendicular to the scanning direction) about 30%, both very small; the base-height ratio of image pairs is very small, less than 1/2000. In the process of small view field whiskbroom imaging, each small area array satisfies a collinearity equation, and the movement of the electro-optic aiming line (scanning mirror) can be described as a rotation between coordinate systems. Therefore, for each small area array, the position of the perspective centre obtained by GPS and the pitch, roll and yaw angles obtained by the Inertial Measurement Unit (IMU) of the POS system are combined with the roll angle α and the pitch angle β of the electro-optic aiming line to obtain the 6 exterior orientation elements (3 position elements and 3 attitude angles at the exposure time). These 6 exterior orientation elements establish the strict mathematical relationship between the image point (x, y, −f) and the corresponding ground point (X, Y, Z), which is the basis of the object positioning calculation.

Image Coordinate System 𝒊:
The origin is located at the upper-left corner of the image; the moving direction of the platform defines one axis and the camera whiskbroom direction the other. The two coordinates of an image point are obtained from its column number c and row number r.

Sensor Coordinate System 𝒄:
The origin is located at the perspective centre S; the $x_c$ axis points in the moving direction of the platform, the $y_c$ axis is the reverse of the scanning direction, and the $z_c$ axis is determined by the right-hand rule. It is similar to the image space coordinate system in photogrammetry.

Body Coordinate System 𝒃 :
The origin is located at the geometric centre of the IMU, and its coordinate axes coincide with those of the gyroscope. The body coordinate system usually takes one of two forms: front-right-down or right-front-up. We adopt the right-front-up form: the $y_b$ axis points in the moving direction, the $x_b$ axis is perpendicular to the moving direction and points to the right, and the $z_b$ axis is determined by the right-hand rule.

Navigation Coordinate System 𝒏 :
The origin is located at the centroid of the aircraft; the $x_n$ axis points east along the prime vertical circle of the reference ellipsoid, the $y_n$ axis points north along the meridian circle of the reference ellipsoid, and the $z_n$ axis points to the sky along the normal of the reference ellipsoid. It is also known as the east-north-up (ENU) coordinate system.

Earth-centred Earth-fixed Coordinates (ECEF) 𝒆 :
The ECEF coordinate system has two forms: the space rectangular coordinate system and the geodetic coordinate system. In the space rectangular coordinate system, the origin is located at the centroid of the earth, the $Z_e$ axis points to the North Pole, the $X_e$ axis points to the intersection of the mean Greenwich meridian plane with the conventional equatorial plane, and the $Y_e$ axis completes the right-handed system with $X_e$ and $Z_e$. In the geodetic coordinate system, the latitude B is the angle between the ellipsoidal normal and the ellipsoidal equatorial plane, the longitude L is the angle between the meridian plane through the point and the Greenwich meridian plane, and the geodetic height H is the distance from the ground point to the ellipsoidal surface along the ellipsoidal normal.
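The geodetic-to-space-rectangular conversion described above can be sketched in code. This is an illustrative implementation for the WGS84 ellipsoid; the function name and the use of WGS84 constants are our assumptions, not taken from the paper:

```python
import math

# WGS84 ellipsoid constants
A = 6378137.0            # semi-major axis (m)
F = 1.0 / 298.257223563  # flattening
E2 = F * (2.0 - F)       # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Convert geodetic coordinates (latitude B, longitude L, geodetic
    height H) to ECEF space-rectangular coordinates (X, Y, Z)."""
    b = math.radians(lat_deg)
    l = math.radians(lon_deg)
    # prime-vertical radius of curvature at latitude B
    n = A / math.sqrt(1.0 - E2 * math.sin(b) ** 2)
    x = (n + h) * math.cos(b) * math.cos(l)
    y = (n + h) * math.cos(b) * math.sin(l)
    z = (n * (1.0 - E2) + h) * math.sin(b)
    return x, y, z
```

At (B, L, H) = (0, 0, 0) this returns a point on the equator at one semi-major axis from the origin, which is a convenient sanity check.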

Tangent Plane Rectangular Coordinate System 𝒎:
The origin is located at the nadir point of the aircraft centre; the $z_m$ axis points outward along the normal of the ellipsoid; the $x_m$ axis lies in the geodetic meridian plane, orthogonal to the $z_m$ axis, and points north; the $x_m$, $y_m$ and $z_m$ axes constitute a right-handed coordinate system.

Gauss-Kruger Projection Coordinate System:
Generally, we take the Gauss-Kruger projection coordinate system (a 3° projection zone is used as the example in this paper) as the topographic survey coordinate system. The x axis is the central meridian of the 3° zone in which the ground objects are located, and its intersection with the equator is the origin.

The Strict Imaging Model
The key to building a strict imaging model based on coordinate transformation is to determine the rotation matrices, as shown in Fig. 2. The specific conversion steps are as follows:

Step 1: From Image Coordinate System to Image Space Coordinate System. The image coordinate of a point p is given by formula (1):

$x = (c - c_0)\,\mu, \qquad y = (r_0 - r)\,\mu$   (1)

where (c, r) = the image scanning coordinate of point p; $(c_0, r_0)$ = the scanning coordinate of the image centre; μ = the pixel size. The coordinate of point p in the image space coordinate system is (x, y, −f), where f is the principal distance of the camera.
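As a minimal illustration of Step 1, the mapping from scanning coordinates to image-space coordinates might look as follows; the sign convention for the y axis and all names are our assumptions:

```python
def scan_to_image_space(c, r, c0, r0, pixel, f):
    """Map scanning coordinates (column c, row r) to image-space
    coordinates (x, y, -f).  `pixel` is the physical pixel size and
    f the principal distance; the y axis is assumed to run opposite
    to the row index."""
    x = (c - c0) * pixel
    y = (r0 - r) * pixel
    return x, y, -f
```

For the sensor described above (1920 × 1080 pixels, 5.5 μm pixel size), the image centre maps to (0, 0, −f) as expected.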
Step 2: From Image Space Coordinate System to IMU Body Coordinate System. The rotation matrix relative to the IMU body coordinate system is determined by the roll angle α and the pitch angle β of each area array at the exposure time, as formulas (2) and (3), where $(x_p, y_p, -f)$ are the image space coordinates of point p, and P is the object point corresponding to image point p.
Step 3: From IMU Body Coordinate System to Navigation Coordinate System (ENU). By formula (4), the rotation matrix $R_{IMU}^{ENU}$ is calculated from the three angles obtained from the POS, namely the yaw angle ψ, the pitch angle θ and the roll angle γ:

$R_{IMU}^{ENU} =
\begin{bmatrix} \cos\psi & -\sin\psi & 0 \\ \sin\psi & \cos\psi & 0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta \\ 0 & \sin\theta & \cos\theta \end{bmatrix}
\begin{bmatrix} \cos\gamma & 0 & \sin\gamma \\ 0 & 1 & 0 \\ -\sin\gamma & 0 & \cos\gamma \end{bmatrix}$   (4)

and the location of point P in the navigation coordinate system then follows as formula (5).

Step 4: From Navigation Coordinate System (ENU) to ECEF Coordinate System. The rotation matrix is given by formula (6):

$R_{ENU}^{ECEF} =
\begin{bmatrix} -\sin L & -\sin B\cos L & \cos B\cos L \\ \cos L & -\sin B\sin L & \cos B\sin L \\ 0 & \cos B & \sin B \end{bmatrix}$   (6)

where L, B = the longitude and latitude of the current photographing station, obtained by the POS system.

Step 5: From ECEF Coordinate System to Tangent Plane Rectangular Coordinate System. The rotation matrix is formed as in formula (7), where $L_0$, $B_0$ = the longitude and latitude of the origin of the tangent plane rectangular coordinate system. Then the position of the ground point P in the tangent plane coordinate system is calculated as formula (8), where $X_S$, $Y_S$, $Z_S$ = the object coordinates of the camera station.
Step 6: From Local Tangent Plane Coordinate System to Measuring Coordinate System. In photogrammetry, direct object positioning is usually realized in the national surveying coordinate system, generally the Gauss-Kruger projection coordinate system. Therefore, it is also necessary to transform the tangent plane coordinate system into the 3° Gauss-Kruger projection coordinate system, in which the influence of the earth's curvature and the meridian deviation on the angles must be considered; that is, compensation matrices for the earth's curvature and the meridian deviation need to be applied to the original rotation matrix (Yuan Xiuxiao et al., 2011). The transformation of the rotation matrix is then extended as formula (9), where $X_A$, $Y_A$, $Z_A$ = the object coordinates of point A and μ = the scale factor. The strict imaging model can be expressed as the classical collinearity equation, formula (11):

$x - x_0 = -f\,\dfrac{a_1(X_A - X_S) + b_1(Y_A - Y_S) + c_1(Z_A - Z_S)}{a_3(X_A - X_S) + b_3(Y_A - Y_S) + c_3(Z_A - Z_S)}, \qquad
y - y_0 = -f\,\dfrac{a_2(X_A - X_S) + b_2(Y_A - Y_S) + c_2(Z_A - Z_S)}{a_3(X_A - X_S) + b_3(Y_A - Y_S) + c_3(Z_A - Z_S)}$   (11)

where $x_0$, $y_0$ = the offsets from the image centre to the principal point, and $a_i$, $b_i$, $c_i$ are the elements of the final compensated rotation matrix.
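The collinearity relation of formula (11) can be illustrated with a small forward-projection sketch. Here R is the final rotation matrix whose columns map the image-space axes into object space; the function name and interface are ours:

```python
import numpy as np

def project(ground, station, R, f, x0=0.0, y0=0.0):
    """Classical collinearity equation (formula (11)): project an
    object point onto the image, given the camera station position,
    the rotation matrix R and the principal distance f."""
    # vector from station to ground point, expressed in the camera frame
    d = R.T @ (np.asarray(ground, dtype=float) - np.asarray(station, dtype=float))
    x = x0 - f * d[0] / d[2]
    y = y0 - f * d[1] / d[2]
    return x, y
```

With R = I (a nadir-looking camera), a point 100 m off-nadir seen from 1000 m altitude lands at x = 0.1·f, which matches the ratio ground offset / flying height.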

ACCURACY ANALYSIS OF REAL-TIME OBJECT POSITIONING BASED ON SINGLE IMAGE
In aerial photogrammetry, real-time single-image direct object positioning is based on single-ray back-projection, which determines the position of ground object points using the 6 image exterior orientation elements obtained by the POS combined with DEM data. The error sources affecting the object positioning mainly include the camera station (perspective centre) error, the imaging attitude error, the image point coordinate measurement error, the focal length (principal distance) error, etc.

Principle of Single Image Positioning
Single-image positioning based on a DEM consists of two steps: firstly, the pose data obtained by the POS are combined with the whiskbroom angle of each image to determine the 3 position elements and the rotation matrix of each image (as in formula (9)); then, the coordinates (x, y) of an image point and the focal length f are combined with the DEM data to determine the coordinates (X, Y) of the ground point, as in formula (12):

$X = X_S + (Z - Z_S)\,\dfrac{a_1 x + a_2 y - a_3 f}{c_1 x + c_2 y - c_3 f}, \qquad
Y = Y_S + (Z - Z_S)\,\dfrac{b_1 x + b_2 y - b_3 f}{c_1 x + c_2 y - c_3 f}$   (12)

The process is a back-projection calculation, using the DEM data to iteratively determine the elevation value and the plane coordinates of the object point corresponding to each image point (Edward M., 2001).
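The iterative DEM-assisted back-projection described above can be sketched as follows; the convergence tolerance and the DEM interface (a callable returning terrain height at a plane position) are illustrative assumptions:

```python
import numpy as np

def back_project(x, y, f, station, R, dem, tol=0.01, max_iter=50):
    """Single-ray back-projection assisted by a DEM (the iteration
    behind formula (12)).  Starting from an initial height guess, the
    ray is intersected with the horizontal plane Z = const, the DEM is
    re-sampled at the resulting (X, Y), and the process repeats until
    the height converges."""
    xs, ys, zs = station
    d = R @ np.array([x, y, -f])      # ray direction in object space
    z = dem(xs, ys)                   # initial terrain-height guess
    for _ in range(max_iter):
        gx = xs + (z - zs) * d[0] / d[2]   # inverse collinearity, X
        gy = ys + (z - zs) * d[1] / d[2]   # inverse collinearity, Y
        z_new = dem(gx, gy)
        if abs(z_new - z) < tol:
            break
        z = z_new
    return gx, gy, z_new
```

Over flat terrain the loop converges in one step; over a terrain step it walks the intersection point until the sampled height and the ray agree.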

Theoretical Accuracy Analysis
All terms in the classical collinearity equation (formula (11)) are differentiated and rearranged to obtain formula (13). In formula (13), the calculation of the coefficients $a_{ij}$ can refer to (Zhang Zuxun, 1997). Then, letting $\Delta = a_{14}a_{25} - a_{15}a_{24}$, equation (13) can be further transformed into equation (14), whose coefficients are calculated accordingly. According to formula (14), the plane position precision (mean square error, MSE) of single-image object positioning can be calculated based on the error propagation law, as formula (15).
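Since formulas (13)-(15) are not reproduced here, the following is only a heavily simplified first-order sketch of how station, angle and DEM errors propagate into the scan and flight directions under the small-pitch assumption. It is not the paper's 12-coefficient model, but it reproduces the qualitative behaviour that the scan-direction MSE grows much faster with the whiskbroom angle:

```python
import math

def plane_mse(h, theta_deg, m_pos=6.0, m_ang=math.radians(0.01), m_dem=7.0):
    """Simplified error budget for single-ray positioning: the ray from
    flying height h at whiskbroom angle theta hits the ground at range
    Y = h*tan(theta).  First-order propagation of station error m_pos,
    angle error m_ang and DEM height error m_dem (all one-sigma)."""
    t = math.radians(theta_deg)
    # flight (X) direction: the lever arm is the slant range h/cos(theta)
    mx2 = m_pos ** 2 + (h / math.cos(t)) ** 2 * m_ang ** 2
    # scan (Y) direction: d(h*tan)/dtheta = h/cos^2(theta), and a DEM
    # error shifts the intersection by tan(theta) per metre of height
    my2 = (m_pos ** 2
           + (h / math.cos(t) ** 2) ** 2 * m_ang ** 2
           + (math.tan(t) * m_dem) ** 2)
    return math.sqrt(mx2), math.sqrt(my2)
```

With the Tab. 2 values (h = 4000 m, angle error 0.01°), this sketch already shows the scan-direction MSE exceeding twice the flight-direction MSE at a 65° whiskbroom angle, in line with the roughly threefold ratio reported in Tab. 3.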
Assuming that the UAV takes photos under the ideal condition of a small pitch angle, we define the ground auxiliary coordinate system as follows: the flight direction is the X axis, and the inverse of the whiskbroom direction is the Y axis (as shown in Fig. 1); that is, the attitude angles satisfy φ ≈ 0 and ω ≈ 0. On this basis, we analyse the positioning errors theoretically under different whiskbroom angles.
The measurement error of the perspective centre is 6 m (the POS uses non-differential GPS positions in real time); the measurement errors of the IMU are: heading angle 0.02°, pitch angle 0.01° and roll angle 0.01°. The global ASTER GDEM data at 1:50000 scale are used in the single-ray back-projection; they adopt the WGS84 coordinate system, with a grid interval of 1″ and an elevation accuracy of 7 m. The image point measurement error is generally 0.5 pixel, and the focal length error is 1 pixel. According to the oblique ULF-SVF-WC system design indexes given in Section 2.1, the MSEs of the relevant parameters are shown in Tab. 2.

The positioning MSEs at different flight heights and different whiskbroom angles are calculated from the values in Tab. 2 according to formula (15).
Tab. 3 shows the positioning errors at 4000 m altitude and different whiskbroom angles, where $m_Y$ is the positioning MSE in the scanning direction, $m_X$ is the MSE in the flight direction, and $m_{XY}$ is the plane MSE; the comparison of the values at different whiskbroom angles is shown in Fig. 3, and Fig. 4 compares the plane MSE at different altitudes under various whiskbroom angles. From Tab. 3 and Fig. 3 we find that: (1) under the ideal condition of small pitch, the whiskbroom angle has a greater impact on the positioning accuracy in the Y direction (whiskbroom direction) than in the X direction (flight direction) in single-image positioning. For example, when the whiskbroom angle is 65°, the MSE in the Y direction is about 3 times larger than that in the X direction. Moreover, with increasing whiskbroom angle, the error grows much faster in the Y direction than in the X direction (as the comparison of the green and red lines in Fig. 3 shows). The reason is that, when imaging with a large whiskbroom angle, the intersection angle between the ray and the ground is so small that it deviates far from the optimal intersection angle, which greatly amplifies the measurement errors of the image points and whiskbroom angles. This impact is reflected in the coefficients used to calculate $m_Y$ in formula (15), which increase sharply with the whiskbroom angle. For example, when the whiskbroom angle is 80°, 9 of the 12 coefficients for calculating $m_Y$ are much larger than those for calculating $m_X$, 2 of them at least 100 times larger, and the other 3 are equivalent, so even very small errors in the whiskbroom direction are sharply amplified.
(2) In real flight, the pitch and yaw angles are not small, which may produce a greater plane positioning error than the theoretical calculation. Moreover, with increasing whiskbroom angle the image deformation becomes much larger, making image point interpretation and measurement difficult; the image point error can then reach 1-2 pixels, further lowering the positioning accuracy. Fig. 4 shows that, leaving aside the growth of the image point measurement error with altitude, the accuracy of real-time single-image positioning is mainly governed by the whiskbroom angle, while the impact of altitude changes on positioning is relatively small; the blue, green and red lines show that the positioning errors at different flight heights are almost identical when the whiskbroom angle is below 60°. However, as the whiskbroom angle increases, the larger the flight height at the same angle, the greater the positioning error and the worse the accuracy; this is most obvious once the angle exceeds 60°. It should be noted that additional error sources are not considered in the above analysis; the actual positioning accuracy will therefore be worse than the theoretical one.

EXPERIMENT VALIDATION AND RESULTS
To verify the theoretical analysis of the real-time single-image direct positioning accuracy of oblique ULF-SVF-WC images, we conducted experiments with real oblique ULF-SVF-WC images.

Materials and Experimental Area
The experimental area is located in Weinan City, Shaanxi Province, China. During imaging, the UAV flew from northeast to southwest and obtained 49 images in 7 whiskbroom strips in total, at a flight height of 4238-4240 m and whiskbroom angles of 62°-65.5°. The mosaic image of the experimental area and the corresponding field image on Google Map are shown in Fig. 5.

Results Analysis
We take ASTER GDEM data with a spatial resolution of 20 m as auxiliary data in the single-ray back-projection positioning calculation.

Actual Accuracy Analysis of Object Positioning:
Due to the lack of GCPs, six points collected from Google Map are used as check points in this paper, as shown in Fig. 6. The actual coordinates of the 6 check points from Google Map are converted into the ground auxiliary coordinate system and taken as the true values to calculate the positioning error and the actual accuracy of single-image positioning. The positioning errors are given in Tab. 4, and Fig. 7 shows the error distribution of the ground points. From Tab. 4 and Fig. 7 we find: (1) the single-image positioning error in the scanning direction is nearly three times that in the flight direction, which is close to the distribution in Tab. 3 for whiskbroom angles of 60°-65°; this verifies that the accuracy in the flight direction is higher than that in the whiskbroom direction in real-time single-ray back-projection positioning. (2) The actual positioning error is much larger than the theoretical results in Tab. 3, and there is an obvious systematic deviation in it, which means that there is an uncompensated systematic error in the exterior orientation elements calculated from the POS data, with a significant impact on the experimental results. Therefore, the elimination of this systematic error must be considered in real-time single-ray back-projection direct positioning.

Relative Accuracy Analysis of Positioning:
We also calculate the reprojection error to analyse the accuracy of single-ray back-projection positioning. Firstly, we obtain several corresponding points in the overlapping area of two adjacent images. We calculate the object position $(X_l, Y_l)$ by single-ray back-projection based on the left image and $(X_r, Y_r)$ by the same method based on the right image, and take the mean of the two results as the final position $(X_m, Y_m)$. Then, we calculate the differences $\Delta X = X_l - X_r$ and $\Delta Y = Y_l - Y_r$ and compute the MSE of $\Delta X$ ($\Delta Y$) over all points as $m_X$ ($m_Y$). Thirdly, we use $(X_m, Y_m)$ to calculate the reprojection coordinates $(x_l, y_l)$ on the left image and $(x_r, y_r)$ on the right image. Finally, we calculate the reprojection errors and compute the MSE of $\Delta x$ ($\Delta y$) over all points as $m_x$ ($m_y$).
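The first part of this procedure (the differences and their MSE) can be sketched as follows; the back-projected positions are assumed to have been computed already, and the function simply aggregates them:

```python
import math

def relative_accuracy(left_pts, right_pts):
    """Relative-accuracy statistics for single-ray back-projection:
    for each corresponding point positioned as (Xl, Yl) from the left
    image and (Xr, Yr) from the right image, take the mean as the
    final position and report the MSE of the differences dX and dY."""
    n = len(left_pts)
    finals, dxs, dys = [], [], []
    for (xl, yl), (xr, yr) in zip(left_pts, right_pts):
        finals.append(((xl + xr) / 2.0, (yl + yr) / 2.0))
        dxs.append(xl - xr)
        dys.append(yl - yr)
    m_x = math.sqrt(sum(d * d for d in dxs) / n)
    m_y = math.sqrt(sum(d * d for d in dys) / n)
    return finals, m_x, m_y
```

The same aggregation, applied to the image-space residuals after reprojecting the final positions, yields the reprojection MSEs $m_x$ and $m_y$.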
Two cases are considered in the experiments: adjacent images in the same scanning strip, and adjacent images in different scanning strips; we calculated $m_X$ ($m_Y$) and $m_x$ ($m_y$) for 1150 points, as shown in Tab. 5. From Tab. 5, the differences between the single-ray back-projection positioning results of two images in the same scanning strip are at the metre level, with a minimum of 0.351 m, and the reprojection error reaches dozens of pixels; the differences between images in different scanning strips are at the tens-of-metres level, with a minimum of 11.445 m, and the reprojection error exceeds 100 pixels. These results show that: (1) due to the particularity of oblique ULF-SVF-WC imaging, the consistency of the exterior orientation elements from the POS data within the same scanning strip is relatively good and the positioning error is relatively small, while the consistency across different scanning strips is relatively poor and the positioning error is large; (2) across different whiskbroom strips, the exterior orientation elements obtained by the POS show significant changes in systematic error, which negatively affect single-ray back-projection positioning. Therefore, it is necessary to eliminate this systematic error variation.

CONCLUSION
By analysing the imaging principle and characteristics of the ULF-SVF-WC system, this paper established a strict imaging model of the oblique ULF-SVF-WC image, which is used for single-ray back-projection object positioning. Based on the imaging mode and single-ray back-projection, the error sources, their impacts on real-time single-ray back-projection positioning, and the accuracy of the positioning were analysed theoretically, while the actual accuracy of single-image positioning was analysed through experiments.
Theoretical analysis shows that: (1) considering the error of the exterior orientation elements obtained directly from the POS data, the real-time single-ray back-projection accuracy can reach 22 m supported by DEM data when the whiskbroom angle is less than 65°; (2) the accuracy of single-ray back-projection positioning decreases as the whiskbroom angle increases, and decreases rapidly once the angle exceeds 65°; an increase in flight height also reduces the positioning accuracy, but its impact is far smaller than that of an increased whiskbroom angle; (3) the positioning accuracy in the whiskbroom direction is always much lower than that in the flight direction. Therefore, the oblique ULF-SVF-WC system should avoid too large a whiskbroom angle in imaging, preferably no more than 65°.
The experimental results from real flight data also verify points (2) and (3) above, which means the imaging model and the error analysis method are reasonable. Meanwhile, the results show that the actual accuracy is lower than that of the theoretical analysis (the actual residuals in both the flight and whiskbroom directions are more than three times the MSE of the theoretical analysis). There are also significant systematic errors in the positioning results, indicating that the exterior orientation elements from the POS contain an obvious systematic error, whose impact on the positioning result in the whiskbroom direction is much larger than that in the flight direction. Therefore, this effect should be eliminated in further research.
The research results of this paper are valuable for the pre-evaluation of the single-image positioning accuracy of the oblique ULF-SVF-WC system and establish a theoretical foundation for the geometric processing of images from this kind of camera. In the future, we will focus on the influence of the systematic error of this kind of system and study the construction of a systematic error compensation model, to improve the accuracy of single-image real-time direct positioning.

Figure 2. Process of solving the rotation matrix.
where $\lambda_c$, $B_c$ = the longitude and latitude of the central point of the experimental area; $R(\lambda_c, B_c)$ = the compensation matrix for the earth's-curvature deviation; $R_{mer}$ = the compensation matrix for the meridian deviation; and the compensated rotation matrix is $R_{comp} = R(\lambda_c, B_c)\cdot R_{mer}\cdot R$. The final strict imaging equation is formula (10).

Figure 4. The plane positioning MSE at different flight heights.
Figure 5. Experimental area: (a) experimental image; (b) satellite image on Google Map.

Table 1. Comparison of the performance parameters of several typical aerial cameras.

Table 2. The MSE of the basic parameters.

Table 3. The positioning MSE at different flight heights and whiskbroom angles.

Table 5. Relative accuracy of positioning.