Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XL-3/W1, 33-37, 2014
http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-3-W1/33/2014/
doi:10.5194/isprsarchives-XL-3-W1-33-2014
© Author(s) 2014. This work is distributed under the Creative Commons Attribution 3.0 License.
 
Published: 05 Mar 2014
POSE VERSUS STATE: ARE SENSOR POSITION AND ATTITUDE SUFFICIENT FOR MODERN PHOTOGRAMMETRY AND REMOTE SENSING?
I. Colomina¹ and M. Blázquez²
¹Centre Tecnològic de Telecomunicacions de Catalunya, Av. Carl Friedrich Gauss 7, Parc Mediterrani de la Tecnologia, Castelldefels, Spain
²GeoNumerics, Josep Pla 82, Barcelona, Spain
Keywords: Sensor state, sensor orientation, sensor calibration, spatio-temporal sensor calibration, image deblurring, focal-plane shutter camera models

Abstract. We investigate the advantages of using what we call sensor state parameters, or sensor state, to describe the geometrical relationship between a sensor and 3D space or 4D time-space; the sensor state extends the traditional image pose or orientation (position and attitude) to the image state. In our concept, at some point in time t, the sensor state is a 12-dimensional vector composed of four 3-dimensional subvectors p(t), v(t), γ(t) and ω(t). Roughly speaking, p(t) is the sensor's position, v(t) its linear velocity, γ(t) its attitude and ω(t) its angular velocity. The state concept thus extends the pose and orientation concepts and attempts to describe both a sensor's statics (p(t), γ(t)) and its dynamics (v(t), ω(t)). Note that if p(t) and γ(t) are known for all t within some time interval of interest, then v(t) and ω(t) can be derived from them.
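To make that last remark concrete, here is a minimal numerical sketch (ours, not taken from the paper; the helper name state_from_pose is hypothetical) that derives v(t) by differentiating a sampled position trajectory p(t), and ω(t) from the skew-symmetric matrix Rᵀ(t) Ṙ(t), assuming the attitude γ(t) is given as body-to-world rotation matrices R(t) at discrete epochs:

```python
import numpy as np

def state_from_pose(t, p, R):
    """Derive the dynamic half of the 12-dimensional sensor state
    (v, omega) from a sampled pose trajectory: positions p[k] and
    body-to-world rotation matrices R[k] given at times t[k]."""
    t = np.asarray(t, dtype=float)   # (N,)
    p = np.asarray(p, dtype=float)   # (N, 3)
    R = np.asarray(R, dtype=float)   # (N, 3, 3)

    # Linear velocity: numerical time derivative of position.
    v = np.gradient(p, t, axis=0)

    # Angular velocity: S = R^T dR/dt is (ideally) skew-symmetric and
    # carries the body-frame angular rates in its off-diagonal entries.
    dR = np.gradient(R, t, axis=0)
    omega = np.empty_like(p)
    for k in range(len(t)):
        S = R[k].T @ dR[k]
        S = 0.5 * (S - S.T)          # enforce skew symmetry numerically
        omega[k] = np.array([S[2, 1], S[0, 2], S[1, 0]])

    return v, omega
```

Finite differences like these amplify noise in p(t) and γ(t), which is one reason the paper's inertial-measurement-based methods are attractive when an IMU is available.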

We present three methods to compute the state parameters, two for the continuous case and one for the discrete case. The first two methods rely on the availability of inertial measurements and their derived time-Position-Velocity-Attitude (tPVA) trajectories. The first method extends the INS mechanization equations, and the second derives the IMU angular velocities from the output of the INS mechanization equations. The third method derives linear and angular velocities from relative orientation parameters. We illustrate how sensor states can be applied to solve practical problems with three selected cases: multi-sensor synchronization calibration, correction of translational and rotational image motion blur, and accurate orientation of images acquired with focal-plane shutters.
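As an illustration of the third application only (a hypothetical sketch under our own assumptions, not the paper's implementation; the helper name row_pose is invented), a 12-dimensional state at the mid-exposure epoch can supply a per-row pose for a focal-plane (rolling) shutter image by first-order propagation, position with v and attitude with ω:

```python
import numpy as np

def row_pose(p, v, R, omega, row, n_rows, t_readout):
    """Predict the pose of one image row of a focal-plane shutter
    camera from the sensor state at the mid-exposure epoch.

    p, v      position and linear velocity (mapping frame)
    R, omega  body-to-world attitude matrix and body-frame angular rate
    t_readout total shutter readout time; dt is this row's offset
              from the image centre."""
    dt = (row / (n_rows - 1) - 0.5) * t_readout

    # Constant-velocity position propagation.
    p_row = p + v * dt

    # Constant-rate attitude propagation: rotate R by the small
    # body-frame angle omega * dt (Rodrigues' rotation formula).
    theta = np.asarray(omega, dtype=float) * dt
    angle = np.linalg.norm(theta)
    if angle < 1e-12:
        return p_row, R
    k = theta / angle
    K = np.array([[0.0, -k[2],  k[1]],
                  [k[2],  0.0, -k[0]],
                  [-k[1], k[0],  0.0]])
    dR = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)
    return p_row, R @ dR
```

A first-order model of this kind is only adequate when the readout time is short relative to the platform dynamics; for longer readouts the trajectory itself would have to be interpolated.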

Citation: Colomina, I. and Blázquez, M.: POSE VERSUS STATE: ARE SENSOR POSITION AND ATTITUDE SUFFICIENT FOR MODERN PHOTOGRAMMETRY AND REMOTE SENSING?, Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XL-3/W1, 33-37, doi:10.5194/isprsarchives-XL-3-W1-33-2014, 2014.
