ON THE DESIGN OF HIGH RESOLUTION IMAGING SYSTEMS

ABSTRACT: High resolution digital cameras for airborne and spaceborne remote sensing are today well established in the market for digital data. Hi-Res here means either spatially or spectrally high-resolution systems. Spectral high-resolution systems have about 100 or more spectral channels with a bandwidth of a few nanometers in the visible spectral range (analog in the UV and IR). High spatial resolution systems on aircraft platforms have a GSD of a few centimeters and, from space platforms, of less than 0.5 meters. The image size (number of pixels) perpendicular to flight direction is much larger than 20k pixels. The design of high-resolution systems is always a trade-off between many parameters. Technological parameters of the imaging system, e.g. the diameter of the optics, mass and power, as well as storage and data transfer, have a direct impact on spacecraft size and design.


INTRODUCTION
High resolution digital imaging systems for airborne and spaceborne remote sensing are today well established in the market for digital data. Hi-Res means either spatial or spectral high-resolution systems. Spectral high-resolution systems have about 100 or more spectral channels with a bandwidth of a few nanometers in the electromagnetic spectrum from UV to IR. High spatial resolution systems on aircraft platforms have a GSD of a few centimeters and, from space platforms, of better than 0.5 meters. The image size (number of pixels) in or across the flight direction is typically much larger than 20k pixels. The design of high-resolution systems is always a trade-off between many parameters. Technological parameters of the imaging system, e.g. the diameter of the optics, mass and power, as well as data storage and down-link capability, have a direct impact on spacecraft size and design.
The performance of an imaging system is related to the following parameters:
• Signal-to-noise ratio (SNR)
• Modulation transfer function (MTF) in different directions and at different places within the image
A balance of the modulation transfer function (MTF) and the signal-to-noise ratio (SNR) is a compromise of the system architecture to optimize cost, complexity and risk with regard to the requirements of end users. Several parameters influence the system design and are defined by the customer.
Orbit and pointing. Orbit shape and altitude in relation to revisit times. A typical requirement is pointing by roll rotation over an angle of at least ±40°.
Camera instrument concept. Pushbroom scanning or a matrix camera is possible. The panchromatic band has a defined GSD_PAN (e.g. 0.5 m) and some multispectral bands in the visible and near infrared spectrum have resolutions of 4 × GSD_PAN.
Optics design. The main requirements concern MTF and distortions as well as aberrations.
Dynamic range. The detector dynamic range is at minimum 12 bits. The system has to provide sufficient sensitivity and dynamic range for global imaging of the earth, from snow and deep clouds to details in shadowed city streets.

MTF. The system level Modulation Transfer Function (MTF) at the Nyquist frequency is better than 10% (including the optoelectronic system and spacecraft contributions, excluding ground processing).
Signal to Noise Ratio. SNR is better than 100 for a surface reflectance of 20%, for all bands under typical illumination conditions.
This paper gives an overview of the consequences for other design parameters (e.g. aperture and pixel size) and of how to balance radiometric and geometric properties.

BASIC PARAMETERS
For the characterization of image data or derived products, the spatial resolution (GSD) is an important parameter, which is related to the pixel size, focal length, altitude and operations (deflection of the satellite). However, there are other factors of the imaging system which affect the image quality. The key parameters are the Point Spread Function (PSF) or Modulation Transfer Function (MTF), and the signal-to-noise ratio (SNR) of the image. They are closely related to subjective parameters such as sharpness and are referred to as performance measures. Metrics to describe the image quality can be derived for each imaging system. They allow a quantitative characterization of the system in terms of image sharpness and additional image properties, such as noise, nonlinearities and artifacts after compression or fusion (e.g. PAN sharpening). In general, the image quality ultimately depends on the problem to be solved and cannot be defined universally for every image. For example, for the measurement of star positions with a star tracker, the accuracy of the derived position is better if the image is defocused. In contrast, for an object recognition task, the image should be as sharp as possible. If a specific task such as object recognition or detection in a certain area (for example the detection of an agricultural area) is to be solved with a satellite-based sensor, the sensor is optimized to resolve these objects. Resolution requirements lead to sensor parameters such as focal length, aperture and so on. Under these conditions, an ideal sensor can be defined and the optimal results and best image quality can be simulated. Such simulation systems have been developed (see Börner et al., 2001).

PSF
We assume a linear imaging system with zero-mean white Gaussian random noise that is additive and uncorrelated to the signal. Consider the 2D problem. The PSF can be calculated from the input signal U(x, y) and the measurement V(x, y). If we assume a translation invariant PSF H(x, y), the following equation is valid:

V(x, y) = ∫∫ H(x − x′, y − y′) · U(x′, y′) dx′ dy′ + N(x, y)

with N(x, y) the additive noise term.
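The linear, translation-invariant model above can be sketched numerically: the measurement is the convolution of the scene with the PSF plus additive zero-mean Gaussian noise. The Gaussian PSF shape, its width and the noise level below are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(0)

# Input scene U(x, y): a single point source on a dark background
U = np.zeros((65, 65))
U[32, 32] = 1000.0

# Translation-invariant PSF H(x, y): an illustrative Gaussian blur
x = np.arange(-8, 9)
X, Y = np.meshgrid(x, x)
H = np.exp(-(X**2 + Y**2) / (2 * 1.5**2))
H /= H.sum()  # normalize to unit volume

# Measurement V = H * U + N with zero-mean white Gaussian noise
sigma_noise = 2.0
V = fftconvolve(U, H, mode="same") + rng.normal(0.0, sigma_noise, U.shape)
```

For a point-source input, V directly samples the (noisy) PSF, which is the basis of PSF estimation from point targets.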

SNR
Signal-to-noise ratio compares the level of a desired signal to the level of background noise. It is defined as the ratio of signal power to noise power. The noise consists of the Poisson distributed photon noise and dark noise. The third component is the Gauss distributed read noise. Read noise is defined as the temporal system noise of the detector in darkness, including the front end electronics.
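A minimal sketch of this noise model, combining Poisson photon and dark noise (variance equal to the mean) with Gaussian read noise; the electron counts used in the example are illustrative assumptions:

```python
import math

def snr(n_signal_el, n_dark_el, sigma_read_el):
    """SNR for one pixel, all quantities in electrons.

    Photon and dark noise are Poisson distributed (variance = mean),
    read noise is Gaussian with variance sigma_read_el**2.
    """
    total_noise = math.sqrt(n_signal_el + n_dark_el + sigma_read_el**2)
    return n_signal_el / total_noise

# Illustrative numbers (assumptions, not from the paper):
# 12000 signal electrons, 50 dark electrons, 20 e- read noise
value = snr(12000.0, 50.0, 20.0)
```

With these numbers the SNR comfortably exceeds the requirement of 100 stated above.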

NIIRS
The National Imagery Interpretability Rating Scale (NIIRS) is a subjective scale used for rating the quality of imagery acquired from various types of imaging systems. It is related to object detection; in particular, the scale is a 10-level scale, 0 to 9, defined by interpretation tasks or criteria. The General Image Quality Equation (GIQE) (Leachtenauer et al., 1997) is a heuristic approach for quantifying NIIRS and was developed to provide such predictions. The derivation of the fundamental parameters (RER and SNR) is explained in (Reulke et al., 2015). Gutchess et al. (2011) present an image quality metric and prediction model for SAR imagery. They derived two separate equations as representations of the GIQE, one to predict radar NIIRS and one to predict Automated Target Detection (ATD) performance.
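As a sketch of how GIQE relates sensor parameters to NIIRS, the following implements the GIQE version 4 form from Leachtenauer et al. (1997); the coefficient values are quoted from that reference, and the input values in the test case (GSD, RER, overshoot, noise gain) are illustrative assumptions:

```python
import math

def giqe4_niirs(gsd_in, rer, h_overshoot, g_gain, snr):
    """GIQE v4 (Leachtenauer et al., 1997).

    gsd_in: ground sample distance in inches; rer: relative edge response;
    h_overshoot: edge overshoot; g_gain: post-processing noise gain.
    """
    if rer >= 0.9:
        a, b = 3.32, 1.559
    else:
        a, b = 3.16, 2.817
    return (10.251 - a * math.log10(gsd_in) + b * math.log10(rer)
            - 0.656 * h_overshoot - 0.344 * g_gain / snr)

# Illustrative case: 0.25 m GSD (about 9.84 in), RER 0.9, SNR 100
niirs = giqe4_niirs(9.84, 0.9, 1.0, 1.0, 100.0)
```

For the 0.25 m reference case this yields a NIIRS prediction slightly above 6.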

REFERENCE SYSTEM
In the following, we define a reference system (HRS) that implements 0.25 m resolution from a 520 km orbit. For comparison, we add WorldView. The WorldView satellites are earth observation satellites operating at altitudes between 500 km and 770 km. The WorldView telescope has a 110 cm aperture.
The F-number (f#) describes the ratio between the focal length f of an optical system and the diameter D of the entrance pupil: f# = f/D.
An acceptable f# can be determined from the smearing of a perfect diffraction limited optical system. The detector cut-off frequency of the MTF occurs at ν_d = 1/δ, where ν_d is the spatial frequency in [cycles/mm] in the image plane and δ the pixel size. The detector MTF is

MTF_det(ν) = |sin(π·ν·δ) / (π·ν·δ)|

with a Nyquist frequency of ν_Ny = 1/(2·δ). The MTF of a perfect diffraction limited optical system is, for ν ≤ ν_c,

MTF_opt(ν) = (2/π) · [arccos(ν/ν_c) − (ν/ν_c) · √(1 − (ν/ν_c)²)].
The optical cut-off frequency is ν_c = 1/(λ·f#), with f the focal length, D the diameter of the entrance pupil and λ the wavelength. The Point Spread Function of the diffraction limited system is the Airy pattern; with r² = x² + y², its first zero is at (π·D/(λ·f)) · r ≈ 3.83, and the radius of the diffraction disc (Airy disc) is therefore r_Airy ≈ 1.22·λ·f/D = 1.22·λ·f#. A well-sampled, or diffraction-limited, system is obtained when the optical cut-off frequency is in the range of the detector Nyquist frequency. This case corresponds to λ·f# ≈ 2·δ. If the optical cut-off frequency is much larger, aliasing occurs.
If the optical cut-off frequency is much lower than the Nyquist frequency, the image quality is much worse because smearing occurs. The diffraction pattern of a point source appears as a bright central disc surrounded by bright and dark rings. The central disc (Airy disc) contains 84% of the total flux. For a diffraction-limited system, the image resolution of small targets depends on the sampling ratio Q. Q is defined as the detector size relative to the size of the Airy disc.
Figure 1 shows the situation in Fourier space. The red line is the diffraction limited MTF. In the case of Q = 1.7, aliasing can occur (the optics MTF at the Nyquist frequency is much larger than the pixel MTF). In the case of Q = 2.3, the cut-off frequency of the optical system is equivalent to the detector cut-off frequency. It seems that an optimal case lies in between. For stronger radiometric requirements, Q should be below 1. Using the described conditions, we can derive the sensor parameters.
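The quantities above can be evaluated for a concrete sensor. Note that Q is computed here with the common definition Q = λ·f#/δ (an assumption; the paper's Q values may use a different normalization), and the wavelength, pixel pitch and f-number are illustrative:

```python
# Diffraction/sampling trade for an illustrative sensor.
lam = 550e-9        # wavelength [m], mid-visible (assumption)
pitch = 7.0e-6      # detector pixel pitch delta [m] (assumption)
f_number = 12.0     # f-number (assumption)

nu_det = 1.0 / pitch                 # detector cut-off [cycles/m]
nu_nyq = 1.0 / (2.0 * pitch)         # Nyquist frequency nu_Ny
nu_opt = 1.0 / (lam * f_number)      # optical cut-off nu_c = 1/(lambda*f#)
r_airy = 1.22 * lam * f_number       # Airy disc radius in the image plane
Q = lam * f_number / pitch           # sampling ratio (common definition)
```

Here ν_c exceeds ν_Ny and Q is below 1, i.e. the radiometrically favorable, aliasing-prone regime described above.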
The photons generate electron/hole pairs in the semiconductor with a probability η_qu,λ. This is the prerequisite for the electrons stored in the read-out register. The electron number generated in a pixel is n_el; ξ_el is the noise of the signal electrons and ξ_D,el the noise of the dark signal electrons, which cannot be avoided even in the ideal case. Because of the Poisson distributed photon or electron (shot) noise, the variance (ξ_el)² is equal to the averaged electron number:

(ξ_el)² = ⟨n_el⟩.
The quantum efficiency η_qu,λ is the probability that charge carriers are generated under the condition that a photon arrives on the pixel surface. n_D,el is the mean electron number generated in a pixel by dark current and environment (e.g. radiation of instrument parts in the infrared). The stored electrons generate a voltage at the floating diffusion:

v = η_V · (n_el + ξ_el + n_D,el + ξ_D,el) + ξ_k

where η_V is the voltage generated by each electron. This characteristic includes the three noise components: photon noise, dark current noise, and the read or read-out noise ξ_k. The latter comprises noise sources related to the sensor read-out and amplifier circuits and can be described by a normal distribution with variance σ_k² = (ξ_k)². As mentioned above, the photon noise and dark current noise are Poisson distributed. The average voltage is ⟨v⟩ = η_V · (⟨n_el⟩ + n_D,el). The voltage is proportional to the digital gray value, the measured signal s, after A/D conversion.
The ADU (quantization) noise ξ_ADU will be neglected in this presentation. The average signal is

⟨s⟩ = η_DV · ⟨v⟩ = η_DV · η_V · (⟨n_el⟩ + n_D,el)

with η_DV the conversion factor of the A/D converter. In this (linear) signal model the total variance σ_s² of the digital signal s is given, according to the propagation of uncertainty (or propagation of error), by

σ_s² = η_DV² · η_V² · (⟨n_el⟩ + n_D,el) + η_DV² · σ_k²

which can be written in the following form:

σ_s² = G_s · ⟨s⟩ + η_DV² · σ_k².

This linear equation is a relation between the variance of the measured noise and the averaged signal. The slope is the system gain G_s = η_DV · η_V and the offset is the read noise term η_DV² · σ_k². Both values are spectrally independent. The equation is the basis of the photon transfer curve (PTC) method and can be used for characterization of the sensor. For an explanation see (Janesick, 2007).
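The photon transfer relation can be verified with a short simulation: synthesize signals for several illumination levels, plot variance against mean, and recover the system gain and read-noise offset from a linear fit. The gain and read noise values below are assumptions chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed sensor parameters (illustrative, not from the paper)
gain = 0.5         # system gain G_s in DN per electron
sigma_read = 10.0  # read noise in electrons

means, variances = [], []
for n_el in [200, 1000, 5000, 20000]:
    # Poisson shot noise plus Gaussian read noise, converted to DN
    electrons = rng.poisson(n_el, 100_000) + rng.normal(0.0, sigma_read, 100_000)
    s = gain * electrons
    means.append(s.mean())
    variances.append(s.var())

# Linear fit of variance vs. mean: slope ~ gain, offset ~ gain^2 * sigma_read^2
slope, offset = np.polyfit(means, variances, 1)
```

The fit recovers the slope G_s = 0.5 DN/e- and the offset 0.5² · 10² = 25 DN², exactly as the linear PTC equation predicts.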

Electronics and Bandwidth
In the case of high-resolution systems, the electronic design and power are driven for NMOS TDI CCD designs by the power consumption, and for CMOS TDI CCD designs by the data rate only. Typical pixel numbers across the scanning direction are between 50 and 100 kPixel. The spatial resolution, as presented in the chapters before, can go up to 10 cm GSD with a reasonable effort. An orbit of 500 km results in a ground speed of 7047.3 m/s. For the 10 cm GSD application the synchronous TDI line frequency will be 141 kHz. In the case of a 50 kPixel swath this results in a data rate of 7.05 GPixel/s, and for 100 kPixel the data rate will be 14.1 GPixel/s. This holds only if the application does not need additional spectral bands, which also have to be taken into account in the data rate budget. In the case of 8 spectral bands and a factor of four in the GSD, the data rate is equivalent to that of the panchromatic channel. Therefore a maximum data rate of 2 × 14 GPix × 12 bit = 336 Gbit/s can be reached. That has a couple of consequences for the satellite design. First, a duty cycle of 10% requires 176 Tbit of storage capacity for one orbit. Typically, storage for 6 orbits shall be foreseen. This will define a new quality of mass storage device with 1 Pbit on board. The second consequence is that only fast fiber interfaces are able to link the data between the detector and the data-handling unit.
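The data-rate budget above can be reproduced with a short script. The orbital period used for the storage estimate is an assumption (~94.6 min for a 500 km orbit), which is why the resulting storage figure lands near, but not exactly at, the 176 Tbit quoted above:

```python
# Back-of-envelope data-rate budget from the figures in the text.
line_freq = 141e3          # TDI line frequency [Hz] for the 10 cm GSD case
swath_pixels = 100e3       # pixels across track
bits_per_pixel = 12

pan_rate = swath_pixels * line_freq                # PAN pixel rate [pixel/s]
total_rate_bits = 2 * pan_rate * bits_per_pixel    # PAN + equivalent MS load [bit/s]

orbit_s = 5676             # ~94.6 min orbital period at 500 km (assumption)
duty = 0.10                # 10% imaging duty cycle
storage_bits = total_rate_bits * orbit_s * duty    # per-orbit storage need
```

This reproduces the ~336 Gbit/s peak rate and a per-orbit storage need on the order of 2e14 bit, i.e. hundreds of Tbit.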

Satellite platform
The satellite platform shall be able to compensate the yaw drift of the satellite so that the TDI technology can work properly. The main influence on the platform design comes from the allowed (low frequency) jitter, and this of course is directly related to the detector architecture. The allowed error is a third of a pixel within the whole TDI scanning line, in the case of 0.05 m GSD a third of a pixel over 256 TDI stages. As a result, the jitter requirement is 0.004 arcsec/ms. If the detector is split for digital readout, combining analog and digital TDI scanning, the jitter requirement relaxes to 0.03 arcsec/ms. This is based on the assumption that 32 TDI stages are used for analog scanning. By combining 8 of these analog TDI blocks, the SNR enhancement of the square root of 256 is the same as for the single line scan, but the requirement for the jitter is one order of magnitude more relaxed.
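The 0.004 arcsec/ms figure can be reconstructed from the numbers in the text (a sketch, assuming 0.05 m GSD from a 500 km orbit and the 141 kHz line frequency from the previous section):

```python
import math

# Jitter requirement: a third of a pixel of drift over the full TDI dwell.
gsd = 0.05            # m (the 0.05 m case from the text)
altitude = 500e3      # m
line_freq = 141e3     # Hz (TDI line frequency, from the data-rate section)
tdi_stages = 256

pixel_arcsec = math.degrees(gsd / altitude) * 3600   # angular pixel size [arcsec]
dwell_ms = tdi_stages / line_freq * 1e3              # TDI integration time [ms]
jitter_req = (pixel_arcsec / 3) / dwell_ms           # allowed drift [arcsec/ms]
```

The result, about 0.0038 arcsec/ms, matches the 0.004 arcsec/ms requirement stated above.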

Downlink and On-Board Processing
The main challenge of a high-end satellite is the data downlink. Depending on the location of the ground station, a typical download opportunity is 4 contacts per day with 8 min linking time each, i.e. 4 × 8 × 60 s = 1920 s. To link all the data down, a downlink interface of 1 Pbit / 1920 s ≈ 521 Gbit/s is required. It is clear that data transmission via RF would not be applicable; only a laser downlink is able to meet this data link capability. A second solution would be to process the data on board, generate information from it, combine these data with orientation data and link the derived information (instead of the raw data) down.
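The downlink arithmetic above as a sketch:

```python
# Downlink budget from the figures in the text.
contacts_per_day = 4
contact_s = 8 * 60                          # 8 minutes per contact [s]
link_time = contacts_per_day * contact_s    # total daily link time [s]

stored_bits = 1e15                          # ~1 Pbit on-board storage
required_rate = stored_bits / link_time     # bit/s needed to empty storage daily
```

This confirms the ~521 Gbit/s figure, far beyond typical RF downlink capacity.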

SUMMARY
In this article, essential requirements for high-resolution sensors were given. The relationship between the most important sensor parameters was described. It was found that for spatial high-resolution sensors there is no real limit, except the size of the mirror that can be launched into space. Radiometry, on the other hand, can be improved by increasing the number of TDI stages. The use of the split technology reduces the requirements for the correction of the yaw movement of the satellite.

OPTICS, DETECTOR AND F-NUMBER
Spatial resolution (GSD) and detector pitch are mostly the dominant design drivers. With a given flight altitude the focal length can be derived. The relation between pixel size δ and footprint on ground (GSD) at altitude h is GSD = δ · h / f.
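Solving this relation for the focal length of the HRS reference case (0.25 m GSD from 520 km; the 7 µm pixel pitch is an assumption for illustration):

```python
# Pixel-to-ground geometry: GSD = pitch * altitude / focal_length,
# solved for the focal length.
pitch = 7.0e-6      # detector pixel pitch [m] (assumption)
altitude = 520e3    # orbit altitude [m] (HRS reference case)
gsd = 0.25          # ground sample distance [m] (HRS reference case)

focal_length = pitch * altitude / gsd   # [m]
```

The resulting focal length of about 14.6 m shows why high-resolution space telescopes require folded optical designs.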