MODELING PANORAMIC LINEAR ARRAY SCANNER
Ayman Habib, Belay T. Beshah Department of Civil and Environmental Engineering and Geodetic Science The Ohio State University 470 Hitchcock Hall, 2070 Neil Avenue, Columbus OH 43210-1226 Kevin Reece Omni Solutions International, Ltd. 8614 Westwood Center Dr., Suite 700, Vienna, VA 22182 Commission III, Working Group 1
KEY WORDS: Panoramic Sensor, Aerial Triangulation, Bundle Adjustment

ABSTRACT
Panoramic cameras, which seek to combine high resolution and wide swath in one camera, have been used extensively for different photographic interpretation purposes. However, the application of panoramic cameras to photogrammetric measurement has been hampered by the fact that the calibration of such cameras is difficult and the images reflect comparatively complicated geometry. Moreover, the distortion inherent in the images makes it difficult to analyze them readily using the currently available Automatic Aerial Triangulation systems or even manual measurements with bundle adjustment. The main objective of this research has been to use panoramic linear array scanner data for the purpose of point positioning. The collinearity equations are extended to accommodate the panoramic linear array sensor. An algorithm capable of integrating position and attitude support data (GPS/INS) into the extended bundle adjustment is developed. Extensive simulation studies were performed in order to examine the attainable positional accuracy and to study the correlation among some of the interior and exterior orientation parameters.
1 INTRODUCTION

The panoramic camera is a type of camera that scans the terrain of interest from side to side, normal to the direction of flight. Compared to frame cameras, panoramic cameras cover a much larger ground area. With their narrow lens field of view, panoramic cameras produce images with greater detail than frame images. However, panoramic images lack the geometric fidelity of frame camera images and exhibit greatly varying atmospheric effects in different portions of the image, which arise from the differences in the distance to the ground [Lillesand and Kiefer, 1996]. Even though panoramic images have been used extensively for remote sensing applications such as forest pest damage detection and regional planning, they have not been used much for geodetic measurements. The use of panoramic cameras in photogrammetric measurements has been restrained partly because of the complicated geometry of the images and partly because of the difficulty encountered in the calibration and image coordinate measurements [ASPRS, 1980, Lillesand and Kiefer, 1996]. The Panoramic Linear Array Scanner is a digital system whose line CCD sensor provides a digital output that is advantageous for real-time processing. In this paper, the mathematical model of the panoramic linear array scanner, together with the integration of GPS/INS into an extended bundle block adjustment, is described. In section three, the results obtained from simulations performed to assess the attainable accuracy and to understand the correlations between the different parameters are discussed. Finally, some concluding remarks and ideas for future developments are presented. Funding for this research was provided by the Center for Mapping, The Ohio State University, under NASA contract NAG 13-42.
Figure 1: Image formation in Panoramic Line Array Scanner
2 MATHEMATICAL MODEL

To reconstruct the position and shape of objects from photographs, we must know the geometry at the instant the photographic images were formed. The geometric conditions of a panoramic sensor are quite similar to those of other push broom scanners such as SPOT and three-line scanners such as MOMS. In the panoramic linear array scanner, the sensor in the focal plane is parallel to the flight direction. Ground area coverage is obtained by rotating the telescope containing the imaging sensor in the cross-track direction. The perspective geometry is valid only at a specific projection time and differs as the angle of swing (α) changes (Figure 1). To compensate for aircraft motion and to obtain a rectangular footprint, the lens is moved with respect to the camera scanning position. This motion will be referred to as the image motion compensation (Imc).
Figure 2: Coordinate systems
The Panoramic Line Array Scanner (DDP™) modeled in this research has been developed by OMNI Solutions International, Ltd. Even though the number of sensors and the image size depicted in this research are specific to this scanner, all other parts of the modeling are not restricted to it.
2.1 Coordinate Systems
The image formation process in the panoramic linear array scanner involves a number of CCD line sensors mounted in the focal plane of the optics. Three coordinate systems are defined which relate the image points to the conjugate object points on the ground, as shown in Figure 2 [OMNI, 1996].
1. Telescope: {x, y, z}_T is a coordinate system that has its origin at the perspective center of the lens when the lens is in the nadir position.
2. Camera: {x, y, z}_C is a coordinate system defined by x being in the flight direction and z being up. The scan arm rotates about the x-axis, with its angular measure α measured from the y-axis to the z-axis. The origin, {x, y, z}_C = {x, y, z}_T, is at the perspective center of the lens when the lens is at the nadir position. An α equal to zero corresponds to the positive z-axis and occurs at the nadir point.
3. Ground: {X, Y, Z}_G is a user defined coordinate system (local, state plane, UTM, etc.). The relationship between the camera and the ground coordinate system is defined through the rotation angles ω, φ, and κ.
2.2 Collinearity Equation
The image-to-ground coordinate relationship is established through the collinearity model, which expresses the condition that the perspective center, the image point, and the object point must all lie on a straight line. This gives the basic geometric relationship for a single photogrammetric ray. In other words, the two vectors, the vector from the perspective center to the image point and the vector from the perspective center to the object point, are collinear. Applying the collinearity model to the image point formation, we can express the first vector, from the image point in the focal plane to the perspective center PC, in the local telescope coordinate system as (Figure 3):
Figure 3: The vector from the perspective center to the image point
V_t = [ x_t - Imc(t) - x_p ,  y_t - y_p ,  -f ]^T    (1)
The relationship between the camera and telescope coordinate systems, shown in Figure 2, can be expressed by the following rotation matrix, which transforms the vector V_t into the camera coordinate system:
V_c = R(α_t) V_t ,  where

R(α_t) = | 1      0            0        |
         | 0   cos(α_t)   -sin(α_t)     |    (2)
         | 0   sin(α_t)    cos(α_t)     |
where α_t is the swing angle at time t. The second vector, from the perspective center to the object point in the ground coordinate system, can be expressed by Equation 3:
V_G = [ X_G - X_Ot ,  Y_G - Y_Ot ,  Z_G - Z_Ot ]^T    (3)
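As a concrete illustration of Equations 1 and 2, the construction of V_t and its rotation into the camera frame can be sketched as follows. This is a minimal sketch; the function and variable names are ours, not part of the DDP software.

```python
import numpy as np

def rot_alpha(alpha):
    """Rotation matrix R(alpha) of Equation 2: a rotation about the
    camera x-axis (the flight direction) by the swing angle alpha."""
    c, s = np.cos(alpha), np.sin(alpha)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0,   c,  -s],
                     [0.0,   s,   c]])

def telescope_to_camera(alpha, x_t, y_t, x_p, y_p, f, imc):
    """Build V_t from Equation 1 and rotate it into the camera frame,
    V_c = R(alpha) V_t."""
    v_t = np.array([x_t - imc - x_p, y_t - y_p, -f])
    return rot_alpha(alpha) @ v_t
```

At nadir (alpha = 0) the camera-frame vector coincides with the telescope-frame vector, which provides a quick sanity check of the sign conventions.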
The two vectors, V_c defined with respect to the camera coordinate system and V_G defined with respect to the ground coordinate system, are related through the rotation matrix R(ω, φ, κ). The collinearity equations for each scan line at time t can then be written as:
| X_G - X_Ot |
| Y_G - Y_Ot |  =  λ R(ω_t, φ_t, κ_t) V_c  =  λ R(ω_t, φ_t, κ_t) R(α_t) V_t    (4)
| Z_G - Z_Ot |

where λ is the scale factor. Rearranging the above equation to have the telescope coordinate measurements as observations, the modified collinearity equations are given by [Habib and Beshah, 1997]:

x_t - Imc(t) = x_p - f · ( r11_t (X_G - X_Ot) + r12_t (Y_G - Y_Ot) + r13_t (Z_G - Z_Ot) ) / ( r31_t (X_G - X_Ot) + r32_t (Y_G - Y_Ot) + r33_t (Z_G - Z_Ot) )

y_t = y_p - f · ( r21_t (X_G - X_Ot) + r22_t (Y_G - Y_Ot) + r23_t (Z_G - Z_Ot) ) / ( r31_t (X_G - X_Ot) + r32_t (Y_G - Y_Ot) + r33_t (Z_G - Z_Ot) )    (5)
where
t : time epoch for each scan line
x_t, y_t : image coordinate measurements w.r.t. the telescope coordinate system
X_G, Y_G, Z_G : object point coordinates
x_p, y_p, f : calibrated principal point position and focal length of the camera
r11_t, r12_t, ..., r33_t : elements of the rotation matrix R_t = R(α_t) R(ω_t, φ_t, κ_t) at time t
Imc(t) : image motion compensation at time t
X_Ot, Y_Ot, Z_Ot : object coordinates of the perspective center at time t
Each row in the final scene is captured at a different epoch, and each of these rows has its own Exterior Orientation Parameters (EOP). For example, for an OMNI camera that produces 32K x 6K imagery, one has to consider 196,608 unknown EOP for the 32K rows. Not only are these parameters too numerous, they are also highly correlated [Ebner et al., 1996, Ebner et al., 1991, Heipke et al., 1996, Fraser et al., 1996]. Either modeling the system's trajectory [Ebner et al., 1996] or using Orientation Images (OI) [Ebner et al., 1996, Ebner et al., 1991, Heipke et al., 1996] can be used to reduce the number of these parameters. In this research we have opted for the concept of using OI for a number of scan lines for the following reasons:
For an aerial flight, the trajectory can be too rough to be modeled by a polynomial.
With OI, it is easier to incorporate GPS/INS observations directly; these observations can be considered as prior information for the EOP of the OIs.
The swing angle α(t) and the image motion compensation Imc(t) are described by polynomials that express them in terms of time. In the current implementation of the DDP™ camera from OMNI, the Imc and the scan angle are tied together in hardware. Therefore, α(t) and Imc(t) come directly from the encoders, which are available as support data sampled at 2.5 kHz [OMNI, 1996]. Using these data, the initial approximations for the coefficients of these polynomials can be calculated directly by fitting a polynomial. On the other hand, the moment of exposure t for any point can be calculated as the row of the point divided by the total number of scan lines, multiplied by the total scan time. The unknowns of the adjustment are the coefficients used to represent the angle of swing, the coefficients of Imc, three unknowns per ground point, and six unknowns per orientation image. The image coordinates are considered as observations for a least-squares adjustment, together with other additional observations: control points, GPS/INS, and any other prior information about the remaining parameters. The number of OI used in the interpolation of the exterior orientation parameters for each scan line depends on the interpolation function used. The order of the interpolation polynomial is defined by the user. Thus, the exterior orientation parameters at any epoch are computed as the weighted sum of the EOP associated with the neighboring OI.
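The exposure-time computation and the weighted-sum interpolation of the EOP from neighboring orientation images can be sketched as follows. Linear (first-order) interpolation is shown for simplicity; higher-order schemes follow the same weighted-sum idea, and all names are illustrative.

```python
import numpy as np

def exposure_time(row, n_rows, total_scan_time):
    """Moment of exposure for an image row: the row number as a fraction
    of the total number of scan lines, times the total scan time."""
    return (row / n_rows) * total_scan_time

def interpolate_eop(t, oi_times, oi_params):
    """Interpolate the six EOP (X_O, Y_O, Z_O, omega, phi, kappa) at
    epoch t as a weighted sum of the two neighboring orientation images."""
    oi_times = np.asarray(oi_times, float)
    oi_params = np.asarray(oi_params, float)   # shape: (number of OI, 6)
    i = int(np.clip(np.searchsorted(oi_times, t), 1, len(oi_times) - 1))
    w = (t - oi_times[i - 1]) / (oi_times[i] - oi_times[i - 1])
    return (1.0 - w) * oi_params[i - 1] + w * oi_params[i]
```

In the adjustment, only the EOP at the OI epochs are unknowns; the per-line EOP used in Equation 5 are always expressed through this interpolation.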
Figure 4: GPS antenna position
2.3 GPS/INS Observations
GPS Observations: The GPS observations from the receiver onboard the plane are brought into the adjustment by additional observation equations, which relate them directly to the unknown perspective center coordinates of the orientation images. The GPS observation, after correction for the time offset, still pertains to the phase center of the GPS antenna (Figure 4), whereas we are interested in the position of the perspective center of the camera at the time of exposure, which has a spatial offset of (dx, dy, dz) [Ackermann and Schade, 1993]. The relationship between the observed GPS coordinates and the unknown ground coordinates of the perspective center can be expressed as follows (Equation 6):
| X_GPSt |   | X_Ot |                      | dx |   | e_xGPS |
| Y_GPSt | = | Y_Ot | + R(ω_t, φ_t, κ_t) · | dy | + | e_yGPS |    (6)
| Z_GPSt |   | Z_Ot |                      | dz |   | e_zGPS |
If the spatial offset (dx, dy, dz) from the perspective center of the camera to the antenna has already been established by prior calibration, then we can incorporate it as an additional observation. The weight associated with this observation depends on the accuracy of the calibration process.
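The lever-arm relation of Equation 6 (without the noise term) reduces to a single line of linear algebra. The rotation convention below is the same illustrative assumption used earlier; the function names are ours.

```python
import numpy as np

def rot_omega_phi_kappa(omega, phi, kappa):
    """R(omega, phi, kappa) as sequential rotations about x, y, z
    (illustrative convention)."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return rx @ ry @ rz

def predicted_gps_position(pc, omega, phi, kappa, lever_arm):
    """Antenna position predicted from the perspective-center coordinates
    and the calibrated spatial offset (dx, dy, dz), as in Equation 6."""
    return (np.asarray(pc, float)
            + rot_omega_phi_kappa(omega, phi, kappa) @ np.asarray(lever_arm, float))
```

In the adjustment, the difference between the observed GPS coordinates and this prediction forms the GPS observation equation for the perspective center of each orientation image.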
The computed attitude from the Inertial Navigation System (INS) can be incorporated directly into the extended bundle adjustment as shown in Equation 7:

[ ω_t  φ_t  κ_t ]_INS = [ ω_t  φ_t  κ_t ] + [ e_ω  e_φ  e_κ ]    (7)
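In a least-squares implementation, both the GPS and INS terms enter as directly weighted pseudo-observations. A minimal sketch follows; the weight convention w = (sigma0 / sigma)^2 is standard least-squares practice, and the function names are ours.

```python
import numpy as np

def prior_weight(sigma, sigma0=1.0):
    """Weight of a pseudo-observation derived from its prior standard
    deviation: w = (sigma0 / sigma) ** 2."""
    return (sigma0 / sigma) ** 2

def ins_attitude_residual(observed, current):
    """INS attitude observation (Equation 7): misclosure between the
    INS-derived angles and the current estimates of (omega, phi, kappa)."""
    return np.asarray(observed, float) - np.asarray(current, float)
```

The same weighting applies to control points and to any calibrated lever-arm components carried as observations.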
Simulations based on the camera specifications and typical flying parameters were performed to obtain an estimate of the attainable geometric accuracy, to test the models discussed, and to provide recommendations for the planning phase of the project. The simulations were performed with the following assumptions and given parameters:
Thus, the cross strip scenario with a common α assumption cannot decorrelate the attitude angles from the scan angle α. On the other hand, we implemented a free network adjustment by incorporating the pseudo-inverse of the normal equation matrix, which led to acceptable results despite the initial singularity (notice the root mean square errors of the check points in cases 3 and 4).
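The free-network idea can be illustrated with a toy rank-deficient system: the pseudo-inverse of the normal matrix yields the minimum-norm solution despite the singularity. This is a sketch of the principle, not the paper's Visual C++ implementation.

```python
import numpy as np

def free_network_solve(a, l):
    """Solve the normal equations N x = A^T l with N = A^T A possibly
    rank-deficient, using the Moore-Penrose pseudo-inverse. Among all
    least-squares solutions this selects the one of minimum norm."""
    a = np.asarray(a, float)
    l = np.asarray(l, float)
    n = a.T @ a
    return np.linalg.pinv(n) @ (a.T @ l)
```

For the design matrix A = [[1, 1], [2, 2]] the normal matrix is singular, yet the pseudo-inverse still returns a well-defined estimate, mirroring the behavior reported for cases 3 and 4.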
4 CONCLUSION AND REMARKS
Figure 5: Cross strip flight
1. Digital Elevation Model of the area
2. The interior orientation parameters of the camera (x_p, y_p, f)
3. Polynomial coefficients that describe the change with time of the scan angle α(t), the image motion compensation Imc(t), and the exterior orientation parameters X_O, Y_O, Z_O, ω, φ, κ
3 EXPERIMENTS AND RESULTS

An extended bundle adjustment program was developed using Visual C++ according to the model discussed in the last section. In this section, results obtained by the program in different situations are presented: fixing or releasing some of the parameters, with or without GPS observations, and using different numbers of control points. The results summarized in Table 1 illustrate the singularities found in the system, as indicated by extremely small eigenvalues (1E-12 being the benchmark based on the significant digits of the system), under different scenarios. Because of the high correlation between the α and ω rotation angles, we have a singularity in case 3. This problem can be overcome by fixing either the α angle or the ω rotation angle. Case 1 of Table 2 confirms the fact that no singularities should be expected when performing GPS-controlled aerial triangulation for blocks of imagery in the absence of any ground control points. Notice the large improvement in the RMS values obtained by introducing ground control points when comparing the results of cases 1 and 2 in Table 2. In the next experiment (Table 3), the possibility of decorrelating the rotation angles (ω and φ) from the scan angle α is studied. An attempt to do so is made by simulating a flight mission with 8 panoramic images taken in cross strips (Figure 5) and constraining the scan angle α to be global for all the images. By comparing the RMS values of case 1 in Table 2 with case 1 in Table 3, one can see the advantage of the cross strip configuration over the normal parallel strips. Examining the eigenvalues associated with case 3 in Table 3 (1.09E-12 and 7.22E-16) and with case 4 (8.27E-13 and 2.03E-15), we find two extremely small eigenvalues that correspond to the coefficients describing the change in the scan angle with time.
The purpose of this study was to investigate the possibility of modeling the perspective geometry of panoramic linear array scanners. The results obtained from the simulated data are encouraging. These results confirm the capability of using such a system for medium to large scale mapping. The study also indicated that the scan angle α is correlated with the rotation angle across the flight direction. This implies that prior information about one of these angles is necessary for the adjustment. The prior information could be obtained either from INS for the rotation angle or from a scan encoder, which records the scan angle in the panoramic sensor. On the other hand, a cross strip configuration coupled with a free-network adjustment can also solve the problem. Preliminary studies conducted on real test field data with coarse calibration parameters confirm the capability of using such camera systems for the production of 1:24,000 maps. We will publish the results of this study at a later date. Much research work still needs to be done in connection with the Panoramic Linear Array Scanner; some recommendations for future investigation are:
1. Verification of the predicted theoretical accuracy using real data.
2. Addition of calibration capability to the system.
3. Generation of modules for DEM and ortho-photo production.
4. Automatic selection of triangulation points from the imagery (e.g. through interest point operators) to yield an Automatic Aerial Triangulation system.
REFERENCES
[Ackermann and Schade, 1993] Ackermann, F., Schade, H., 1993. Application of GPS for Aerial Triangulation. Photogrammetry and Remote Sensing, Vol. 59, No. 11, November 1993, pp. 1625-1632.
[ASPRS, 1980] ASPRS, 1980. Manual of Photogrammetry. Fourth Edition.
[Ebner et al., 1996] Ebner, H., Ohlhof, T., Putz, E., 1996. Orientation of MOMS-02/D2 and MOMS-2P Imagery. International Archives of Photogrammetry and Remote Sensing, Commission III, Vol. XXXI, Part B3, pp. 158-164.
[Ebner et al., 1991] Ebner, H., Kornus, W., Strunz, G., 1991. A Simulation Study on Point Determination Using MOMS-02/D2 Imagery. Photogrammetry and Remote Sensing, Vol. 57, No. 10, October 1991, pp. 1315-1320.
[Fraser et al., 1996] Fraser, C.S., Shao, J., 1996. On the Triangulation Accuracy of MOMS-02 Three Line Imagery. Geomatics Research Australasia, pp. 47-64.
Table 1: 2 strips x 2 photos
Case  Description        Fixed Parameters         Free Parameters  Min. Eigenvalue
1     No control points  XO, YO, ZO; xp, yp, c    ω, φ, κ          2.85E-9
2     6 control points   XO, YO, ZO; xp, yp, c    -                -
3     6 control points   XO, YO, ZO; xp, yp, c    -                -

Table 2: 2 strips x 4 photos
Case  Description                           Fixed Parameters (prior σ)                                                    RMS X (m)  RMS Y (m)  RMS Z (m)
1     GPS given, no INS, no control points  XO, YO, ZO (σ=0.04 m); xp, yp, c (σ=0.005 mm); α (σ=7″); IMC (σ=0.005 mm)    4.65       4.75       6.61
2     GPS given, no INS, 6 control points   XO, YO, ZO (σ=0.04 m); xp, yp, c (σ=0.005 mm); α (σ=7″); IMC (σ=0.005 mm)    -          -          -

Table 3: 8 images in cross strips
Case  Description                                             Fixed Parameters (prior σ)                                                    RMS X (m)  RMS Y (m)  RMS Z (m)
1     GPS given, no INS, no control points, α given           XO, YO, ZO (σ=0.04 m); xp, yp, c (σ=0.005 mm); IMC (σ=0.005 mm); α (σ=7″)    0.40       0.35       0.89
2     GPS given, no INS, 10 control points, α given           XO, YO, ZO (σ=0.04 m); xp, yp, c (σ=0.005 mm); IMC (σ=0.005 mm); α (σ=7″)    -          -          -
3     GPS given, no INS, 10 control points, common α unknown  XO, YO, ZO (σ=0.04 m); xp, yp, c (σ=0.005 mm); IMC (σ=0.005 mm)              -          -          -
4     GPS given, no INS, no control points, common α unknown  XO, YO, ZO (σ=0.04 m); xp, yp, c (σ=0.005 mm); IMC (σ=0.005 mm); α (σ=7″)    -          -          -
[Habib and Beshah, 1997] Habib, A., Beshah, B.T., 1997. Modeling Panoramic Linear Array Scanner. Departmental Report 443, Civil and Environmental Engineering and Geodetic Science, The Ohio State University, Columbus, Ohio.
[Heipke et al., 1996] Heipke, C., Kornus, W., Pfannenstein, A., 1996. The Evaluation of MEOSS Airborne 3-Line Scanner Imagery: Processing Chain and Results. Photogrammetry and Remote Sensing, Vol. 62, No. 3, pp. 293-299.
[Lillesand and Kiefer, 1996] Lillesand, T.M., Kiefer, R.W., 1994. Remote Sensing and Image Interpretation. Third Edition, John Wiley and Sons, New York.
[OMNI, 1996] OMNI, 1996. Technical Memo, OMNI Solutions International Ltd., October 24, 1996, Vienna, VA.