Ryad Benosman and Sing Bing Kang, Editors
Panoramic Vision: Sensors, Theory, and Applications
Foreword by Olivier Faugeras
With 267 Illustrations
Springer
Contents
Foreword
Preface
Contributors

1 Introduction
R. Benosman and S.B. Kang
1.1 Omnidirectional Vision in Nature
1.2 Man-Made Panoramic Vision
1.3 Organization of Book
1.4 Acknowledgment

2 A Brief Historical Perspective on Panorama
R. Benosman and S.B. Kang
2.1 Panorama in the Beginning
2.2 From Panorama Exhibits to Photography
2.3 Panorama in Europe and the United States
2.3.1 Panorama in Britain
2.3.2 Panorama in France
2.3.3 Panorama in Germany
2.3.4 Panorama in the United States
2.4 From Panoramic Art to Panoramic Technology
2.4.1 Panoramic Cameras
2.4.2 Omnidirectional Vision Sensors
2.5 The Use of Mirrors in Paintings
2.5.1 The Evolution of Mirrors
2.5.2 Mirrors in Paintings
2.5.3 Anamorphosis
2.6 Concluding Remarks
2.7 Additional Online Resources
2.8 Acknowledgment
Section I: Catadioptric Panoramic Systems
3 Development of Low-Cost Compact Omnidirectional Vision Sensors
H. Ishiguro
3.1 Introduction
3.2 Previous Work
3.2.1 Omnidirectional Vision Sensors
3.2.2 Omnidirectional Images
3.3 Designs of ODVSs
3.3.1 Designs of Mirrors
3.3.2 Design of a Supporting Apparatus
3.4 Trial Production of C-ODVSs
3.4.1 Eliminating Internal Reflections
3.4.2 Making Mirrors from Metal
3.4.3 Focusing in an ODVS
3.4.4 Developed C-ODVSs
3.5 Applications of ODVSs
3.5.1 Multimedia Applications
3.5.2 Monitoring Applications
3.5.3 Mobile Robot Navigation
3.6 Conclusion

4 Single Viewpoint Catadioptric Cameras
S. Baker and S.K. Nayar
4.1 Introduction
4.2 The Fixed Viewpoint Constraint
4.2.1 Derivation of the Fixed Viewpoint Constraint Equation
4.2.2 General Solution of the Constraint Equation
4.2.3 Specific Solutions of the Constraint Equation
4.2.4 The Orthographic Case: Paraboloidal Mirrors
4.3 Resolution of a Catadioptric Camera
4.3.1 The Orthographic Case
4.4 Defocus Blur of a Catadioptric Camera
4.4.1 Analysis of Defocus Blur
4.4.2 Defocus Blur in the Orthographic Case
4.4.3 Numerical Results
4.5 Case Study: Parabolic Omnidirectional Cameras
4.5.1 Selection of the Field of View
4.5.2 Implementations of Parabolic Systems
4.6 Conclusion
5 Epipolar Geometry of Central Panoramic Catadioptric Cameras
T. Pajdla, T. Svoboda, and V. Hlaváč
5.1 Introduction
5.2 Terminology and Notation
5.3 Overview of Existing Panoramic Cameras
5.3.1 Stereo and Depth from Panoramic Images
5.3.2 Classification of Existing Cameras and Comparison of Their Principles
5.4 Central Panoramic Catadioptric Camera
5.5 Camera Model
5.5.1 Hyperbolic Mirror
5.5.2 Parabolic Mirror
5.6 Examples of Real Central Panoramic Catadioptric Cameras
5.7 Epipolar Geometry
5.7.1 Hyperbolic Mirror
5.7.2 Parabolic Mirror
5.8 Estimation of Epipolar Geometry
5.9 Normalization for Estimation of Epipolar Geometry
5.9.1 Normalization for Conventional Cameras
5.9.2 Normalization for Omnidirectional Cameras
5.10 Summary
6 Folded Catadioptric Cameras
S.K. Nayar and V. Peri
6.1 Introduction
6.2 Background: Single Mirror Systems
6.3 Geometry of Folded Systems
6.3.1 The General Problem of Folding
6.3.2 The Simpler World of Conics
6.3.3 Equivalent Single Mirror Systems
6.4 Optics of Folded Systems
6.4.1 Pertinent Optical Effects
6.4.2 Design Parameters
6.4.3 System Optimization
6.5 An Example Implementation
Section II: Panoramic Stereo Vision Systems
7 A Real-time Panoramic Stereo Imaging System and Its Applications
A. Basu and J. Baldwin
7.1 Introduction
7.2 Previous Applications
7.3 Stereo Design
7.3.1 Vertical Extent of Stereo Field of View
7.3.2 Effective Eye Separation
7.3.3 Orientation of Eye Separation
7.4 Device Calibration
7.4.1 Analog Approach
7.4.2 Digital Approach
7.5 Hardware Design and Implementation
7.6 Results Produced by System
7.7 The Mathematics of Panoramic Stereo
7.8 Experimental Results
7.9 Further Improvements
7.10 Acknowledgment
8 Panoramic Imaging with Horizontal Stereo
S. Peleg, M. Ben-Ezra, and Y. Pritch
8.1 Introduction
8.1.1 Panoramic Images
8.1.2 Visual Stereo
8.1.3 Caustic Curves
8.2 Multiple Viewpoint Projections
8.3 Stereo Panoramas with Rotating Cameras
8.3.1 Stereo Mosaicing with a Slit Camera
8.3.2 Stereo Mosaicing with a Video Camera
8.4 Stereo Panoramas with a Spiral Mirror
8.5 Stereo Panoramas with a Spiral Lens
8.6 Stereo Pairs from Stereo Panoramas
8.7 Panoramic Stereo Movies
8.8 Left-right Panorama Alignment (Vergence)
8.9 Concluding Remarks
8.10 Acknowledgment
9 Panoramic Stereovision Sensor
R. Benosman and J. Devars
9.1 Rotating a Linear CCD
9.2 System Function
9.3 Toward a Real-time Sensor?
9.4 Acknowledgment
10 Calibration of the Stereovision Panoramic Sensor
R. Benosman and J. Devars
10.1 Introduction
10.2 Linear Camera Calibration using Rigid Transformation
10.2.1 The Pinhole Model
10.2.2 Applying the Rigid Transformation
10.2.3 Computing the Calibration Parameters
10.2.4 Reconstruction
10.2.5 Experimental Results
10.3 Calibrating the Panoramic Sensor using Projective Normalized Vectors
10.3.1 Mathematical Preliminaries
10.3.2 Camera Calibration
10.4 Handling Lens Distortions
10.5 Results
10.6 Conclusion
10.7 Acknowledgment
11 Matching Linear Stereoscopic Images
R. Benosman and J. Devars
11.1 Introduction
11.2 Geometrical Properties of the Panoramic Sensor
11.3 Positioning the Problem
11.4 A Few Notions on Dynamic Programming
11.4.1 Principle
11.4.2 The Family of Dynamic Programming Used
11.5 Matching Linear Lines
11.5.1 Principle
11.5.2 Cost Function
11.5.3 Optimal Path Retrieval and Results
11.5.4 Matching Constraints
11.6 Region Matching
11.6.1 Introduction
11.6.2 Principle of Method
11.6.3 Computing Similarity between Two Intervals
11.6.4 Matching Modulus
11.6.5 Matching Algorithm
11.6.6 Adding Constraints
11.6.7 Experimental Results
Section III: Techniques for Generating Panoramic Images
12 Characterization of Errors in Compositing Cylindrical Panoramic Images
S.B. Kang and R. Weiss
12.1 Introduction
12.1.1 Analyzing the Error in Compositing Length
12.1.2 Camera Calibration
12.1.3 Motivation and Outline
12.2 Generating a Panoramic Image
12.3 Compositing Errors due to Misestimation of Focal Length
12.3.1 Derivation
12.3.2 Image Compositing Approach to Camera Calibration
12.4 Compositing Errors due to Misestimation of Radial Distortion Coefficient
12.5 Effect of Error in Focal Length and Radial Distortion Coefficient on 3D Data
12.6 An Example using Images of a Real Scene
12.7 Summary
13 Construction of Panoramic Image Mosaics with Global and Local Alignment
H.-Y. Shum and R. Szeliski
13.1 Introduction
13.2 Cylindrical and Spherical Panoramas
13.3 Alignment Framework and Motion Models
13.3.1 8-parameter Perspective Transformations
13.3.2 3D Rotations and Zooms
13.3.3 Other Motion Models
13.4 Patch-based Alignment Algorithm
13.4.1 Patch-based Alignment
13.4.2 Correlation-style Search
13.5 Estimating the Focal Length
13.5.1 Closing the Gap in a Panorama
13.6 Global Alignment (Block Adjustment)
13.6.1 Establishing the Point Correspondences
13.6.2 Optimality Criteria
13.6.3 Solution Technique
13.6.4 Optimizing in Screen Coordinates
13.7 Deghosting (Local Alignment)
13.8 Experiments
13.8.1 Global Alignment
13.8.2 Local Alignment
13.8.3 Additional Examples
13.9 Environment Map Construction
13.10 Discussion
13.11 Appendix: Linearly-constrained Least-squares
13.11.1 Lagrange Multipliers
13.11.2 Elimination Method
13.11.3 QR Factorization
14 Self-Calibration of Zooming Cameras from a Single Viewpoint
L. de Agapito, E. Hayman, I.D. Reid, and R.I. Hartley
14.1 Introduction
14.2 The Rotating Camera
14.2.1 Camera Model
14.2.2 The Inter-image Homography
14.2.3 The Infinite Homography Constraint
14.3 Self-calibration of Rotating Cameras
14.3.1 Problem Formulation
14.3.2 Constant Intrinsic Parameters
14.3.3 Varying Intrinsic Parameters
14.4 Experimental Results
14.4.1 Experiments with Synthetic Data
14.4.2 Experiments with Real Data
14.5 Optimal Estimation: Bundle-adjustment
14.5.1 Maximum Likelihood Estimation (MLE)
14.5.2 Using Priors on the Estimated Parameters: Maximum a Posteriori Estimation (MAP)
14.5.3 Experimental Results
14.6 Discussion
15 360 × 360 Mosaics: Regular and Stereoscopic
S.K. Nayar and A.D. Karmarkar
15.1 Spherical Mosaics
15.2 360° Strips
15.3 360° Slices
15.4 Slice Cameras
15.5 Experimental Results
15.6 Variants of the Slice Camera
15.7 Summary
16 Mosaicing with Strips on Adaptive Manifolds
S. Peleg, B. Rousso, A. Rav-Acha, and A. Zomet
16.1 Introduction
16.2 Mosaicing with Strips
16.3 Cutting and Pasting of Strips
16.3.1 Selecting Strips
16.3.2 Pasting Strips
16.4 Examples of Mosaicing Implementations
16.4.1 Strip Cut and Paste
16.4.2 Color Merging in Seams
16.4.3 Mosaicing with Straight Strips
16.4.4 Mosaicing with Curved Strips: Forward Motion
16.5 Rectified Mosaicing: A Tilted Camera
17 3D Environment Modeling from Multiple Cylindrical Panoramic Images
S.B. Kang and R. Szeliski
17.1 Introduction
17.2 Relevant Work
17.3 Overview of Approach
17.4 Extraction of Panoramic Images
17.5 Recovery of Epipolar Geometry
17.5.1 8-point Algorithm: Basics
17.5.2 Tracking Features for 8-point Algorithm
17.6 Omnidirectional Multibaseline Stereo
17.6.1 Reconstruction Method 1: Unconstrained Feature Tracking and 3D Data Merging
17.6.2 Reconstruction Method 2: Iterative Panoramic Structure from Motion
17.6.3 Reconstruction Method 3: Constrained Depth Recovery using Epipolar Geometry
17.7 Stereo Data Segmentation and Modeling
17.8 Experimental Results
17.8.1 Synthetic Scene
17.8.2 Real Scenes
17.9 Discussion and Conclusions
17.10 Appendix: Optimal Point Intersection
17.11 Appendix: Elemental Transform Derivatives

18 N-Ocular Stereo for Real-Time Human Tracking
T. Sogo, H. Ishiguro, and M.M. Trivedi
18.1 Introduction
18.2 Multiple Camera Stereo
18.2.1 The Correspondence Problems and Trinocular Stereo
18.2.2 Problems of Previous Methods
18.3 Localization of Targets by N-ocular Stereo
18.3.1 Basic Algorithm
18.3.2 Localization of Targets and Error Handling
18.3.3 False Matchings in N-ocular Stereo
18.4 Implementing N-ocular Stereo
18.4.1 Simplified N-ocular Stereo
18.4.2 Error Handling in the Simplified N-ocular Stereo
18.5 Experimentation
18.5.1 Hardware Configuration
18.5.2 Detecting Azimuth Angles of Targets
18.5.3 Precision of N-ocular Stereo
18.5.4 Tracking People
18.5.5 Application of the System
18.6 Conclusion
19 Identifying and Localizing Robots with Omnidirectional Vision Sensors
H. Ishiguro, K. Kato, and M. Barth
19.1 Introduction
19.2 Omnidirectional Vision Sensor
19.3 Identification and Localization Algorithm
19.3.1 Methodology
19.3.2 Triangle Constraint
19.3.3 Triangle Verification
19.3.4 Error Handling
19.3.5 Computational Cost
19.4 Experimental Results
19.4.1 Simulation Experiments
19.4.2 Real-world Experiment
19.5 Conclusions
20 Video Representation and Manipulations Using Mosaics
P. Anandan and M. Irani
20.1 Introduction
20.2 From Frames to Scenes
20.2.1 The Extended Spatial Information: The Panoramic Mosaic Image
20.2.2 The Geometric Information
20.2.3 The Dynamic Information
20.3 Uses of the Scene-based Representation
20.3.1 Visual Summaries: A Visual Table of Content
20.3.2 Mosaic-based Video Indexing and Annotation
20.3.3 Mosaic-based Video Enhancement
20.3.4 Mosaic-based Video Compression
20.4 Building the Scene-based Representation
20.4.1 Estimating the Geometric Transformations
20.4.2 Sequence Alignment and Integration
20.4.3 Moving Object Detection and Tracking
20.5 Conclusion