SURVEY OF ADVANCED TECHNOLOGIES IN IMAGING SCIENCE FOR REMOTE SENSING

By:

Erich Hernandez-Baquero

B.S. Physics, U.S. Air Force Academy

Rochester Institute of Technology

(1997)

Table of Contents

Table of Contents

List of Figures and Tables

Abstract

1.0 Introduction

1.1 Applications

1.1.1 Environmental Science

1.1.2 Military

1.1.3 Commercial

1.2 The Imaging Chain

2.0 Approach

2.1 Field of View

2.2 Data Transfer and Storage

2.3 Quality of Imagery

2.3.1 Atmospheric Distortions

2.3.2 Sensor Performance

2.3.3 Platform-Induced Distortions

2.3.4 Image Processing Algorithms

3.0 Results

3.1 Multispectral Imaging

3.2 Hyperspectral Imaging

3.3 Synthetic Aperture Radar

4.0 Conclusions

Appendix

References

List of Figures and Tables

Figure 1. View of El Niño from the TOPEX/Poseidon satellite.

Figure 2. Comparison of CORONA and SPOT satellite imagery.

Figure 3. Atmospheric correction of Landsat-TM data by image processing.

Figure 4. MSS scanning geometry and image projection.

Figure 5. Typical hyperspectral image cube.

Figure 6. 3-D topographical SAR imagery.

Table 1. UVISI sensor suite parameters.

Table 2. Spaceborne Hyperspectral Imagers.

Table 3. Airborne Hyperspectral Imagers.

Abstract

Various technologies in imaging science used for remote sensing are described. Although this report is not all-inclusive, the technologies presented are diverse and represent the most prominent fields in remote sensing imaging. Strengths and weaknesses are evaluated as they pertain to specific applications (either airborne or spaceborne). A brief description of the theory behind each technique is provided, along with a vision for the future of remote sensing.


SURVEY OF ADVANCED TECHNOLOGIES IN IMAGING SCIENCE FOR

REMOTE SENSING

1.0  Introduction

Remote sensing is a natural extension of humanity's need to explore and understand its environment. Through advances in technology, we have extended the way we see the world to a perspective never before possible. Using airborne and spaceborne platforms, complex imaging systems that surpass the limitations of the human eye are used to observe the Earth. Through these systems, we can now see in spectral regions that were previously invisible to the unaided eye.

The ability to extract information about our world and present it in ways that our visual perception can comprehend is the ultimate goal of imaging science in remote sensing. In all applications--from environmental monitoring to intelligence gathering--the need to obtain more accurate information in a timely and efficient manner continues to grow rapidly. It is precisely because of this rapid growth that a broad range of technologies is presented in this report.

1.1 Applications

1.1.1 Environmental Science

Clearly one of the largest and most prominent applications is the study of the Earth's ecosystem through the use of remote sensing. The synoptic view obtained from airborne and spaceborne imaging platforms provides an opportunity to understand weather systems, climate changes, geological phenomena, and other processes from a global perspective. Not only are we able to view the Earth as a single ecosystem, but the amount and quality of information that we can gather is much greater than what other methods of observation can provide.

Figure 1. View of El Niño from the TOPEX/Poseidon satellite.

Figure 1 is an example of the kind of imagery that is available to anyone almost instantaneously over the Internet. The scale describes the height of the ocean surface (which is directly correlated with temperature) relative to the previous year's measurements. El Niño is seen as a mass of "red water" accumulating along the Eastern Pacific. The data contained in this single image from space would otherwise have required hundreds of ships and instruments, and considerable time to process and distribute1.

1.1.2  Military

Perhaps one of the areas in which the greatest advances in imaging technology have occurred is intelligence gathering in support of military operations and national security. The need for accurate and timely data cannot be overemphasized here, since the lives of military personnel can be saved by a better understanding of enemy force locations and activities. In addition, international treaties involving nuclear disarmament and biological/chemical warfare can be enforced without actually having to send in a team of inspectors. High-flying aircraft such as the SR-71 and U-2 and satellite platforms such as the recently declassified CORONA provide this type of information. The resolution available from these systems is far greater than that of their civilian counterparts. The CORONA satellite, for example, could obtain images with resolutions of approximately 6 feet. This technology, although dating to the 1960s, is still better than that of most currently operating civilian/commercial spaceborne imaging systems such as the French SPOT (see Figure 2).

1.1.3 Commercial

During the 1980s, the federal government decided that private industry should operate satellite space systems and manage the data generated by them. As a result, many companies began to sponsor or even develop their own remote sensing capabilities. Their customers were the scientific community and the government itself. Other customers included local utility companies that would provide their own customers with information about their energy use. For example, through the use of thermal infrared sensors, information about how efficiently a home or building uses its electricity can be determined3. Much of this commercial imagery is available for sale over the Internet, making it very accessible to the public.

Figure 2. Comparison of CORONA and SPOT satellite imagery2.

1.2 The Imaging Chain

Before we can start analyzing and comparing remote sensing systems, we must first establish a framework for visualizing these imaging systems. At first glance, one might consider that the caliber and performance of an imaging system rests solely on the quality of the optical system in terms of resolution and accuracy. In fact, fully characterizing a system requires an end-to-end perspective. A satellite, for example, could be equipped with the most advanced hardware available, but if the images it generates cannot be processed (or interpreted), then it is useless. We therefore look at systems from an imaging chain approach.

The imaging chain simply consists of all the steps (which can be thought of as links in a chain) required to bring an image to an end user. At this point it is important to note that the end user may not only be a human looking at a picture or movie, but may also be a control system used in an automated process. The imaging chain takes us through the steps of capturing a scene; storing, manipulating, and transmitting the data; and finally displaying the image. Clearly, a system may generate good data that can be processed to yield accurate information, but without a good system to display it, the whole process suffers. The analogy to a chain applies here: the whole chain is only as strong as its weakest link3.

The scope of this report is to analyze emerging technologies in imaging science for remote sensing using the imaging chain approach. However, no discussion of display systems is provided; it is assumed that the systems presented in this report can generate processed imagery that can be properly digitized and displayed on a moderate-resolution CRT or incorporated into automated systems. Thus, using the imaging chain approach, a system can be evaluated by the scene it can capture, its data transfer and storage capability, and the quality of the imagery it produces.

2.0 Approach

The following parameters will be used to evaluate the performance of the imaging systems presented in this report:

2.1 Field of View

The scene a system can capture is driven mainly by its Field Of View (FOV). In particular, we are interested in an imaging sensor’s instantaneous FOV (IFOV), the ground IFOV (GIFOV), and the height of the imaging sensor platform. The relationship is given by

GIFOV = H ∙ IFOV [1]

where H is the height of the platform, and IFOV is the size of a detector element at the image plane divided by the effective focal length of the optical system. Clearly, how much total ground coverage is achieved depends on the GIFOV and the total FOV (which depend on the orbital parameters of a satellite platform or the flying altitude of an aircraft) and on the dwell time over a particular scene. Depending on the sensor configuration, the total FOV may range from only 15° to 120°3,4. Unfortunately, a larger FOV is not necessarily the best solution, since it is more susceptible to geometric distortions and often results in poorer spatial resolution. These parameters continue to improve as new electro-optical technologies develop.
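As a rough numerical illustration of these relationships, the sketch below (Python) evaluates Equation 1 and a flat-Earth swath width; the altitude, detector size, focal length, and FOV are hypothetical values, chosen only to be loosely Landsat-like.

```python
import math

def gifov(height_m: float, detector_size_m: float, focal_length_m: float) -> float:
    """Ground-projected IFOV (Eq. 1): GIFOV = H * IFOV, where
    IFOV = detector size / effective focal length (small-angle approximation)."""
    ifov_rad = detector_size_m / focal_length_m
    return height_m * ifov_rad

def swath_width(height_m: float, total_fov_deg: float) -> float:
    """Ground swath covered by the total FOV for a nadir-pointing
    sensor over a flat Earth: 2 * H * tan(FOV / 2)."""
    return 2.0 * height_m * math.tan(math.radians(total_fov_deg) / 2.0)

# Hypothetical numbers for illustration only:
H = 705e3   # platform altitude (m)
d = 60e-6   # detector element size (m)
f = 0.5     # effective focal length (m)
print(f"GIFOV: {gifov(H, d, f):.1f} m")                    # ~85 m ground sample
print(f"Swath at 15 deg FOV: {swath_width(H, 15)/1e3:.0f} km")  # ~186 km
```

Note how a modest total FOV at orbital altitude already yields a swath nearly 200 km wide, while the per-pixel ground sample remains set by the detector geometry.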

2.2  Data Transfer and Storage

The data transfer and storage capability of a system depends on the electronic configuration of the imaging platform. Ultimately, the photons emitted or reflected by a scene reach the sensor, and their energy is turned into an electrical signal (or recorded on film). Since emerging technologies involve the use of electro-optical imaging sensors, we will look at the requirements for storing and transferring the electrical data. Because of weight limitations, satellite systems usually send the data in real time or near real time via telemetry to ground stations, which then record the data on optical drives, CDs, or serial tape drives. Airborne platforms, however, may be able to contain the data storage hardware onboard. In many cases, the distribution and storage of data is handled by government organizations or by private industry. This is another area where technology continues to improve and become more affordable, allowing real-time delivery of large volumes of imagery with minimal loss or distortion of the data.
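To make the scale of the problem concrete, the back-of-the-envelope sketch below (Python) estimates the raw volume of a single multispectral scene and its telemetry time; the scene dimensions, band count, bit depth, and link rate are all hypothetical, chosen to be loosely Landsat-like.

```python
def raw_data_volume_bits(rows: int, cols: int, bands: int, bits_per_pixel: int) -> int:
    """Uncompressed size of one image cube in bits."""
    return rows * cols * bands * bits_per_pixel

# Hypothetical scene: 6000 x 6000 pixels, 7 spectral bands, 8 bits/pixel
bits = raw_data_volume_bits(6000, 6000, 7, 8)
print(f"Scene size: {bits / 8 / 1e6:.0f} MB")          # ~252 MB uncompressed

# Time to downlink over a hypothetical 100 Mbit/s telemetry channel
link_rate_bps = 100e6
print(f"Downlink time: {bits / link_rate_bps:.0f} s")  # ~20 s per scene
```

Even a single modest scene runs to hundreds of megabytes uncompressed, which is why telemetry bandwidth, onboard buffering, and compression dominate this link of the imaging chain.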

2.3 Quality of Imagery

The quality of the imagery mainly depends on the atmospheric distortions, sensor performance, platform-induced distortions, and the effectiveness of image processing algorithms. Of all of these, the atmosphere is the most dynamic, and consequently, the most difficult source of image degradation to compensate for.

2.3.1 Atmospheric Distortions

Slight variations in the atmosphere change the effective index of refraction of the atmospheric medium along any given optical path between the scene and the sensor. The effect of the atmosphere is typically seen as a blurring and loss of contrast in an image. When looking at spectral data, the atmosphere affects the spectral profile that a sensor "sees" by blocking or introducing different frequency bands in the spectra, thus generating inaccuracies in the image segmentation and classification process. In the extreme case, heavy cloud cover may completely obscure a scene from a remote sensing system. The degradation process is difficult to characterize because of the large number of physical processes occurring in the atmosphere that affect the transmission of light through it. In general, atmospheric effects are compensated for through a complex model of the atmosphere. The U.S. Air Force Phillips Laboratory Geophysics Directorate has developed a widely accepted database of atmospheric constituents that allows the user to estimate the atmospheric effects on the image acquisition process. Other approaches to atmospheric compensation include speckle imaging, range-angle interferometry, and adaptive optics5. Interestingly, many of these developments in atmospheric compensation originated within the astronomical community.
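Whichever model supplies the atmospheric quantities, the per-band compensation step itself often reduces to inverting a linearized radiative transfer relation of the form L_sensor = τ · L_surface + L_path. The sketch below (Python) is a minimal illustration of that inversion, not any particular laboratory's model; the transmittance and path radiance values are hypothetical stand-ins for what an atmospheric code would predict for one band.

```python
def surface_radiance(l_sensor: float, transmittance: float, path_radiance: float) -> float:
    """Invert the linearized radiative transfer relation
    L_sensor = tau * L_surface + L_path for the surface-leaving radiance."""
    return (l_sensor - path_radiance) / transmittance

# Hypothetical per-band values an atmospheric model might supply:
tau = 0.82       # band-averaged atmospheric transmittance
l_path = 1.4     # upwelled path radiance (W / m^2 / sr / um)
l_sensor = 9.7   # radiance measured at the sensor (W / m^2 / sr / um)

print(f"Estimated surface radiance: {surface_radiance(l_sensor, tau, l_path):.2f}")
```

Repeating this inversion band by band recovers an estimate of the surface spectrum, which is what the segmentation and classification algorithms downstream actually need.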

2.3.2  Sensor Performance

There are so many aspects of sensor design and operation that contribute to a sensor's overall performance that discussing all of them is beyond the scope of this paper. A more complete treatment of sensor design and performance is found in Chapter 8 of the Manual of Remote Sensing6 and Volumes 3 and 4 of The Infrared & Electro-Optical Systems Handbook7. For our discussion, it is sufficient to mention that sensor performance is mainly driven by the signal-to-noise ratio (SNR), spectral response, throughput, and ease of calibration.

The SNR is simply how well the sensor can distinguish a signal of interest from the electronic or thermal noise associated with the hardware. This is the parameter that drives dwell time: the sensor must collect enough photons, or flux, to create a signal above its noise floor. Although the detector usually dominates the noise, it is possible to have a system in which the signal-conditioning electronics are the major noise source.

The spectral response describes how well the sensor can "see" in a specific spectral band. If the wavelength of interest falls in the near-infrared, for example, and the sensor has no spectral response in that region, it will not register a signal. A familiar example is the spectral response of the eye, which can only see in the visible spectrum. The throughput is a measure of how well the incoming flux of radiation propagates through the sensor optics. Poor mirror coatings and lens aberrations scatter or attenuate the light as it propagates through the sensor system, limiting the number of photons that reach the detector.

Finally, no sensor can provide high-quality imagery without proper calibration. A well-calibrated system increases confidence in the accuracy of the data; temperature readings from an airborne radiometer may well be useless if they cannot be compared to some absolute reference. Calibration is a far greater issue for spaceborne sensors. A system can be well calibrated in the laboratory, but once it reaches space, launch stresses and the zero-gravity environment can change the hardware to the point where the instrument operates differently than it did on the ground. Advances in remote calibration techniques include the use of ground-based radar and lasers8.
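The dwell-time trade mentioned above can be made concrete with a simple shot-noise model. The sketch below (Python) assumes an integrating detector whose total noise is the quadrature sum of photon shot noise, dark-current shot noise, and read noise; all of the detector parameters are hypothetical.

```python
import math

def snr(photon_rate: float, dwell_time: float, dark_rate: float, read_noise: float) -> float:
    """Shot-noise-limited SNR for an integrating detector: signal electrons
    over the quadrature sum of photon shot noise, dark-current shot noise,
    and read noise."""
    signal = photon_rate * dwell_time                 # collected electrons
    noise = math.sqrt(signal + dark_rate * dwell_time + read_noise**2)
    return signal / noise

# Hypothetical detector values for illustration:
rate, dark, read = 5e4, 1e3, 30.0   # e-/s signal, e-/s dark, e- rms read
for t in (0.001, 0.01, 0.1):        # candidate dwell times (s)
    print(f"dwell {t:5.3f} s -> SNR {snr(rate, t, dark, read):6.1f}")
```

At short dwell times the read noise dominates and the SNR is poor; as the integration time grows, the collected signal overwhelms the fixed noise terms and the SNR improves, which is exactly why longer dwell times are sought.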

2.3.3 Platform-Induced Distortions

Another source of distortions in an image is the imaging platform itself. This is particularly true of non-stabilized platforms such as aircraft. As the airplane pitches, yaws, and rolls, the direction in which the sensor is pointing changes, causing the FOV of sequential frames to be different3. This creates geometric distortions in the image. Although this distortion is not especially complex, it can lead to loss of data and must be taken into account. One of the major advantages of spaceborne over airborne sensors is the inherent stability of spaceborne platforms.
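To see why even small attitude errors matter, consider the cross-track shift of a nadir-pointing sensor's ground footprint induced by a platform roll. The following flat-Earth sketch (Python) uses a hypothetical aircraft altitude and roll angles.

```python
import math

def roll_displacement(height_m: float, roll_deg: float) -> float:
    """Approximate cross-track ground shift of a nadir-pointing sensor's
    footprint caused by a platform roll (flat-Earth geometry): H * tan(roll)."""
    return height_m * math.tan(math.radians(roll_deg))

# Hypothetical aircraft survey at 3 km altitude:
for roll in (0.5, 1.0, 2.0):   # roll angles in degrees
    shift = roll_displacement(3000, roll)
    print(f"roll {roll:3.1f} deg -> footprint shift {shift:5.1f} m")
```

A roll of only one degree displaces the footprint by tens of meters, which can exceed many pixels of ground sample distance; this is why attitude must be either stabilized or recorded and corrected for in post-processing.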