An Introduction
By
Dean Vestal
Part 1
What is Remote Sensing?
• "Remote sensing is the science and art of obtaining information
about an object, area, or phenomenon through the analysis of data
acquired by a device that is not in contact with the object, area, or
phenomenon under investigation" (Lillesand and Kiefer, 2000).
• The recording devices used in remote sensing include cameras,
digital cameras, spectral scanners, radiometers, lasers, radio
frequency receivers, seismographs, gravimeters, magnetometers,
and scintillation counters.
• These instruments are designed to detect and record reflected solar
radiation, emitted terrestrial radiation, or other forms of energy (e.g.,
radar or lidar). The form and amount of this energy depend on the
physical, chemical, or biological state of the object of interest.
• Examples of remote sensing data include aerial photography,
satellite imagery, and radar or lidar data.
Electromagnetic Radiation
• Electromagnetic waves are energy transported through space in the
form of periodic disturbances of electric and magnetic fields. All
electromagnetic waves travel through space at the same speed, c =
2.99792458 × 10^8 m/s, the speed of light. An electromagnetic wave is
characterized by a frequency and a wavelength.
• The frequency and wavelength of an electromagnetic wave
depend on its source. There is a wide range of frequencies
encountered in our physical world, ranging from the low frequencies of
electric waves generated by power transmission lines to the very high
frequencies of gamma rays originating from atomic nuclei. This wide
frequency range of electromagnetic waves constitutes the
Electromagnetic Spectrum.
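The relation c = wavelength × frequency can be checked numerically; the following is a minimal sketch, and the 550 nm example value is purely illustrative:

```python
# Relation between frequency and wavelength for an electromagnetic wave:
# c = wavelength * frequency, where c is the speed of light in vacuum.

C = 2.99792458e8  # speed of light, m/s

def frequency_from_wavelength(wavelength_m):
    """Return the frequency (Hz) of an EM wave of the given wavelength (m)."""
    return C / wavelength_m

# Example: green light at 550 nm lies in the visible part of the spectrum.
f = frequency_from_wavelength(550e-9)
print(f"{f:.3e} Hz")  # roughly 5.45e14 Hz
```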
Electromagnetic Spectrum
[Figure: remote sensing satellites in sun-synchronous orbit at altitudes of 600-900 km]
Passive Remote Sensing
Feature:
Reflected images can only be acquired
during the daytime, under cloud-free conditions.
Active Remote Sensing (Radar)
Feature:
Can acquire images at any time, regardless of
the time of day or weather.
Several meters of penetration ability.
Radar Imagery
Interferometry for Elevation Data
How Interferometry Works
• Each pixel of a radar image contains
information on the phase of the signal
backscattered from the target surface. By
utilizing the geometry provided by two
slightly displaced, coherent observations
of the surface, the phase difference
between the two observations can be
related directly to the altitude of the
antenna above the ground on a pixel-by-pixel
basis. (The resulting phase-difference
image is known as an interferogram.)
– Radar antennas A1 and A2 are at a height h
above a reference surface.
– The distance between A1 and A2 is the baseline B.
– The distance between A1 and the point
on the ground being imaged is the range ρ1.
– The distance between A2 and the point
on the ground being imaged is the range ρ2.
– α is the angle of the baseline between A1 and A2.
– θ is the radar pulse (look) angle.
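The pixel-by-pixel phase comparison described above can be sketched with small synthetic arrays standing in for real coherent radar observations (all values below are made up for illustration):

```python
import numpy as np

# Sketch: forming an interferogram from two co-registered complex radar
# images. s1 and s2 are synthetic stand-ins for real coherent observations.
rng = np.random.default_rng(0)
shape = (4, 4)
amp = rng.uniform(0.5, 1.0, shape)                 # backscatter amplitude
phase1 = rng.uniform(-np.pi, np.pi, shape)         # phase seen by antenna 1
# Extra phase seen by antenna 2 due to the slightly different path length;
# in real data this term carries the topographic information.
topo_phase = np.linspace(0, np.pi / 2, 16).reshape(shape)

s1 = amp * np.exp(1j * phase1)
s2 = amp * np.exp(1j * (phase1 - topo_phase))

# Interferogram: pixel-by-pixel product of s1 with the complex conjugate of s2.
interferogram = s1 * np.conj(s2)
phase_difference = np.angle(interferogram)  # the quantity related to elevation
```

Here `np.angle` recovers exactly the synthetic path-difference phase; with real data the interferogram phase is wrapped to (-π, π] and must be unwrapped before conversion to elevation.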
• Active remote sensing: lidar (LIght Detection And Ranging),
also called laser radar.
• Applications: DEMs, measuring ozone, detecting clouds
and aerosols, monitoring air pollution.
Radar Sensor
• SIR-A, -B, -C (NASA, USA)
• RADARSAT (Canada)
• JERS-1 (radar sensor) (Japan)
• ERS-1 (European)
• AIRSAR/TOPSAR (NASA, USA)
Lidar Sensor
• ALTMS (TerraPoint, USA)
• FLI-MAP (John Chance, USA)
• ALTM (USA)
• TopoEye (USA)
• ATLAS (USA)
Part 2
Analog and Digital Images
• The images collected by remote
sensing may be analog or digital. Aerial
photographs are examples of analog
images while satellite images acquired
using electronic sensors are examples
of digital images.
• A digital image is a two-dimensional
array of pixels saved in a raster format.
Each pixel has an intensity value
(represented by a digital number) and a
location address (referenced by its row
and column numbers).
• The intensity value represents the
measured physical quantity such as the
solar radiance reflected from the
ground. This value is normally the
average value for the whole area
covered by the pixel.
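As a minimal sketch of this pixel model, a digital image can be represented as a 2-D array whose cell values are the digital numbers and whose row and column indices are the location addresses (the values below are made up):

```python
import numpy as np

# A digital image as a two-dimensional array of pixels: each cell holds an
# intensity value (a digital number), and its row/column indices serve as
# the pixel's location address.
image = np.array([[12,  40,  40],
                  [12, 200, 200],
                  [12, 200, 255]], dtype=np.uint8)  # 8-bit digital numbers

row, col = 1, 2        # location address (row and column numbers)
dn = image[row, col]   # intensity value stored at that pixel
print(dn)              # 200
```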
Multilayer Image
• The SPOT HRV sensor operating in the
multispectral mode detects radiation in three
wavelength bands: green (500 - 590 nm), red
(610 - 680 nm), and near infrared (790 - 890
nm). A single SPOT multispectral scene
consists of three raster images representing the
three wavelength bands. Each pixel of the
scene has three intensity values corresponding
to the three bands.
• By "stacking" these images from the same area
together, a multilayer image is formed.
• Multilayer images can also be formed by
combining images obtained from different
sensors, and other subsidiary data. For
example, a multilayer image may consist of
three layers from a SPOT multispectral image, a
layer of ERS synthetic aperture radar image,
and perhaps a layer consisting of the digital
elevation map of the area being studied.
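The stacking described above can be sketched with arrays; the layer names and dimensions here are illustrative placeholders, not real sensor data:

```python
import numpy as np

# Sketch: "stacking" three co-registered band images (e.g. SPOT green, red,
# near-IR) plus a subsidiary DEM layer into one multilayer image. All layers
# must cover the same area at the same grid size; zeros stand in for data.
rows, cols = 100, 100
green = np.zeros((rows, cols), dtype=np.float32)
red   = np.zeros((rows, cols), dtype=np.float32)
nir   = np.zeros((rows, cols), dtype=np.float32)
dem   = np.zeros((rows, cols), dtype=np.float32)  # digital elevation layer

multilayer = np.stack([green, red, nir, dem], axis=-1)
print(multilayer.shape)  # (100, 100, 4): each pixel now holds four values
```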
Spatial Resolution
• Spatial resolution refers to the
measure of the smallest object
that can be resolved by the
sensor, or the size of the area
on the ground represented by
each pixel. Spatial resolution is
usually expressed as the Ground-projected
Instantaneous Field
of View (GIFOV). A "high
resolution" image is one with a
small pixel size, in which fine
details can be seen. On the
other hand, a "low resolution"
image is one with a large pixel
size.
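Assuming a simple nadir-viewing geometry, the GIFOV can be estimated from the sensor altitude and its angular instantaneous field of view (IFOV); the altitude and IFOV values below are illustrative, not taken from any particular sensor:

```python
import math

# Rough sketch: ground-projected instantaneous field of view (GIFOV) for a
# sensor looking straight down (nadir). For small IFOV this is close to the
# simple product altitude * IFOV.
def gifov(altitude_m, ifov_rad):
    """Ground footprint (m) of one detector element at nadir."""
    return 2.0 * altitude_m * math.tan(ifov_rad / 2.0)

# Illustrative numbers: 800 km altitude, 12.5 microradian IFOV.
print(f"{gifov(800e3, 12.5e-6):.1f} m")  # about a 10 m pixel
```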
Spectral Resolution
Spectral resolution is determined by the number and width of the spectral intervals
(bands) that a given sensor is capable of recording. The larger the number and the
narrower the spectral bands, the higher the spectral resolution of the data. Based on
spectral resolution, remote sensing data are typically divided, in order of increasing
spectral resolution, into panchromatic, multispectral, and hyperspectral data.
• Multispectral data contain from two to typically no more than 15 spectral bands
in the range from the visible, through the near-IR and mid-IR, to the thermal-IR
portion of the electromagnetic spectrum. By combining various bands, we can obtain
unique representations of the study area for easier qualitative and
quantitative interpretation (for example, multispectral classification).
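One widely used band combination for quantitative interpretation is the Normalized Difference Vegetation Index (NDVI), computed from the red and near-infrared bands; the reflectance values below are made up for illustration:

```python
import numpy as np

# NDVI = (NIR - red) / (NIR + red). Healthy vegetation reflects strongly in
# the near-IR and absorbs red light, so vegetated pixels score high.
red = np.array([[0.10, 0.30]], dtype=np.float64)  # synthetic red reflectance
nir = np.array([[0.50, 0.30]], dtype=np.float64)  # synthetic near-IR reflectance

ndvi = (nir - red) / (nir + red)  # ranges from -1 to +1
# First pixel (vegetation-like) scores about 0.67; second scores 0.0.
```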
Example of spectral bands recorded by the Landsat 7 ETM+ sensor:
Class No. (Color in Map)    Land Cover Type
1 (black)                   Clear water
2 (green)                   Dense forest with closed canopy
3 (yellow)                  Shrubs, less dense forest
4 (orange)                  Grass
5 (cyan)                    Bare soil, built-up areas
6 (blue)                    Turbid water, bare soil, built-up areas
7 (red)                     Bare soil, built-up areas
8 (white)                   Bare soil, built-up areas
Supervised Classification
• In supervised classification, the spectral features of areas of known land
cover types are selected from the image. These areas are known as the
"training areas". Every pixel in the whole image is then classified as
belonging to one of the classes depending on how close its spectral features
are to the spectral features of the training areas.
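The procedure above can be sketched with a minimum-distance rule (one of several possible measures of spectral closeness); the training-area mean spectra and pixel values below are made up for illustration:

```python
import numpy as np

# Minimal sketch of supervised classification: each pixel is assigned to the
# class whose training-area mean spectrum it is closest to (Euclidean
# distance in spectral space). All numbers are illustrative.
class_means = np.array([
    [20.0, 30.0, 10.0],   # class 0 training mean, e.g. clear water
    [40.0, 35.0, 90.0],   # class 1 training mean, e.g. dense forest
    [90.0, 85.0, 80.0],   # class 2 training mean, e.g. bare soil
])

# A tiny 2x2 image with three spectral bands per pixel.
image = np.array([
    [[22.0, 28.0, 12.0], [88.0, 84.0, 79.0]],
    [[41.0, 36.0, 88.0], [19.0, 31.0,  9.0]],
])

pixels = image.reshape(-1, 3)
# Distance from every pixel to every class mean, then pick the nearest class.
dists = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
labels = dists.argmin(axis=1).reshape(image.shape[:2])
print(labels)  # class index per pixel: [[0 2], [1 0]]
```

Real classifiers (e.g. maximum likelihood) also account for the spread of each training class, not just its mean, but the nearest-mean rule captures the core idea.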