
Republic of Iraq
Ministry of Higher Education
and Scientific Research
University of Technology



Third Class
First Edition (2010)
Laser Branch
Department of Applied Sciences
University of Technology

Dr. Abdulrahman K. Ali





REMOTE SENSING

Remote Sensing: is the collection of information relating to objects without
being in physical contact with them. Thus our eyes and ears are remote sensors,
and the same is true for cameras and microphones and for many instruments
used for all kinds of applications.
Or, said another way:
Remote sensing is the process of acquiring data/information about
objects/substances not in direct contact with the sensor, by gathering its inputs
using electromagnetic radiation or acoustical waves that emanate from the
targets of interest. An aerial photograph is a common example of a remotely
sensed (by camera and film, or now digital) product.
Introduction
The sun is a source of energy or radiation, which provides a very convenient
source of energy for remote sensing. The sun's energy is either reflected, as it is
for visible wavelengths, or absorbed and then reemitted, as it is for thermal
infrared wavelengths.
There are two main types of remote sensing: Passive remote sensing and
Active remote sensing.
Figure: Passive and active remote sensing systems.
1-Passive sensors detect natural radiation that is emitted or reflected by the
object or surrounding area being observed. Reflected sunlight is the most
common source of radiation measured by passive sensors. Examples of passive
remote sensors include film photography, infrared sensors, and radiometers.
2-Active remote sensing, on the other hand, emits energy in order to scan
objects and areas whereupon a sensor then detects and measures the radiation
that is reflected or backscattered from the target. RADAR is an example of
active remote sensing where the time delay between emission and return is
measured, establishing the location, height, speed and direction of an object.
Overview
Remote sensing makes it possible to collect data on dangerous or
inaccessible areas. Remote sensing applications include monitoring
deforestation in areas such as the Amazon Basin, the effects of climate change
on glaciers and Arctic and Antarctic regions, and depth sounding of coastal and
ocean depths. Military collection during the Cold War made use of stand-off
collection of data about dangerous border areas. Remote sensing also replaces
costly and slow data collection on the ground, ensuring in the process that areas
or objects are not disturbed.
Applications of Remote Sensing
There are probably hundreds of applications - these are typical:
Meteorology - Study of atmospheric temperature, pressure, water vapour, and
wind velocity.
Oceanography - Measuring sea surface temperature, mapping ocean currents and
wave energy spectra, and depth sounding of coastal and ocean waters.
Glaciology - Measuring ice cap volumes, ice stream velocity, and sea ice
distribution.
Geology - Identification of rock types, mapping faults and structure.
Geodesy - Measuring the figure of the Earth and its gravity field.
Topography and cartography - Improving digital elevation models.
Agriculture - Monitoring the biomass of land vegetation.
Forestry - Monitoring forest health, mapping soil moisture.
Botany - Forecasting crop yields.
Hydrology - Assessing water resources from snow, rainfall and underground
aquifers.
Disaster warning and assessment - Monitoring floods and landslides,
monitoring volcanic activity, assessing damage zones from natural disasters.
Planning applications - Mapping ecological zones, monitoring deforestation,
monitoring urban land use.
Oil and mineral exploration - Locating natural oil seeps and slicks, mapping
geological structures, monitoring oil field subsidence.
Military - Developing precise maps for planning, monitoring military
infrastructure, monitoring ship and troop movements.
Urban - Monitoring urban growth and land use.
Climate - Studying the effects of climate change on glaciers and Arctic and
Antarctic regions.
Sea - Monitoring the extent of flooding.
Rock - Recognizing and mapping rock types.
Space programs - Remote sensing is a backbone of satellite-based earth
observation programs.
Seismology - Detecting ground changes that may precede earthquakes.



Principles and Process of Remote Sensing

Remote sensing is actually done from satellites such as Landsat, from aircraft, or on
the ground. To repeat the essence of the definition above, remote sensing uses
instruments that house sensors to view the spectral, spatial and radiometric
relations of observable objects and materials at a distance. Most sensing modes
are based on sampling photons at corresponding frequencies in the
electromagnetic (EM) spectrum.
In much of remote sensing, the process involves an interaction between
incident radiation and the targets of interest. This is exemplified by the use of
imaging systems where the following seven elements are involved. Note,
however, that remote sensing also involves the sensing of emitted energy and the
use of non-imaging sensors.










i. Energy Source or Illumination (A) - The first requirement for remote
sensing is to have an energy source which illuminates or provides
electromagnetic energy to the target of interest.
ii. Radiation and the Atmosphere (B) - As the energy travels from its source
to the target, it will come in contact with and interact with the atmosphere it
passes through. This interaction may take place a second time as the energy
travels from the target to the sensor.

iii. Interaction with the Target (C) - Once the energy makes its way to the
target through the atmosphere, it interacts with the target depending on the
properties of both the target and the radiation.
iv. Recording of Energy by the Sensor (D) - After the energy has been
scattered by, or emitted from the target, we require a sensor (remote - not in
contact with the target) to collect and record the electromagnetic radiation.
v. Transmission, Reception, and Processing (E) - The energy recorded by the
sensor has to be transmitted, often in electronic form, to a receiving and
processing station where the data are processed.
vi. Interpretation and Analysis (F) - The processed image is interpreted,
visually and/or digitally or electronically, to extract information about the
target, which was illuminated.
vii. Application (G) - The final element of the remote sensing process is
achieved when we apply the information we have been able to extract from the
imagery about the target in order to better understand it, reveal some new
information, or assist in solving a particular problem.
Types of Remote Sensing System
1- Visual remote sensing system
The human visual system is an example of a
remote sensing system in the general sense. The
sensors in this example are the two types of
photosensitive cells, known as the cones and
the rods, at the retina of the eyes. The cones are
responsible for colour vision. There are three
types of cones, each being sensitive to one of the red, green, and blue regions of
the visible spectrum. Thus, it is not coincidental that the modern computer
display monitors make use of the same three primary colours to generate a
multitude of colours for displaying colour images. The cones are insensitive
under low-light conditions, when their job is taken over by the
rods. The rods are sensitive only to the total light intensity.
appears in shades of grey when there is insufficient light. As the objects/events
being observed are located far away from the eyes, the information needs a
carrier to travel from the object to the eyes. In this case, the information carrier
is the visible light, a part of the electromagnetic spectrum. The objects
reflect/scatter the ambient light falling onto them. Part of the scattered light is
intercepted by the eyes, forming an image on the retina after passing through the
optical system of the eyes. The signals generated at the retina are carried via the
nerve fibres to the brain, the central processing unit (CPU) of the visual system.
These signals are processed and interpreted at the brain, with the aid of previous
experiences. The visual system is an example of a "Passive Remote Sensing"
system which depends on an external source of energy to operate. We all know
that this system won't work in darkness.
2- Optical Remote Sensing
In Optical Remote Sensing, optical sensors detect
solar radiation reflected or scattered from the
earth, forming images resembling photographs
taken by a camera high up in space. The
wavelength region usually extends from the
visible and near infrared (VNIR) to the short-wave
infrared (SWIR). Different materials such as water, soil, vegetation, buildings and
roads reflect visible and infrared light in different ways. They have different
colours and brightness when seen under the sun. The interpretation of optical
images requires knowledge of the spectral reflectance signatures of the
various materials (natural or man-made) covering the surface of the earth.


3-Infrared Remote Sensing
Infrared remote sensing makes use of infrared sensors to detect infrared
radiation emitted from the Earth's surface. The middle-wave infrared (MWIR)
and long-wave infrared (LWIR) are within the thermal infrared region. These
radiations are emitted from warm objects such as the Earth's surface. They are
used in satellite remote sensing for measurements of the earth's land and sea
surface temperature. Thermal infrared remote sensing is also often used for
detection of forest fires, volcanoes, oil fires.







4-Microwave Remote Sensing
There are some remote sensing satellites which carry
passive or active microwave sensors. The active
sensors emit pulses of microwave radiation to
illuminate the areas to be imaged. Images of the
earth surface are formed by measuring the
microwave energy scattered by the ground or sea
back to the sensors. These satellites carry their own "flashlight" emitting
microwaves to illuminate their targets. The images can thus be acquired day
and night. Microwaves have an additional advantage as they can penetrate clouds.
Images can be acquired even when there are clouds covering the earth surface. A
microwave imaging system which can produce high resolution images of the Earth
is the synthetic aperture radar (SAR). Electromagnetic radiation in the
microwave wavelength region is used in remote sensing to provide useful
information about the Earth's atmosphere, land and ocean. When microwaves
strike a surface, the proportion of energy scattered back to the sensor depends
on many factors:
Physical factors such as the dielectric constant of the surface materials
which also depends strongly on the moisture content;
Geometric factors such as surface roughness, slopes, orientation of the
objects relative to the radar beam direction;
The types of landcover (soil, vegetation or man-made objects).
Microwave frequency, polarisation and incident angle.
5-Radar Remote Sensing
Using radar, geographers can effectively map out
the terrain of a territory. Radar works by sending
out radio signals, and then waiting for them to
bounce off the ground and return. By measuring the
amount of time it takes for the signals to return, it is
possible to create a very accurate topographic map.
An important advantage of using radar is that it can penetrate thick clouds and
moisture. This allows scientists to accurately map areas such as rain forests,
which are otherwise too obscured by clouds and rain. Imaging radar systems are
versatile sources of remotely sensed images, providing day-and-night, all-weather
imaging capability. Radar images are used to map landforms and geologic
structure, soil types, vegetation and crops, and ice and oil slicks on the ocean
surface.

Synthetic Aperture Radar (SAR)
In synthetic aperture radar (SAR) imaging, microwave pulses are transmitted by
an antenna towards the earth surface. The microwave energy scattered back to
the spacecraft is measured. The SAR makes use of the radar principle to form
an image by utilising the time delay of the backscattered signals. In real aperture
radar imaging, the ground resolution is limited by the size of the microwave
beam sent out from the antenna.






6-Satellite Remote Sensing
In this section, you will see many remote sensing images
acquired by earth observation satellites. These
remote sensing satellites are equipped with
sensors looking down to the earth. They are the
"eyes in the sky" constantly observing the earth as
they go round in predictable orbits. Orbital
platforms collect and transmit data from different
parts of the electromagnetic spectrum, which in
conjunction with larger scale aerial or ground-based sensing and analysis
provides researchers with enough information to monitor trends. Other uses
include different areas of the earth sciences such as natural resource
management, agricultural fields such as land usage and conservation, and
national security and overhead, ground-based and stand-off collection on border
areas.

How Satellites Acquire Images
Satellite sensors record the intensity of electromagnetic radiation (sunlight)
reflected from the earth at different wavelengths. Energy that is not reflected by
an object is absorbed. Each object has its own unique 'spectrum', some of which
are shown in the diagram below.





Remote sensing relies on the fact that particular features of the landscape such
as bush, crop, salt-affected land and water reflect light differently in different
wavelengths. Grass looks green, for example, because it reflects green light and
absorbs other visible wavelengths. This can be seen as a peak in the green band
in the reflectance spectrum for green grass above. The spectrum also shows that
grass reflects even more strongly in the infrared part of the spectrum. While this
can't be detected by the human eye, it can be detected by an infrared sensor.
Instruments mounted on satellites detect and record the energy that has been
reflected. The detectors are sensitive to particular ranges of wavelengths, called
'bands'. The satellite systems are characterised by the bands at which they
measure the reflected energy. The Landsat TM satellite, which provides the data
used in this project, has bands at the blue, green and red wavelengths in the
visible part of the spectrum, three bands in the near and mid infrared part
of the spectrum, and one band in the thermal infrared part of the spectrum. The
satellite detectors measure the intensity of the reflected energy and record it.

7- Airborne Remote Sensing
In airborne remote sensing, downward or sideward
looking sensors are mounted on an aircraft to obtain
images of the earth's surface. An advantage of
airborne remote sensing, compared to satellite
remote sensing, is the capability of offering very
high spatial resolution images (20 cm or less). The
disadvantages are low coverage area and high cost
per unit area of ground coverage. It is not cost-effective to map a large area
using an airborne remote sensing system. Airborne remote sensing missions are
often carried out as one-time operations, whereas earth observation satellites
offer the possibility of continuous monitoring of the earth.
8-Acoustic and near-acoustic remote sensing
Sonar: passive sonar, listening for the sound made
by another object (a vessel, a whale etc); active
sonar, emitting pulses of sound and listening
for echoes, used for detecting, ranging and
measurements of underwater objects and terrain.
Seismograms taken at different locations can locate and measure
earthquakes (after they occur) by comparing the relative intensity and
precise timing.




Electromagnetic Waves




Electromagnetic waves are energy transported through space in the form of
periodic disturbances of electric and magnetic fields. All electromagnetic waves
travel through space at the same speed, c = 2.99792458 x 10^8 m/s, commonly
known as the speed of light. An electromagnetic wave is characterized by its
frequency ν and wavelength λ, which are related by c = νλ.
Photons
According to quantum physics, the energy of an electromagnetic wave is
quantized, i.e. it can only exist in discrete amount. The basic unit of energy for
an electromagnetic wave is called a photon. The energy E of a photon is
proportional to the wave frequency ν,

E = hν = hc/λ

where the constant h is Planck's constant, h = 6.626 x 10^-34 J s.
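As a quick numerical check of this relation, the photon energy at any wavelength follows directly from the two constants quoted above. A minimal Python sketch (the function name and example wavelength are illustrative, not from the text):

```python
# Photon energy from wavelength: E = h*nu = h*c/lambda
H = 6.626e-34      # Planck's constant, J s (as quoted in the text)
C = 2.99792458e8   # speed of light, m/s (as quoted in the text)

def photon_energy(wavelength_m):
    """Return (frequency in Hz, photon energy in J) for a wavelength in metres."""
    nu = C / wavelength_m          # from c = nu * lambda
    return nu, H * nu

# Example: 500 nm, near the peak of the solar spectrum
nu, e = photon_energy(500e-9)
print(f"nu = {nu:.3e} Hz, E = {e:.3e} J")   # ~6.0e14 Hz, ~4.0e-19 J
```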
The frequency (and hence, the wavelength) of an electromagnetic wave depends
on its source. There is a wide range of frequency encountered in our physical
world, ranging from the low frequency of the electric waves generated by the
power transmission lines to the very high frequency of the gamma rays
originating from the atomic nuclei. These wide frequency ranges of
electromagnetic waves constitute the Electromagnetic Spectrum. The
electromagnetic spectrum can be divided into several wavelength (frequency)
regions, among which only a narrow band from about 400 to 700 nm is visible to
the human eye. Note that there is no sharp boundary between these regions. The
boundaries shown in the figures are approximate and there are overlaps between
two adjacent regions. Wavelength units: 1 mm = 1000 µm; 1 µm = 1000 nm.

1-Gamma Rays (<0.03 nm): This range is completely absorbed by the upper
atmosphere and not available for remote sensing.
2-X-Rays (0.03-30.0 nm): This range is completely absorbed by the atmosphere
and not employed in remote sensing.
3-Ultraviolet (0.03-0.40 µm):
i-Hard UV (0.03-0.3 µm): This range is completely absorbed by the
atmosphere and not employed in remote sensing.
ii-Photographic UV (0.30-0.40 µm): This range is not absorbed by the
atmosphere and is detectable with film and photodetectors, but suffers severe
atmospheric scattering.
4-Visible Light: This narrow band of electromagnetic radiation extends from
about 400 nm (violet) to about 700 nm (red). It is available for remote sensing of the
Earth and can be imaged with photographic film.
Violet: 400 - 430 nm
Indigo: 430 - 450 nm
Blue: 450 - 500 nm: Because water increasingly absorbs electromagnetic (EM)
radiation at longer wavelengths, band 1 provides the best data for mapping depth
detail of water-covered areas. It is also used for soil-vegetation discrimination,
forest mapping, and distinguishing cultural features
Green: 500 - 570 nm: The blue-green region of the spectrum corresponds to the
chlorophyll absorption of healthy vegetation and is useful for mapping detail such
as depth or sediment in water bodies. Cultural features such as roads and buildings
also show up well in this band.
Yellow: 570 - 590 nm
Orange: 590 - 610 nm
Red: 610 - 700 nm: Chlorophyll absorbs these wavelengths in healthy vegetation.
Hence, this band is useful for distinguishing plant species, as well as soil and
geologic boundaries
5-Infrared: 0.7 to 300 µm wavelength. This region is sensitive to plant water
content, which is a useful measure in studies of vegetation health. This band is also
used for distinguishing clouds, snow, and ice, mapping geologic formations and
soil boundaries. It is also responsive to plant and soil moisture content. This region
is further divided into the following bands:
a-Near Infrared (NIR): 0.7 to 1.5 µm.
b-Short Wavelength Infrared (SWIR): 1.5 to 3 µm.
c-Mid Wavelength Infrared (MWIR): 3 to 8 µm.
d-Long Wavelength Infrared (LWIR): 8 to 15 µm.
e-Far Infrared (FIR): longer than 15 µm.
The NIR and SWIR are also known as the Reflected Infrared, referring to the main
infrared component of the solar radiation reflected from the earth's surface. The
MWIR and LWIR are the Thermal Infrared.
6-Microwaves (Radar) 1 mm to 1 m wavelength. Microwaves can penetrate
clouds, fog, and rain. Images can be acquired in the active or passive mode. Radar
is the active form of microwave remote sensing. Radar images are acquired in
various wavelength bands.
7-Radio and TV Waves: 10 cm to 10 km wavelength. The longest-wavelength
portion of the electromagnetic spectrum.

Energy interaction with targets
When EM energy is incident upon any object
there are three fundamental energy interactions
that are possible. Various fractions of the
incident energy are reflected, absorbed and/or
transmitted.
Figure: Interaction of electromagnetic energy with a target.
By applying the principle of conservation of energy we can state the
interrelationship between these interactions as:

E_I = E_R + E_A + E_T

where E_I denotes the incident energy, E_R denotes the reflected energy,
E_A denotes the absorbed energy and E_T denotes the transmitted energy,
with all energy components being a function of wavelength. Three points
concerning this relationship should be noted: First, the proportions of energy
reflected, absorbed and transmitted will vary for different targets depending on
their material type and condition. These differences permit us to distinguish
between objects in an image; bright objects, such as sand, have higher
reflectance than dull objects, such as tarmac. Second, the proportion of
reflected, absorbed and transmitted energy for a target will vary with
wavelength. For example, an object with high absorption at green and red
wavelengths and high reflectance at blue wavelengths will appear with a
blue colour to the human eye (Figure a). Green objects such as grass have
higher reflectance at green wavelengths than at blue or red wavelengths (Figure
b). Finally, let us consider the relative reflectance from a yellow object. Yellow
is the product of EM energy at red and green wavelengths, hence a yellow
object will have high reflectance at these wavelengths and relatively high
absorption at blue wavelengths (Figure c). This is the principle of multispectral
reflectance. (Figure-Right) illustrates combinations of primary (visible)
wavelengths.
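A minimal sketch of this energy bookkeeping in Python, assuming the reflected, absorbed and transmitted components are expressed as fractions of the incident energy per band (the band values below are invented to mimic the yellow object described above):

```python
# Conservation of energy per wavelength band: E_I = E_R + E_A + E_T.
# Expressed as fractions of incident energy, R + A + T must equal 1.
yellow_object = {
    "blue":  {"R": 0.10, "A": 0.85, "T": 0.05},  # high absorption at blue
    "green": {"R": 0.70, "A": 0.25, "T": 0.05},  # high reflectance at green
    "red":   {"R": 0.70, "A": 0.25, "T": 0.05},  # high reflectance at red
}

for band, f in yellow_object.items():
    assert abs(f["R"] + f["A"] + f["T"] - 1.0) < 1e-9, f"{band}: energy not conserved"
    print(f"{band:5s}: reflectance {f['R']:.2f}")
# High reflectance at red and green with low blue reflectance is what
# the text describes as a yellow object.
```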










All remote sensing systems measure the fraction of reflected energy for specific
illumination and view angles. The full term for a measurement at a specified
geometry is the bidirectional reflectance, such that the set of measurements at
all geometries describes the bidirectional reflectance distribution function
BRDF. However, in this topic we will use the term reflectance for simplicity
and, unless otherwise stated, we will assume the sensor is viewing the target at nadir.
Before getting too carried away with the amazing powers of human eyesight
you should remember that your vision is restricted to the visible part of the
spectrum. You are unable to exploit differences in the reflectance of targets at
other wavelengths such as infrared or ultraviolet. The principle of conservation
of energy applies at all wavelengths, however, and therefore by building
instruments that record the level of reflected radiation we are able to exploit the
information content across the entire EM spectrum. Note also that in order to
interpret multispectral images we need to understand the reflectance, absorption
and transmittance properties of typical Earth surfaces, such as soil, water and
vegetated surfaces, at these wavelengths.



Solar Irradiation
Optical remote sensing depends on the sun as the sole source of illumination.
The solar irradiation spectrum above the atmosphere can be modelled by a black
body radiation spectrum having a source temperature of 5900 K, with a peak
irradiation located at about 500 nm wavelength. Physical measurement of the
solar irradiance has also been performed using ground based and spaceborne
sensors. After passing through the atmosphere, the solar irradiation spectrum at
the ground is modulated by the atmospheric transmission windows. Significant
energy remains only within the wavelength range from about 0.25 to 3 µm, as
shown in the figure.
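The quoted 5900 K source temperature and ~500 nm irradiation peak are consistent with Wien's displacement law, which is standard black-body physics rather than something stated in the text. A one-line check in Python:

```python
# Wien's displacement law: lambda_max = b / T
B_WIEN = 2.898e-3           # Wien's constant, m*K
print(B_WIEN / 5900 * 1e9)  # ~491 nm, close to the ~500 nm peak quoted
```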








The Earth's Atmosphere
The earth's surface is covered by a layer of atmosphere consisting of a mixture
of gases and other solid and liquid particles. The gaseous materials extend to
several hundred kilometres in altitude, though there is no well defined boundary
for the upper limit of the atmosphere. The first 80 km of the atmosphere
contains more than 99% of the total mass of the earth's atmosphere. The vertical
profile of the atmosphere is divided into four layers: troposphere,
stratosphere, mesosphere and thermosphere. The tops of these layers are
known as the tropopause, stratopause, mesopause and thermopause,
respectively.
Troposphere: This layer is characterized by a decrease in temperature
with respect to height, at a rate of about 6.5 °C per kilometre, up to a
height of about 10 km. All the weather activities (water vapour, clouds,
precipitation) are confined to this layer. A layer of aerosol particles
normally exists near to the earth surface. The aerosol concentration
decreases nearly exponentially with height, with a characteristic height of
about 2 km. The term upper atmosphere usually refers to the region of
the atmosphere above the troposphere.
Stratosphere: The temperature at the lower 20 km of the stratosphere is
approximately constant, after which the temperature increases with
height, up to an altitude of about 50 km. Ozone exists mainly in the
stratosphere. The troposphere and the stratosphere together account for
more than 99% of the total mass of the atmosphere.
Mesosphere: The temperature decreases in this layer from an altitude of
about 50 km to 85 km.







Thermosphere: This layer extends from about 85 km upward to several
hundred kilometres. The temperature may range from 500 K to 2000 K.
The gases exist mainly in the form of thin plasma, i.e. they are ionized
due to bombardment by solar ultraviolet radiation and energetic cosmic
rays. Many remote sensing satellites follow the near polar sun-
synchronous orbits at a height around 800 km, which is well above the
thermopause.

Atmospheric Constituents
When electromagnetic radiation travels through the atmosphere, it may be
absorbed or scattered by the constituent particles of the atmosphere. Molecular
absorption converts the radiation energy into excitation energy of the molecules.
Scattering redistributes the energy of the incident beam to all directions. The
overall effect is the removal of energy from the incident radiation. The various
effects of absorption and scattering are outlined in the following sections. The
atmosphere consists of the following components:
Permanent Gases: They are gases present in nearly constant
concentration, with little spatial variation. About 78% by volume of the
atmosphere is nitrogen while the life-sustaining oxygen occupies 21%.
The remaining one percent consists of the inert gases, carbon dioxide and
other gases.
Gases with Variable Concentration: The concentration of these gases
may vary greatly over space and time. They consist of water vapour,
ozone, nitrogenous and sulphurous compounds.

Solid and liquid particulates: Other than the gases, the atmosphere also
contains solid and liquid particles such as aerosols, water droplets and ice
crystals. These particles may congregate to form clouds and haze.
Ozone Layers: Ozone in the stratosphere absorbs about 99% of the
harmful solar UV radiation shorter than 320 nm. It is formed in three-
body collisions of atomic oxygen (O) with molecular oxygen (O2) in the
presence of a third atom or molecule. The ozone molecules also undergo
photochemical dissociation to atomic O and molecular O2. When the
formation and dissociation processes are in equilibrium, ozone exists at a
constant concentration level. However, existence of certain atoms (such
as atomic chlorine) will catalyse the dissociation of O3 back to O2 and the
ozone concentration will decrease. It has been observed by measurement
from space platforms that the ozone layers are depleting over time,
causing a small increase in solar ultraviolet radiation reaching the earth.
In recent years, increasing use of the fluorocarbon compounds in aerosol
sprays and refrigerants has resulted in the release of atomic chlorine into the
upper atmosphere due to photochemical dissociation of the fluorocarbon
compounds, contributing to the depletion of the ozone layers.
Absorption by Gaseous Molecules
The energy of a gaseous molecule can exist in various forms:
Translational Energy: Energy due to translational motion of the centre
of mass of the molecule. The average translational kinetic energy of a
molecule is equal to 3kT/2, where k is Boltzmann's constant and T is
the absolute temperature of the gas.
Rotational Energy: Energy due to rotation of the molecule about an axis
through its centre of mass.

Vibrational Energy: Energy due to vibration of the component atoms of
a molecule about their equilibrium positions. This vibration is associated
with stretching of chemical bonds between the atoms.
Electronic Energy: Energy due to the energy states of the electrons of
the molecule.
The last three forms are quantized, i.e. the energy can change only in discrete
amounts, known as transition energies. A photon of electromagnetic
radiation can be absorbed by a molecule when its energy hν matches one of the
available transition energies.

Solar Radiation in the Atmosphere
In satellite remote sensing of the earth, the sensors are looking through a layer
of atmosphere separating the sensors from the Earth's surface being observed.
Hence, it is essential to understand the effects of atmosphere on the
electromagnetic radiation travelling from the Earth to the sensor through the
atmosphere. The atmospheric constituents cause wavelength dependent
absorption and scattering of radiation. These effects degrade the quality of
images. Some of the atmospheric effects can be corrected before the images are
subjected to further analysis and interpretation.
A consequence of atmospheric absorption is that certain wavelength bands in
the electromagnetic spectrum are strongly absorbed and effectively blocked by
the atmosphere. The wavelength regions in the electromagnetic spectrum usable
for remote sensing are determined by their ability to penetrate the atmosphere.
These regions are known as the atmospheric transmission windows. Remote
sensing systems are often designed to operate within one or more of the
atmospheric windows. These windows exist in the microwave region, some
wavelength bands in the infrared, the entire visible region and part of the near
ultraviolet regions.

Atmosphere Effects
Our eyes inform us that the atmosphere is essentially transparent to light, and
we tend to assume that this condition exists for all Electromagnetic radiation. In
fact, however, the gases of the atmosphere selectively scatter light of different
wavelengths. The gases also absorb Electromagnetic energy at specific
wavelength intervals called absorption bands. The intervening regions of high
energy transmittance are called atmospheric transmission bands, or windows.
The transmission and absorption bands are shown in the following figure,
together with the gases responsible for the absorption bands. Particles and gases
in the atmosphere can affect the incoming light and radiation. These effects are
caused by the mechanisms of, Transmittance, Scattering and Absorption.
1. Transmittance - Some radiation penetrates through the atmosphere, water, or
other materials.
Atmospheric Transmission Windows
Each type of molecule has its own set of absorption bands in various parts of the
electromagnetic spectrum. As a result, only the wavelength regions outside the
main absorption bands of the atmospheric gases can be used for remote sensing.
These regions are known as the Atmospheric Transmission Windows. The
wavelength bands used in remote sensing systems are usually designed to fall
within these windows to minimize the atmospheric absorption effects. These
windows are found in the visible, near-infrared, certain bands in thermal infrared
and the microwave regions.




2. Scattering: Scattering occurs when particles or large gas molecules present
in the atmosphere interact with and cause the electromagnetic radiation to be
redirected from its original path. How much scattering takes place depends on
several factors including the wavelength of the radiation, the abundance of
particles or gases, and the distance the radiation travels through the atmosphere.
There are three types of scattering which take place: Rayleigh, Mie, and non-
selective scattering, which involve the absorption and re-emission of EM energy by
particles without a change in wavelength.
i-Rayleigh scattering: occurs when particles are very small compared to the
wavelength of the radiation. These could be particles such as small specks of
dust or nitrogen and oxygen molecules. Rayleigh scattering causes shorter
wavelengths of energy to be scattered much more than longer wavelengths.
Rayleigh scattering is the dominant scattering mechanism in the upper
atmosphere. The fact that the sky appears "blue" during the day is because of
this phenomenon. As sunlight passes through the atmosphere, the shorter
wavelengths (i.e. blue) of the visible spectrum are scattered more than the other
(longer) visible wavelengths. At sunrise and sunset the light has to travel farther
through the atmosphere than at midday and the scattering of the shorter
wavelengths is more complete; this leaves a greater proportion of the longer
wavelengths to penetrate the atmosphere (thus the sky is painted in red).
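The strength of Rayleigh scattering falls off as the fourth power of wavelength (the standard lambda^-4 law, not stated explicitly above), so the blue/red contrast can be quantified directly:

```python
# Rayleigh scattering intensity is proportional to 1/lambda**4,
# so blue light (~450 nm) scatters far more strongly than red (~650 nm).
blue, red = 450e-9, 650e-9
print((red / blue) ** 4)   # ~4.3: blue is scattered about four times as much
```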
ii-Mie scattering: occurs when the particles are just about the same size as
the wavelength of the radiation. Dust, pollen, smoke and water vapour are
common causes of Mie scattering which tends to affect longer wavelengths than
those affected by Rayleigh scattering. Mie scattering occurs mostly in the lower
portions of the atmosphere where larger particles are more abundant, and
dominates when cloud conditions are overcast.


iii- Nonselective scattering
This occurs when the particles are much larger than the wavelength of the
radiation. Water droplets and large dust particles can cause this type of
scattering. Nonselective scattering gets its name from the fact that all
wavelengths are scattered about equally. This type of scattering causes fog and
clouds to appear white to our eyes because blue, green, and red light are all
scattered in approximately equal quantities (blue+ green+ red light = white
light).

3. Absorption
Absorption is the other main mechanism at work when electromagnetic
radiation interacts with the atmosphere. Some radiation is absorbed through
electron or molecular reactions within the medium encountered; a portion of the
energy incorporated can then be re-emitted (as emittance), largely at longer
wavelengths, so that some of the sun's radiant energy engages in heating the
target, giving rise to a thermal response. In contrast to scattering, this
phenomenon causes molecules in the atmosphere to absorb energy at various
wavelengths. Ozone, carbon dioxide, and water vapour are the three main
atmospheric constituents which absorb radiation. Any effort to measure the
spectral properties of a material through a planetary atmosphere must consider
where the atmosphere absorbs.
I- X-Ray and Gamma Ray Absorption
This range of X-rays and gamma rays is completely absorbed by the atmosphere
and does not reach the earth.
II-Ultraviolet Absorption
Absorption of ultraviolet (UV) in the atmosphere is chiefly due to electronic
transitions of the atomic and molecular oxygen and nitrogen. Due to the
ultraviolet absorption, some of the oxygen and nitrogen molecules in the upper
atmosphere undergo photochemical dissociation to become atomic oxygen and
nitrogen. These atoms play an important role in the absorption of solar
ultraviolet radiation in the thermosphere. The photochemical dissociation of
oxygen is also responsible for the formation of the ozone layer in the
stratosphere.

III-Visible Region Absorption
There is little absorption of the electromagnetic radiation in the visible part of
the spectrum.
IV-Infrared Absorption
The absorption in the infrared (IR) region is mainly due to rotational and
vibrational transitions of the molecules. The main atmospheric constituents
responsible for infrared absorption are water vapour (H2O) and carbon dioxide
(CO2) molecules. The water and carbon dioxide molecules have absorption
bands centred at wavelengths from the near to the long wave infrared (0.7 to 15
µm). In the far infrared region, most of the radiation is absorbed by the
atmosphere.
V-Microwave Region Absorption
The atmosphere is practically transparent to the microwave radiation.





Spectral Reflectance Signature
When solar radiation hits a target surface, it may be transmitted, absorbed or
reflected. Different materials reflect and absorb differently at different
wavelengths. Some radiation is reflected away from the target at different angles
(depending in part on surface "roughness" as well as on the angle of the sun's
direct rays relative to surface inclination), and some is directed back in line
with the observing sensor.
reflected radiation. The reflectance spectrum of a material is a plot of the
fraction of radiation reflected as a function of the incident wavelength and
serves as a unique signature for the material. In principle, a material can be
identified from its spectral reflectance signature if the sensing system has
sufficient spectral resolution to distinguish its spectrum from those of other
materials. This premise provides the basis for multispectral remote sensing.
The following graph shows the typical reflectance spectra of five materials:
clear water, turbid water, bare soil and two types of vegetation.


Fig. Reflectance Spectrum of Five Types of Landcover

The reflectance of clear water is generally low. However, the reflectance is
maximum at the blue end of the spectrum and decreases as wavelength
increases. Hence, clear water appears dark-bluish. Turbid water has some
sediment suspension which increases the reflectance in the red end of the
spectrum, accounting for its brownish appearance. The reflectance of bare soil
generally depends on its composition. In the example shown, the reflectance
increases monotonically with increasing wavelength. Hence, it should appear
yellowish-red to the eye. Vegetation has a unique spectral signature which
enables it to be distinguished readily from other types of land cover in an
optical/near-infrared image. The reflectance is low in both the blue and red
regions of the spectrum, due to absorption by chlorophyll for photosynthesis. It
has a peak at the green region which gives rise to the green colour of vegetation.
In the near infrared (NIR) region, the reflectance is much higher than that in
the visible band due to the cellular structure in the leaves. Hence, vegetation can
be identified by the high NIR but generally low visible reflectance. This
property was used in early reconnaissance missions during wartime for
"camouflage detection".
The shape of the reflectance spectrum can be used for identification of
vegetation type. For example, the reflectance spectra of vegetation 1 and 2 in
the above figure can be distinguished although they exhibit the general
characteristics of high NIR but low visible reflectances. Vegetation 1 has
higher reflectance in the visible region but lower reflectance in the NIR region.
For the same vegetation type, the reflectance spectrum also depends on other
factors such as the leaf moisture content and health of the plants.
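This contrast between low red and high NIR reflectance of vegetation is the basis of vegetation indices such as the NDVI listed in the glossary. A minimal sketch, with reflectance values invented for illustration:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + 1e-12)   # epsilon avoids division by zero

# Hypothetical band reflectances
print(ndvi(0.50, 0.08))   # ~0.72  healthy vegetation: high NIR, low red
print(ndvi(0.30, 0.20))   # ~0.20  bare soil
print(ndvi(0.02, 0.05))   # ~-0.43 clear water
```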





Image Processing
Pictures are the most common and convenient means of conveying or
transmitting information. A picture is worth a thousand words. Pictures
concisely convey information about positions, sizes and inter-relationships
between objects. They portray spatial information that we can recognize as
objects. Human beings are good at deriving information from such images,
because of our innate visual and mental abilities.
Analog and Digital Images
An image is a two-dimensional representation of objects in a real scene. Remote
sensing images are representations of parts of the earth surface as seen from
space. The images may be analog or digital. Aerial photographs are examples of
analog images while satellite images acquired using electronic sensors are
examples of digital images. A digital image is a two-dimensional array of pixels.
Each pixel has an intensity value (represented by a digital number) and a
location address (referenced by its row and column numbers).
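In array terms, each (row, column) address holds one digital number. A tiny NumPy illustration (the values are arbitrary):

```python
import numpy as np

# A 3x4 digital image: one 8-bit digital number per pixel.
image = np.array([[ 12,  50, 200, 255],
                  [  0,  80, 140, 230],
                  [ 30,  60,  90, 120]], dtype=np.uint8)

row, col = 1, 2            # the location address of a pixel
print(image[row, col])     # its intensity value: 140
print(image.shape)         # (rows, columns) = (3, 4)
```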









Pixels
A digital image comprises a two-dimensional array of individual picture
elements called pixels arranged in columns and rows. Each pixel represents an
area on the Earth's surface. A pixel has an intensity value and a location
address in the two-dimensional image.
The intensity value represents the measured physical quantity such as the solar
radiance in a given wavelength band reflected from the ground, emitted infrared
radiation or backscattered radar intensity. This value is normally the average
value for the whole ground area covered by the pixel.
The intensity of a pixel is digitised and recorded as a digital number. Due to the
finite storage capacity, a digital number is stored with a finite number of bits
(binary digits). The number of bits determines the radiometric resolution of
the image. For example, an 8-bit digital number ranges from 0 to 255 (i.e. 2^8 - 1),
while an 11-bit digital number ranges from 0 to 2047 (2^11 - 1). The detected intensity
value needs to be scaled and quantized to fit within this range of value. In a
Radiometrically Calibrated Image, the actual intensity value can be derived
from the pixel digital number.
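A sketch of how a continuous intensity value might be scaled and quantized to an n-bit digital number. The linear calibration below is an assumption for illustration; real sensors publish their own gain and offset:

```python
import numpy as np

def quantize(radiance, r_min, r_max, bits=8):
    """Linearly scale a radiance into [0, 2**bits - 1] and round to a digital number."""
    levels = 2 ** bits - 1                    # 255 for 8 bits, 2047 for 11 bits
    scaled = (radiance - r_min) / (r_max - r_min) * levels
    return np.clip(np.round(scaled), 0, levels).astype(int)

# Hypothetical sensor calibrated for radiances from 0 to 300 units
print(quantize(150.0, 0.0, 300.0, bits=8))    # 128 on the 0..255 scale
print(quantize(150.0, 0.0, 300.0, bits=11))   # 1024 on the 0..2047 scale
```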
Multilayer Image
Several types of measurement may be made from the ground area covered by a
single pixel. Each type of measurement forms images which carry some specific
information about the area. By "stacking" these images from the same area
together, a multilayer image is formed. Each component image is a layer in the
multilayer image. Multilayer images can also be formed by combining images
obtained from different sensors, and other subsidiary data. For example, a
multilayer image may consist of three layers from a SPOT multispectral image,
a layer of synthetic aperture radar SAR image, and perhaps a layer consisting of
the digital elevation map of the area being studied.
Fig: An illustration of a multilayer image consisting of five component layers
(green, red and NIR bands, a SAR layer and a digital elevation layer).
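Stacking co-registered layers is a simple array operation. A minimal sketch assuming three spectral bands, a SAR layer and a DEM of the same size (all synthetic here):

```python
import numpy as np

rows, cols = 100, 100
# Synthetic co-registered layers; in practice these come from different sensors.
green, red, nir = (np.random.rand(rows, cols) for _ in range(3))
sar = np.random.rand(rows, cols)
dem = np.random.rand(rows, cols) * 500.0     # elevations in metres

# Stack along a third axis to form the multilayer image.
multilayer = np.dstack([green, red, nir, sar, dem])
print(multilayer.shape)        # (100, 100, 5): five measurements per ground cell
print(multilayer[42, 7, :])    # all five layer values for one pixel
```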


Multispectral Images
A multispectral image consists of several bands of data. For visual display, each
band of the image may be displayed one band at a time as a grey scale image, or in
combinations of three bands at a time as a colour composite image. Interpretation of
a multispectral colour composite image will require the knowledge of the spectral
reflectance signature of the targets in the scene. In this case, the spectral
information content of the image is utilized in the interpretation. The following
three images show the three bands of a multispectral image extracted from a SPOT
multispectral scene at a ground resolution of 20 m. The area covered is the same as
that shown in the above panchromatic image. Note that both the XS1 (green) and
XS2 (red) bands look almost identical to the panchromatic image shown above. In
contrast, the vegetated areas now appear bright in the XS3 (NIR) band due to high
reflectance of leaves in the near infrared wavelength region. Several shades of grey
can be identified for the vegetated areas, corresponding to different types of
vegetation. Water masses (both the river and the sea) appear dark in the XS3 (near
IR) band.







Superspectral Image
The more recent satellite sensors are capable of acquiring images at many more
wavelength bands. For example, some satellite sensors acquire 36 spectral bands,
covering the wavelength regions ranging from the visible, near infrared, short-
wave infrared to the thermal infrared. The bands have narrower bandwidths,
enabling the finer spectral characteristics of the targets to be captured by the
sensor. The term "superspectral" has been coined to describe such sensors.

Hyperspectral Image
A hyperspectral image consists of about a hundred or more contiguous spectral
bands forming a three-dimensional (two spatial dimensions and one spectral
dimension) image cube. The characteristic spectrum of the target pixel is
acquired in a hyperspectral image. The precise spectral information contained in
a hyperspectral image enables better characterisation and identification of
targets. Hyperspectral images have potential applications in such fields as
precision agriculture (e.g. monitoring the types, health, moisture status and
maturity of crops), and coastal management (e.g. monitoring of phytoplankton,
pollution and bathymetry changes).









Images Resolutions
The quality of remote sensing data is characterized by its spectral, radiometric, spatial
and temporal resolutions.
1-Spatial Resolution
Spatial resolution refers to the size of the smallest object that can be resolved on
the ground. In a digital image, the resolution is limited by the pixel size, i.e. the
smallest resolvable object cannot be smaller than the pixel size. The intrinsic
resolution of an imaging system is determined primarily by the instantaneous field
of view (IFOV) of the sensor, which is a measure of the ground area viewed by a
single detector element at a given instant in time. However, this intrinsic resolution
can often be degraded by other factors which introduce blurring of the image, such
as improper focusing, atmospheric scattering and target motion. The pixel size is
determined by the sampling distance.
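For a rough sense of how the IFOV and platform altitude set the ground resolution cell, a small-angle approximation can be used (the numbers below are invented for illustration):

```python
def ground_cell_size(ifov_rad, altitude_m):
    """Approximate side of the ground area seen by one detector element.
    For small angles, cell size ~ altitude * IFOV (IFOV in radians)."""
    return altitude_m * ifov_rad

# Hypothetical sensor: 0.0125 mrad IFOV at 800 km altitude
print(ground_cell_size(0.0125e-3, 800e3))   # 10.0 m ground cell
```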
A "High Resolution" image refers to one with a small resolution size. Fine details
can be seen in a high resolution image. On the other hand, a "Low Resolution"
image is one with a large resolution size, i.e. only coarse features can be observed
in the image. An image sampled at a small pixel size does not necessarily have a
high resolution. The following three images illustrate this point. The first image is
a SPOT image of 10 m pixel size. It was derived by merging a SPOT
panchromatic image of 10 m resolution with a SPOT multispectral image of 20
m resolution. The merging procedure "colours" the panchromatic image using the
colours derived from the multispectral image. The effective resolution is thus
determined by the resolution of the panchromatic image, which is 10 m. This
image is further processed to degrade the resolution while maintaining the same
pixel size. The next two images are the blurred versions of the image with larger
resolution size, but still digitized at the same pixel size of 10 m. Even though they
have the same pixel size as the first image, they do not have the same resolution.
[Figures: 10 m resolution, 10 m pixel size; 30 m resolution, 10 m pixel size;
80 m resolution, 10 m pixel size]
The following images illustrate the effect of pixel size on the visual appearance of
an area. The first image is a SPOT image of 10 m pixel size derived by merging a
SPOT panchromatic image with a SPOT multispectral image. The subsequent
images show the effects of digitizing the same area with larger pixel sizes.
[Figures: pixel size = 10 m (160x160 pixels); 20 m (80x80 pixels); 80 m (20x20 pixels)]
2-Radiometric Resolution
Radiometric Resolution refers to the smallest change in intensity level that can
be detected by the sensing system. The intrinsic radiometric resolution of a
sensing system depends on the signal to noise ratio of the detector. In a digital
image, the radiometric resolution is limited by the number of discrete
quantization levels used to digitize the continuous intensity value. The
following images illustrate the effects of the number of quantization levels on
the digital image. The first image is a SPOT panchromatic image quantized at 8
bits (i.e. 256 levels) per pixel. The subsequent images show the effects of
degrading the radiometric resolution by using fewer quantization levels.
[Figures: 8-bit quantization (256 levels); 2-bit quantization (4 levels);
1-bit quantization (2 levels)]
3-Spectral resolution
The wavelength width of the different frequency bands recorded;
usually, this is related to the number of frequency bands recorded by the
platform. Current Landsat collection is that of seven bands, including
several in the infrared spectrum, with spectral resolutions ranging from
0.07 to 2.1 µm. The Hyperion sensor on Earth Observing-1 resolves 220
bands from 0.4 to 2.5 µm, with a spectral resolution of about 0.01 µm
per band.

4-Temporal resolution
The frequency of flyovers by the satellite or plane; this is only relevant in
time-series studies or those requiring an averaged or mosaic image, as in
deforestation monitoring. This was first used by the intelligence community,
where repeated coverage revealed changes in infrastructure, the
deployment of units or the modification/introduction of equipment. Cloud
cover over a given area or object makes it necessary to repeat the
collection of data for that location.




Visual Interpretation
Analysis of remote sensing imagery involves the identification of various
targets in an image, and those targets may be environmental or artificial
features, which consist of points, lines, or areas. Targets may be defined in
terms of the way they reflect or emit radiation. This radiation is measured and
recorded by a sensor, and ultimately is depicted as an image product such as an
air photo or satellite image. Observing the differences between targets and their backgrounds involves
comparing different targets based on any, or all, of the visual elements of tone,
shape, size, pattern, texture, shadow, and association.
1-Tone refers to the relative brightness or
colour of objects in an image. Generally, tone is the
fundamental element for distinguishing between
different targets or features. Variations in tone
also allow the elements of shape, texture, and
pattern of objects to be distinguished.
2-Shape refers to the general form, structure, or
outline of individual objects. Shape can be a very
distinctive clue for interpretation. Straight edge
shapes typically represent urban or agricultural
(field) targets, while natural features, such as
forest edges, are generally more irregular in
shape, except where man has created a road or
clear cuts. Farm or crop land irrigated by rotating
sprinkler systems would appear as circular shapes.
3-Size of objects in an image is a function of scale. It is important to assess the
size of a target relative to other objects in a scene, as well as the absolute size,
to aid in the interpretation of that target. A quick approximation of target size
can direct interpretation to an appropriate result
more quickly. For example, if an interpreter
had to distinguish zones of land use, and had
identified an area with a number of buildings in
it, large buildings such as factories or
warehouses would suggest commercial
property, whereas small buildings would
indicate residential use.

4-Pattern refers to the spatial arrangement of
visibly discernible objects. Typically an orderly
repetition of similar tones and textures will
produce a distinctive and ultimately
recognizable pattern. Orchards with evenly
spaced trees and urban streets with regularly
spaced houses are good examples of pattern.

5-Texture refers to the arrangement and frequency of tonal variation in
particular areas of an image. Rough textures would consist of a mottled tone
where the grey levels change abruptly in a small area, whereas smooth
textures would have very little tonal variation. Smooth textures are most often
the result of uniform, even surfaces, such as
fields, asphalt, or grasslands. A target with a
rough surface and irregular structure, such as
a forest canopy, results in a rough textured
appearance. Texture is one of the most
important elements for distinguishing
features in radar imagery.

6-Shadow is also helpful in interpretation as it may provide an idea of the
profile and relative height of a target or targets which may make identification
easier. However, shadows can also reduce or
eliminate interpretation in their area of
influence, since targets within shadows are
much less (or not at all) discernible from their
surroundings. Shadow is also useful for
enhancing or identifying topography and
landforms, particularly in radar imagery.

7-Association takes into account the relationship between other recognizable
objects or features in proximity to the target of interest. The identification of
features that one would expect to associate with other features may provide
information to facilitate identification. In the
example given above, commercial properties
may be associated with proximity to major
transportation routes, whereas residential
areas would be associated with schools,
playgrounds, and sports fields. In our
example, a lake is associated with boats, a
marina, and adjacent recreational land.







Image Correction
1-Radiometric correction
Gives a scale to the pixel values, e.g. the monochromatic scale of 0 to
255 will be converted to actual radiance values.
2-Atmospheric correction
Eliminates atmospheric haze by rescaling each frequency band so that its
minimum value (usually realised in water bodies) corresponds to a pixel
value of 0. The digitizing of data also makes it possible to manipulate the
data by changing grey-scale values.
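A minimal sketch of the haze-removal rescaling described above, often called dark-object subtraction, assuming each band is held as a NumPy array of digital numbers:

```python
import numpy as np

def dark_object_subtraction(band):
    """Shift a band so its minimum value (e.g. found over clear water)
    maps to a pixel value of 0, removing a constant additive haze term."""
    return band.astype(float) - band.min()

# Tiny synthetic band whose darkest pixel (17) stands in for haze over water
band = np.array([[ 17,  60, 120],
                 [ 45,  90, 200],
                 [ 33,  17, 150]])
print(dark_object_subtraction(band))   # the darkest pixels become 0
```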


Glossary
Albedo: Ratio of the amount of electromagnetic energy (solar radiation)
reflected by a surface to the amount of energy incident upon the surface.
ASTER: Advanced Spaceborne Thermal Emission and Reflection Radiometer.
AVHRR: Advanced very high-resolution radiometer.
AVIRIS: Airborne visible-infrared imaging spectrometer.
Band: Broadcasting frequency within given limits.
Bandwidth: The total range of frequency required to pass a specific modulated
signal without distortion or loss of data.
CEO: Center for Observing the Earth from Space at Yale University
ETM+: Enhanced Thematic Mapper Plus
EM: Electromagnetic
GPS: Global Positioning System
GIS: Geographic Information System
IFOV: Instantaneous field of view: the solid angle through which a detector is
sensitive to radiation
IKONOS: A high-resolution earth observation satellite launched in 1999,
which occupies a 682-km sun synchronous orbit and employs linear array
technology collecting data in four multispectral bands at a nominal resolution of
4 m, as well as a 1-m-resolution panchromatic band.
Landsat: A series of unmanned NASA earth resource satellites that acquire
multispectral images in the visible and IR bands.
NAD: North American Datum
NDVI: Normalized Difference Vegetation Index
NIR: Near Infrared
Passive remote sensing: Remote sensing of energy naturally reflected or radiated
from the terrain.
Radiation: Act of giving off electromagnetic energy.
RGB: Red, Green, and Blue; the colors used in constructing visible and false
color image representations.
MIR: Mid Infrared
Spatial Resolution: The ability to distinguish between closely spaced objects
on an image. Commonly expressed as the most closely spaced line-pairs per
unit distance distinguishable.
Spectral Reflectance: Reflectance of electromagnetic energy at specified
wavelength intervals.
Spectral Resolution: Range of wavelengths recorded by a detector.
SWIR: Short Wave Infrared
TM: Thematic Mapper
UTM: Universal Transverse Mercator
VI: Vegetation Index
VNIR: Visible Near Infrared
WGS: World Geodetic System
WRS: Worldwide Reference System

Geographic Information System (GIS)
A Geographic Information System (GIS) integrates hardware, software, and
data for capturing, managing, analyzing, and displaying all forms of
geographically referenced information. GIS also allows the integration of these
data sets for deriving meaningful information and outputting the information
derivatives in map format or tabular format.
Three Views of a GIS
A GIS can be viewed in three ways:
1) The Database View: A GIS is a unique kind of database of the world: a geographic database (geodatabase). It is an "Information System for Geography." Fundamentally, a GIS is based on a structured database that describes the world in geographic terms.
2) The Map View: A GIS is a set of intelligent maps and other views that show
features and feature relationships on the earth's surface. Maps of the underlying
geographic information can be constructed and used as "windows into the
database" to support queries, analysis, and editing of the information.
3) The Model View: A GIS is a set of information transformation tools that
derive new geographic datasets from existing datasets. These geo-processing
functions take information from existing datasets, apply analytic functions, and
write results into new derived datasets.
By combining data and applying analytic rules, we can create a model that helps answer the questions posed (a minimal geoprocessing sketch follows).
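To make the model view concrete, here is a minimal geoprocessing sketch in Python with NumPy. The two input rasters are invented stand-ins for real co-registered layers (in practice they would be read from, and written back to, geodatabase or GeoTIFF files), and the analytic rule is purely illustrative.

import numpy as np

# Hypothetical co-registered raster layers on the same 4x4 grid
slope_deg = np.array([[ 2,  5, 12, 30],
                      [ 3,  8, 15, 25],
                      [ 1,  4,  9, 18],
                      [ 0,  2,  6, 11]], dtype=float)

dist_to_river_m = np.array([[900, 400, 150,  60],
                            [800, 350, 120,  40],
                            [700, 300, 100,  30],
                            [650, 250,  80,  20]], dtype=float)

# Illustrative analytic rule: a cell is suitable for building if the
# slope is below 10 degrees and it lies at least 100 m from the river.
suitable = (slope_deg < 10.0) & (dist_to_river_m >= 100.0)

# The derived dataset: a new layer ready to be written out and mapped
print(suitable.astype(int))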

Global Positioning System (GPS)
The Global Positioning System (GPS) is a space-based global navigation
satellite system (GNSS) that provides reliable location and time information in
all weather and at all times and anywhere on or near the Earth when and where
there is an unobstructed line of sight to four or more GPS satellites. It is
maintained by the United States government and is freely accessible by anyone
with a GPS receiver.
GPS was created and realized by the U.S. Department of Defense (DoD), which began the project in 1973 to overcome the limitations of previous navigation systems; it was originally run with 24 satellites.
Basic concept of GPS
A GPS receiver calculates its position by precisely timing the signals sent by
GPS satellites high above the Earth. Each satellite continually transmits
messages that include:
- the time the message was transmitted;
- precise orbital information (the ephemeris);
- the general system health and rough orbits of all GPS satellites (the almanac).
The receiver uses the messages it receives to determine the transit time of each
message and computes the distance to each satellite. These distances along with
the satellites' locations are used with the possible aid of trilateration, depending
on which algorithm is used, to compute the position of the receiver. This
position is then displayed, perhaps with a moving map display or latitude and
longitude; elevation information may be included. Many GPS units show
derived information such as direction and speed, calculated from position
changes.
Three satellites might seem enough to solve for position, since space has three dimensions and a position near the Earth's surface can be assumed. However, even a very small clock error, multiplied by the very large speed of light (the speed at which satellite signals propagate), results in a large positional error.
Therefore receivers use four or more satellites to solve for the receiver's location
and time. The very accurately computed time is effectively hidden by most GPS
applications, which use only the location. A few specialized GPS applications
do however use the time; these include time transfer, traffic signal timing, and
synchronization of cell phone base stations.
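To illustrate the four-satellite solve described above, the sketch below estimates receiver position and clock bias from pseudoranges by Gauss-Newton least squares. The satellite positions and receiver location are invented for the example; a real receiver must additionally correct for ionospheric delay, orbit errors and relativistic effects, so this is a toy of the geometry only.

import numpy as np

C = 299_792_458.0  # speed of light, m/s

def solve_position(sat_pos, pseudoranges, iters=10):
    """Estimate receiver position (m) and clock bias (s) from >= 4 satellites
    by Gauss-Newton iteration on the pseudorange equations:
        rho_i = ||p - s_i|| + c*b
    """
    x = np.zeros(4)  # initial guess: Earth's centre, zero clock bias
    for _ in range(iters):
        p, b = x[:3], x[3]
        d = np.linalg.norm(sat_pos - p, axis=1)        # geometric ranges
        residuals = d + C * b - pseudoranges
        J = np.hstack([(p - sat_pos) / d[:, None],      # d(rho)/d(position)
                       np.full((len(d), 1), C)])        # d(rho)/d(clock bias)
        dx, *_ = np.linalg.lstsq(J, -residuals, rcond=None)
        x += dx
    return x[:3], x[3]

# Toy scenario: four satellites at roughly GPS altitude (positions in metres)
sats = np.array([[15_600e3,  7_540e3, 20_140e3],
                 [18_760e3,  2_750e3, 18_610e3],
                 [17_610e3, 14_630e3, 13_480e3],
                 [19_170e3,  6_100e3, 18_390e3]])
true_pos = np.array([6_371e3, 0.0, 0.0])   # a point on the Earth's surface
true_bias = 1e-4                            # 100-microsecond receiver clock error
rho = np.linalg.norm(sats - true_pos, axis=1) + C * true_bias

pos, bias = solve_position(sats, rho)
print(np.round(pos), bias)  # recovers true_pos and true_bias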
Although four satellites are required for normal operation, fewer suffice in special cases. If one variable is already known, a receiver can determine its
position using only three satellites. For example, a ship or aircraft may have
known elevation. Some GPS receivers may use additional clues or assumptions
(such as reusing the last known altitude, dead reckoning, inertial navigation, or
including information from the vehicle computer) to give a less accurate
(degraded) position when fewer than four satellites are visible.

1. Application of Remote Sensing and GIS in Civil Engineering
Remote sensing and GIS techniques have become powerful and indispensable tools for solving many problems of civil engineering. Remote sensing observations provide data on the Earth's resources in a spatial format, and GIS correlates different kinds of spatial data and their attribute data, so that they can be used in various fields of civil engineering.

a- In structural engineering:
Structural Health Monitoring (SHM) provides designers with feedback of
structural performance, assisting in development of structures with higher utility
and lower manufacturing costs. Structural Health Monitoring nowadays
continues to advance from conventional strain gauges to FBG Fibre Optic
Sensors (FOS) and major breakthroughs in wireless remote monitoring. Fibre optic sensors use the wavelength shift of a fibre Bragg grating to measure temperature and strain (a conversion sketch follows the list below). FOS have many advantages over the traditional electrical system, such as:
- Suitable for long-term permanent SHM: can monitor a structure during the construction stage and over its whole lifespan
- No calibration needed
- One cable can carry hundreds of sensors
- Simple installation
- Cable can run for kilometres; no length limit
- Fibre optic sensors use a light signal: no electrical sparking, intrinsically safe
- Gauge length can be a few metres long, to measure the global behaviour of structures
- Suitable for both static and dynamic measurement
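As a rough illustration of how FBG readings become engineering quantities, the sketch below converts a measured Bragg-wavelength shift into strain after removing the thermal contribution. The sensitivity coefficients are typical textbook values for a 1550 nm grating (roughly 1.2 pm per microstrain and 10 pm per degree Celsius), not calibration data for any particular sensor.

# Minimal sketch: converting an FBG wavelength shift to strain.
# Assumed typical sensitivities for a 1550 nm grating; real sensors
# ship with calibrated coefficients.
K_EPS = 1.2e-3   # nm per microstrain (~1.2 pm/microstrain)
K_T   = 10e-3    # nm per degree Celsius (~10 pm/C)

def fbg_strain(dl_nm, dT_C):
    """Strain (microstrain) from the total Bragg shift dl_nm (nm),
    removing the thermal contribution dT_C (C) measured by a
    co-located temperature-only grating."""
    return (dl_nm - K_T * dT_C) / K_EPS

# Example: 0.35 nm total shift with a 5 C temperature rise
print(fbg_strain(0.35, 5.0))  # ~250 microstrain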
The primary aim of monitoring is to ensure the longevity and safety of the structure, as well as to optimize its management. To allow corrective measures and maintenance actions to be implemented, monitoring must enable the timely detection of any condition or behaviour that could deteriorate the structure, render it unsafe or potentially result in its failure.
The monitoring programme plays a fundamental role during the construction
phase as it enables the verification of design hypotheses and construction
processes, affecting, in some cases, the construction rate of the structures and
overall quality. Most defects are introduced already at the time of construction.
Monitoring also allows performance evaluation of new materials and
technologies used in bridge construction and rehabilitation. This objective is easily achieved with fibre optic sensors, since these sensors integrate effectively into new materials such as fibre-reinforced polymer composites. Furthermore, fibre optic sensors adapt perfectly to long-term monitoring of bridges' behaviour as well as short-term monitoring of bridges' dynamic behaviour under traffic load.
Finally, monitoring can be used as a tool for supervised lifetime extension of
bridges approaching the end of their life or in need of major repair. It ensures
that such bridges are operated safely while allowing the postponement of major
investments and traffic disruption.

b- Town Planning and Urban Development:
To achieve the objective of making metropolitan cities more livable and of international standard, a coordinated and integrated approach among the various agencies involved in urban development and the provision of services is needed, including a participatory process in planning and implementation at the local-body level. A planned and organized redistribution of population through growth centres, which act as counter-magnets to the cities' growth, is also needed, since existing infrastructure, traffic, road, drainage and utility networks may not be able to withstand unplanned growth. Advance urban planning is required for planned development of an area, for which up-to-date, real-time and accurate information is vitally important. Geographic Information Systems and remote sensing are indispensable technologies for the development of national infrastructure and planning, and they provide solutions to many environmental problems.










Applications of Remote Sensing to Hydrology and Hydrogeology
The Hydrological Cycle
A brief overview of hydrological processes will help to set a framework for
describing those areas where remote sensing can assist in observing and in
managing water resource systems. Generally speaking, the hydrological cycle traces water through different physical processes: liquid water evaporates into the atmosphere and returns to the liquid (or sometimes the frozen) state as precipitation. Precipitation falling on land either runs off into rivers and streams, percolates into the soil, or evaporates. Moisture reaching the water table becomes ground water. As a general rule, both surface and ground water flow under the force of gravity toward streams and lakes, and ultimately the oceans. The return of water to the oceans can be thought of as completing the cycle.










Precipitation
Accurate measurement of precipitation is a continuing goal in meteorological
research and a continuing need in hydrology, which depends greatly on these
data for modelling. Ground-based radar is probably the most accurate method of
determining areal precipitation in use today. Satellite images from GOES, NOAA, TIROS-N, TRMM and NIMBUS opened a whole new world of data on
clouds and frontal systems. Work carried out by several researchers has led to
the following conclusions:
A. In thick clouds (more than one kilometre thick), rain is possible when the upper surface of the cloud is at less than -15 °C.
B. The probability of rain is inversely proportional to the temperature of the upper surface of the cloud.
C. Precipitation intensity is directly proportional to the area of the upper surface of the cloud at temperatures of less than -15 °C.

Snow
For the hydrologists who must forecast water levels, snow represents one of the most complicated and difficult parameters to measure. Snow extent, distribution, water equivalent, water content, thickness and density all play a large part in assessing the snowpack's contribution to runoff. Snowpack water equivalent has been measured by aircraft gamma-radiation surveys in the USA; the method is based on the absorption of natural gamma radiation by water (snow). As hydrologists come to accept satellite remote-sensing data for snow mapping, they also come to learn the limitations of satellite remote sensing, despite some indications that the reflectance of snow may, under certain circumstances, be related to snow thickness.
Glaciers
Glaciers play an important role in the hydrological cycle of many mountainous areas. Terrestrial photography of glaciers was an important early reference method. Traversing and conducting scientific studies on glaciers are difficult, and glaciologists were quick to appreciate the value of remote sensing, first from aircraft and later from satellites (Landsat, HCMM, NIMBUS, ICESat, etc.).









Surface Water: One of the best known applications of remote sensing to water resources is the inventorying of surface water bodies, particularly streams, lakes, marshes and bogs, within a given region. The area covered by open water is readily delineated by various remote-sensing techniques because of the particular radiation characteristics of water. The decreased reflectivity of soils that are moist at the surface facilitates the delineation of recently flooded areas, if these are barren. The delineation of floods in vegetation-covered areas is more difficult, but is possible either by use of radar or through a combination of radiation and topographic data. Remotely sensed data on flood-plain characteristics can be combined with data obtained during floods for flood mapping and for delineating flood hazard areas. Characteristics of river channels, such as width, depth, roughness, and degree of tortuosity and braiding, can also be obtained from remote-sensing surveys.








Ground Water
Ground water is the water in the saturated zones beneath the surface of the Earth. Ground water information most useful to water resource managers includes: the presence or absence of ground water in designated areas, the depth to ground water, the quantity and quality of water available for development, recharge rates to aquifers, the possible impact of pumping on land subsidence, the areal extent of the aquifer, the locations of recharge and discharge areas, and the interaction between withdrawals at wells and natural discharge into rivers. Whereas this information is generally sought by hydrogeologists using conventional methods, remote sensing can help in the planning of conventional measurements and can be used to estimate some hydrogeological variables quantitatively and others qualitatively. The storage capacity of ground water reservoirs depends on their extent, which in turn depends on the geological properties of the area. Ground water forms the base flow for many streams and is the source of water for springs and seeps.















Applications in Hydrology
Hydrology is the study of water on the Earth's surface, whether flowing above
ground, frozen in ice or snow, or retained by soil. Hydrology is inherently
related to many other applications of remote sensing, particularly forestry,
agriculture and land cover, since water is a vital component in each of these
disciplines. Most hydrological processes are dynamic, not only between years,
but also within and between seasons, and therefore require frequent
observations. Remote sensing offers a synoptic view of the spatial distribution
and dynamics of hydrological phenomena, often unattainable by traditional
ground surveys. Radar has brought a new dimension to hydrological studies
with its active sensing capabilities, allowing the time window of image
acquisition to include inclement weather conditions or seasonal or diurnal
darkness.

Examples of hydrological applications include:
- wetlands mapping and monitoring
- soil moisture estimation
- snowpack monitoring / delineation of extent
- measuring snow thickness
- determining snow-water equivalent
- river and lake ice monitoring
- flood mapping and monitoring
- glacier dynamics monitoring (surges, ablation)
- river/delta change detection
- drainage basin mapping and watershed modelling
- irrigation canal leakage detection
- irrigation scheduling



Applications of Remote Sensing in Weather Forecasting and
Warnings
A- Applications of meteorological satellites
Meteorological satellites are indispensable in weather forecasting and warning
services. Because of their huge areal coverage, meteorological satellite images
can be used to keep. track of weather systems days before they come close to an
area. This is particularly useful in monitoring severe weather systems like
tropical cyclones. The very basic application of meteorological satellite is in
identification of clouds. Clouds can be broadly classified into three categories
according to the cloud base height, namely, low, medium and high clouds.
Some clouds, such as cumulonimbus (a type of thundery cloud), span all three layers. Different clouds have different characteristics in terms of shape and pattern, and have different tones in the visible and infrared images. These differences enable the identification of clouds using a combination of the visible and infrared images. For instance, fog and low dense clouds are characterized by their sharp boundaries and smooth texture on satellite images. They appear in bright white to medium gray tones on the visible image, but in
dark to medium gray tones on the infrared image. Thundery clouds such as cumulonimbus, however, contain abundant moisture and extend to great heights; they appear globular in shape and in very bright tones on both the visible and infrared images. Apart from the identification of clouds, meteorological satellites are widely used in many other areas of application. Some examples:
- An excellent tool for detecting volcanic ash beneath clouds: the operating principle is that volcanic ash and meteorological clouds exhibit different characteristics in the IR1 and IR2 infrared images (a sketch of this test follows).
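A common way to exploit this difference is the split-window (brightness-temperature difference) test: silicate ash absorbs more strongly near 11 um than near 12 um, the reverse of water and ice clouds, so the difference T11 - T12 tends to be negative over ash. The sketch below applies this test to invented brightness-temperature arrays; operational schemes add corrections for water vapour and viewing geometry, and the threshold is tunable.

import numpy as np

# Hypothetical brightness temperatures (kelvin) from two thermal IR
# channels (~11 um and ~12 um), i.e. an IR1/IR2 split-window pair.
t11 = np.array([[265.0, 252.0, 248.0],
                [270.0, 249.5, 246.0],
                [272.0, 268.0, 251.0]])
t12 = np.array([[263.5, 253.8, 250.1],
                [268.0, 251.2, 248.2],
                [270.1, 266.2, 252.9]])

btd = t11 - t12          # brightness-temperature difference
ash_flag = btd < -0.5    # negative BTD suggests silicate ash (illustrative threshold)

print(np.round(btd, 1))
print(ash_flag.astype(int))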


Remote sensing application in geomorphology
Geomorphology is the scientific study of the landforms of the Earth. Geomorphological analysis of the surface forms of the Earth is a direct form of
interpretation from space images. Aerial photos with required forward overlap
usually provide the third dimension of height, which adds to the precision of
interpretation including morphometry. Geomorphology as a science developed
much later than geology although several aspects of geomorphology are
embedded in geological processes. Geomorphology deals with the genesis of
relief forms of the surface of the Earth's crust. Certain natural processes are
responsible for the forms of the surface of the earth. A thorough understanding
of various processes leading to landforms is necessary to understand the
environment in which we live. Remote sensing is an effective tool in this
understanding, as aerospace images contain integrated information of all that is
on the ground, the landform, the ecology, the resources contained in the area
and the impact of human actions on the natural landscape. The dynamism with
which changes occur in the landscape is brought out effectively by repeated
coverage by images of the same area at different times. Images convey many things even to the untrained eye, and to a professional they convey much more, including many features hitherto unknown or unseen on the ground.
Geomorphology - basic concepts
The Earth's surface forms are primarily due to hypogene or endogenous processes, which include diastrophism (leading to geologic structure and tectonic activity) and volcanism (leading to volcanic landforms). These forms are modified by epigene or exogenous processes, which include the erosional and depositional activities of water, wind and ice. Other processes include weathering, mass wasting (the movement of material by gravitational action), and land-ocean interaction, resulting in landforms due to waves, currents, tides and tsunamis. Climate is another important factor relevant to the shaping of the Earth's surface, because the processes that act upon the surface material differ between climatic zones (Van Westen 1994).
For example, limestone forms hills in a dry climate, whereas in a wet climate it forms karst topography, with sinkholes, caves and caverns predominating.

Remote Sensing applications in Agriculture
Introduction
Agricultural resources are among the most important renewable, dynamic natural resources. Comprehensive, reliable and timely information on agricultural resources is very necessary for a country like India, whose economy's mainstay is agriculture. Agricultural surveys are presently conducted throughout the nation in order to gather information and associated statistics on crops, rangeland, livestock and other related agricultural resources. These data are of utmost importance for the implementation of effective management decisions at local, panchayat and district levels. In fact, the agricultural survey is a backbone of planning and of the allocation of limited resources to the different sectors of the economy.

With increasing population pressure throughout the nation and the concomitant need for increased agricultural production (food and fibre crops as well as livestock), there is a definite need for improved management of the nation's agricultural resources. To accomplish this, it is first necessary to obtain reliable data on not only the types, but also the quality, quantity and location of these resources.


Remote sensing and its Importance in Agricultural survey
Remote sensing is a means of obtaining reliable information about an object without being in physical contact with it. It is based on the observation of an object by a device separated from it by some distance, utilizing the characteristic responses of different objects to electromagnetic radiation: energy is measured in a number of spectral bands for the purpose of identifying the object. In such studies a single tabular form of data or map data alone is not sufficient; remotely sensed data can be combined with information obtained from existing maps and tabular data.
Remote sensing techniques using various platforms have proved their utility in agricultural surveys:
- Satellite data provide a synoptic view of a large area at a time, which is not possible with conventional survey methods.
- The process of data acquisition and analysis is very fast through a Geographic Information System (GIS) as compared to conventional methods.
Remote sensing techniques have the unique capability of recording data in the visible as well as invisible (i.e. ultraviolet, reflected infrared, thermal infrared and microwave) parts of the electromagnetic spectrum. Therefore certain phenomena that cannot be seen by the human eye can be observed through remote sensing techniques; for example, trees affected by disease or insect attack can be detected by remote sensing much earlier than human eyes would see them (the NDVI sketch below shows one such measure of vegetation vigour).
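One widely used quantity behind such crop-vigour assessments is the Normalized Difference Vegetation Index (NDVI, listed in the glossary): healthy vegetation reflects strongly in the near infrared and absorbs red light, so NDVI = (NIR - Red) / (NIR + Red) drops as plants become stressed. Below is a minimal sketch on made-up reflectance values.

import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-12)  # epsilon guards against 0/0

# Hypothetical reflectances: healthy crop, stressed crop, bare soil
nir = np.array([0.50, 0.35, 0.25])
red = np.array([0.08, 0.15, 0.20])
print(np.round(ndvi(nir, red), 2))  # [0.72 0.4 0.11] - vigour decreasing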

Present system of generating agricultural data and its problems
At present, agricultural data are collected throughout the nation. The main responsibility for agricultural surveys lies with the Director of Land Records, the Director of Agriculture and the District Statistical Office under the Ministry of Agriculture. These data are collected not only at the local level but also, to some extent, at the district and state levels. The agricultural surveys cover crops (crop production, type of crop and crop yield), rangeland (condition of range, forest type, water quality, types of irrigation system and soil characteristics) and livestock (livestock population, sex of animals, types of farm and distribution of animals).
The basic problems in these surveys are:
- Reliability of data
- Costs and benefits
- Timeliness
- Incomplete sample frame and sample size
- Methods of selection
- Measurement of area
- Non-sampling errors
- Gaps in geographical coverage
- Non-availability of statistics at a disaggregated level
Remote sensing data may provide solutions to these particular problems of agricultural surveys.


Advantages of Remote Sensing techniques in Agricultural survey
With the primary aim of improving the present means of generating agricultural data, a number of specific advantages may result from the use of remote sensing techniques.
1. Vantage point
Because the agricultural landscape depends upon the sun as a source of energy, it is exposed to the aerial view and, consequently, is ideally suited to remote sensing techniques.
2. Coverage
With the use of high-altitude sensor platforms, it is now possible to record extensive areas on a single image; with the advent of high-flying aircraft and satellites, a single high-quality image can cover thousands of square miles.
3. Permanent record
Once an image is obtained, it serves as a permanent record of the landscape at a point in time, against which agricultural changes can be monitored and evaluated.
4. Mapping base
Certain types of remote sensing imagery are, in essence, pictorial maps of the landscape and, after rectification (if needed), allow precise measurements (such as field acreages) to be made on the imagery, obviating time-consuming on-the-ground surveys. These images may also aid ground data sampling by serving as a base map for locating agricultural features while in the field, and as a base for the selection of ground sampling points or areas.
5. Cost savings
The costs are relatively small when compared with the benefits that can be obtained from the interpretation of satellite imagery.
6. Real-time capability
The rapidity with which imagery can be obtained and interpreted may help to eliminate the lack of timeliness that plagues so many agricultural surveys.
Other advantages of remote sensing
- Easy data acquisition over inaccessible areas.
- Data acquisition at different scales and resolutions.
- The images are analyzed in the laboratory, thus reducing the amount of fieldwork.
- Colour composites can be produced from three individual band images, which provide better detail of the area than a single-band image or aerial photograph.
- Stereo satellite data may be used for three-dimensional studies.
At present, all the advantages listed above have been demonstrated either operationally or experimentally.
Application of Remote sensing techniques for Agricultural survey
Remote sensing techniques can be applied specifically to the i) detection, ii) identification, iii) measurement and iv) monitoring of agricultural phenomena.
Area of specific applications
a) Applicable to crop survey
1. Crop identification
2. Crop acreage
3. Crop vigor
4. Crop density
5. Crop maturity
6. Growth rates
7. Yield forecasting
8. Actual yield
9. Soil fertility
10. Effects of fertilizers
11. Soil toxicity
12. Soil moisture
13. Water quality
14. Irrigation requirement
15. Insect infestations
16. Disease infestations
17. Water availability
18. Location of canals
b) Applicable to range survey
1. Delineation of forest types
2. Condition of range
3. Carrying capacity
4. Forage
5. Time of seasonal change
6. Location of water
7. Water quality
8. Soil fertility
9. Soil moisture
10. Insect infestations
11. Wildlife inventory
c) Applicable to livestock survey
1. Cattle population
2. Sheep population
3. Pig population
4. Poultry Population
5. Age and sex distribution
6. Distribution of animals
7. Animal behavior
8. Disease identification
9. Types of farm buildings

Application of remote sensing in Seismology
A wide range of satellite methods is now applied in seismology. The first applications of satellite data to earthquake exploration were initiated in the 1970s, when active faults were mapped on satellite images; it was a pure and simple extrapolation of airphoto geological interpretation methods into space. The modern embodiment of this method is lineament analysis: time series of lineaments on the Earth's surface are investigated before and after an earthquake. A further application of satellite data in seismology is related to geophysical methods. Electromagnetic methods have about the same long history of application in seismology; stable statistical estimations of the ionosphere-lithosphere relation were obtained from satellite ionosondes. The
most successful current project "DEMETER" shows impressive results. Satellite
thermal infrared data were applied to earthquake research in the next step. Numerous results have confirmed previous observations of thermal anomalies on the Earth's surface prior to earthquakes. A modern trend is the application of outgoing long-wave radiation to earthquake research. Spectacular pictures of co-seismic deformations have been presented, and current research is moving in the direction of pre-earthquake deformation detection. GPS technology is also widely used in seismology, both for ionosphere sounding and for ground movement detection. Satellite gravimetry demonstrated its first very impressive results with the example of the catastrophic Indonesian earthquake of 2004. Relatively new applications of remote sensing to seismology, such as atmospheric sounding, gas observations and cloud analysis, are considered possible candidates for application.



Introduction
Remote sensing has been used for earthquake research since the 1970s, with the first appearance of satellite images. At first it was used in structural geological and geomorphological research: active faults and structures were mapped on the basis of satellite images. This method is very limited for time series analysis, as there was no possibility of measuring short-term processes before and after an earthquake; it was simply an extrapolation of airphoto geological interpretation methods into space. The modern version of this method is active tectonic analysis with the application of lineament analysis, in which time series of lineament distributions on the Earth's surface are investigated before and after an earthquake.
The current situation of remote sensing applications to earthquake research indicates a few phenomena related to earthquakes, particularly the Earth's surface deformation, surface temperature and humidity, atmospheric temperature and humidity, and gas and aerosol content. Both horizontal and vertical deformations, scaling from tens of centimetres to metres, are recorded after the shock; such deformations are recorded with confidence by the Interferometric Synthetic Aperture Radar (InSAR) technique. Pre-earthquake deformations are rather small, on the order of centimetres, and only a few cases of deformation mapping before the shock using satellite data are known at present. Future developments lie in precision long-wavelength SAR systems with medium spatial resolution, combined with the GPS technique. There are
numerous observations of surface and near-surface temperature increases of 3-5 °C prior to crustal earthquakes, and methods of earthquake prediction using thermal infrared (TIR) surveys are being developed. Multiple lines of evidence of gas and aerosol content changes before earthquakes have been reported from ground observations. Satellite methods allow one to measure the concentrations of gases in the atmosphere (O3, CH4, CO2, CO, H2S, SO2, HCl) and of aerosols. However, the spatial resolution and sensitivity of modern systems restrict the application of satellite gas observations in seismology, and the first promising results have been obtained only for ozone, aerosols and air humidity.
Deformations
One of the main directions of remote sensing application in seismology is deformation mapping. Surface deformations in the seismic cycle can be divided into three phases: pre-seismic (or inter-seismic), co-seismic and post-seismic. Co-seismic deformations reach metres to tens of metres, while pre-seismic movements amount to centimetres; post-seismic deformations are also measured in centimetres, but subsequent landslides can increase deformations to metres. Most current research is focused on co-seismic and post-seismic (landslide) deformations; a sketch of the basic InSAR phase-to-displacement conversion follows.
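For orientation, repeat-pass InSAR measures deformation as an interferometric phase difference: one full fringe (2*pi) corresponds to half a radar wavelength of motion along the line of sight, so the line-of-sight displacement is d = lambda * delta_phi / (4*pi). The sketch below applies this to a hypothetical unwrapped-phase array for a C-band sensor; sign conventions and atmospheric corrections vary between systems.

import numpy as np

WAVELENGTH_M = 0.056  # C-band radar wavelength (~5.6 cm)

def los_displacement(unwrapped_phase_rad, wavelength_m=WAVELENGTH_M):
    """Line-of-sight displacement (m) from unwrapped interferometric phase.
    One 2*pi fringe corresponds to wavelength/2 of line-of-sight motion."""
    return wavelength_m * unwrapped_phase_rad / (4.0 * np.pi)

# Hypothetical unwrapped phase (radians) over a small deforming patch
phase = np.array([[0.0, 2.1, 4.3],
                  [0.5, 3.0, 6.2],
                  [0.9, 3.8, 7.1]])
print(np.round(los_displacement(phase) * 100, 2))  # displacement in cm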
Discussion
A wide spectrum of satellite remote sensing methods is applied in seismology nowadays, and the value of these methods for earthquake research varies. Optical methods have limited applications, mostly rapid assessment of damage in an epicentral zone; other applications, such as lineament analysis and cloud-form analysis related to earthquakes, do not have an adequate scientific basis for seismological application. A vigorous extension of InSAR applications in seismology is now observed, and modern radar systems in conjunction with GPS/GLONASS will provide whole-seismic-cycle monitoring. Broad application of InSAR methods is limited by the high data cost and
complex data analysis. Thermal satellite data applications are developing in two directions at the moment: thermal anomalies in seismic fault research, and emitted long-wave radiation measurements in seismic zones. Thermal anomaly research in seismic faults is developing in the direction of seismic activity monitoring and close integration with ground observations. Emitted long-wave radiation observations demonstrate promising results, but data accumulation is required; the nature of outgoing long-wave radiation anomalies remains unclear. Some common remarks on satellite data applications in seismology can be made: (1) The level of automatic data processing is insufficient. There is still too much manual labour and author arbitrariness in data processing; this concerns both exotic earthquake-cloud analysis and high-tech radar methods, and some results are irreproducible. (2) There is a weak physical and geological basis for many of the proposed methods. The nature and driving forces of some phenomena need clarification and connection with the current understanding of physics and geology.

Conclusions
This review of modern remote sensing techniques in seismology demonstrates the following: (1) remote sensing methods are being broadly used for earthquake research; (2) a wide spectrum of remote sensing methods is applied, from optical sensors to radar systems; (3) the parameters studied by remote sensing include surface deformation (both vertical and horizontal), surface temperature, various heat fluxes at the Earth's surface and cloud tops, and some others; (4) future development of remote sensing applications for earthquakes is related to new directions: L-band radar systems, high-resolution microwave radiometers and gas analyzers; (5) we will probably again approach an epoch of belief in earthquake prediction, where remote sensing can play a key role due to its global scope, calibration, and automatic data processing.
The described processes in the ionosphere, atmosphere, hydrosphere and lithosphere associated with earthquakes represent the fundamental science issue of lithosphere-atmosphere-ionosphere coupling. The solution of this problem is still quite far away; we can mention specifically the problems of the nature of thermal anomalies, the nature of emitted long-wave radiation anomalies, ionosphere-lithosphere coupling and so on. All these issues interface with the problem of understanding the nature of earthquakes.
Applications of remote sensing in minerals
The search for metals and materials needed to sustain our culture has been carried out since primitive humans first searched for flint to craft hand tools. Today, the materials needed to drive our economic and technological growth are just as crucial. Most of the easily accessible metal ores were discovered decades ago, and thus the search has turned to more subtle deposits and more remote locations. Since the inception of rudimentary aerial photography at the turn of the twentieth century, remote sensing has been used as a tool in the search for economic mineral deposits. As the level of technology has improved, the value of remotely sensed data has increased. This section highlights the history and implementation of remote sensing in mineral exploration today.
Introduction
The value of remote sensing data to mineral exploration has evolved and increased as technology has improved. In the early days of aerial photography, aerial photos were used, when available, to evaluate topography and to plan prospecting and sampling forays. After World War II, the analysis of aerial photo data became much more sophisticated and actual geological data began to be extracted: the use of stereoscopic pairs enabled geologists to interpret subtle structural features. Nonetheless, the primary use of remotely gathered data was comparative: if a particular type of deposit was being mined in a district, aerial photos would be used to locate similar features elsewhere within the district. This trend of comparative interpretation continued well into the satellite age, when satellite imagery became commercially available. The availability of multispectral, radar and IR imaging, in a variety of combinations, allowed geologists to evaluate regions in much more detail than ever before. In addition, multiple flyovers allowed a prospect to be viewed in different light during different seasons, which greatly reduced the cost of regional exploration by precluding the need for repeated trips to a locale for reassessment. Another advantage was the ability of radar imagery to gather data through cloud and surface cover; this allowed data to be collected from tropical and arid regions that had previously been inhospitable to large regional field exploration. The computer age further enhanced the usefulness of the data by allowing imagery to be digitally enhanced to highlight specific features. Now spectral studies can be done which allow the identification of specific minerals from space (a band-ratio sketch follows below).
The most elementary operation of remote sensing in mineral exploration is using aerial photographs to identify topographic surface features which may indicate the subsurface geology. Such telling surface features as differential erosion, outcropping rock, drainage patterns, and folds and faults can be identified. These features can be compared with other potential targets in the region when looking for similar deposits. Faults, fractures and contacts often provide a conduit or depositional environment for hydrothermal or magmatic fluids in regions of known mineralization, and thus make excellent targets for further investigation.
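Band ratioing is one of the simplest of these spectral techniques: dividing one band by another suppresses illumination differences and highlights materials with diagnostic absorption features. With Landsat TM data, for example, the ratio of band 5 to band 7 is commonly used to highlight hydroxyl-bearing (clay) alteration minerals, and band 3 over band 1 to highlight iron oxides. The sketch below computes such ratios on invented reflectance arrays; real workflows use calibrated, atmospherically corrected imagery.

import numpy as np

def band_ratio(numerator, denominator):
    """Element-wise band ratio; a small epsilon avoids division by zero."""
    return numerator / (denominator + 1e-12)

# Hypothetical co-registered Landsat TM reflectance bands (fractions)
tm1 = np.array([[0.08, 0.10], [0.09, 0.07]])   # blue
tm3 = np.array([[0.12, 0.22], [0.11, 0.09]])   # red
tm5 = np.array([[0.30, 0.28], [0.35, 0.20]])   # SWIR ~1.65 um
tm7 = np.array([[0.25, 0.15], [0.30, 0.18]])   # SWIR ~2.2 um

clay_index = band_ratio(tm5, tm7)   # high where hydroxyl minerals absorb near 2.2 um
iron_index = band_ratio(tm3, tm1)   # high where iron oxides absorb in the blue

print(np.round(clay_index, 2))
print(np.round(iron_index, 2))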



Remote Sensing Systems
LIDAR
LIDAR or LADAR (Light/Laser Detection And Ranging) is an optical remote sensing technology that can measure the distance to, or other properties of, a target by illuminating the target with light, often using pulses from a laser. LIDAR technology has applications in archaeology, geography, geology, geomorphology, seismology, forestry, remote sensing and atmospheric physics.
General description
LIDAR uses ultraviolet, visible, or near infrared light to image objects and can
be used with a wide range of targets, including non-metallic objects, rocks, rain,
chemical compounds, aerosols, clouds and even single molecules. A narrow
laser beam can be used to map physical features with very high resolution.
LIDAR has been used extensively for atmospheric research and meteorology.
Downward-looking LIDAR instruments fitted to aircraft and satellites are used
for surveying and mapping.
Wavelengths in a range from about 10 micrometers to the UV (250 nm) are
used to suit the target. Typically light is reflected via backscattering. Different
types of scattering are used for different LIDAR applications, the most common being Rayleigh scattering, Mie scattering and Raman scattering, as well as fluorescence.
Suitable combinations of wavelengths can allow for remote mapping of
atmospheric contents by looking for wavelength-dependent changes in the
intensity of the returned signal.

Basics of a LIDAR System:
A LIDAR system involves a laser range finder reflected by a rotating mirror. The laser is scanned around the scene being digitised, in one or two dimensions, gathering distance measurements at specified angle intervals (the range calculation is sketched below).
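The core measurement is a pulse time of flight: light travels to the target and back, so the range is R = c * t / 2. A minimal sketch, assuming ideal timing and no atmospheric correction:

C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_time_of_flight(round_trip_s):
    """Target range (m) from a pulse's round-trip travel time (s)."""
    return C * round_trip_s / 2.0

# Example: an echo received 6.67 microseconds after the pulse left
print(range_from_time_of_flight(6.67e-6))  # ~1000 m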




In general there are two kinds of LIDAR detection scheme: i) incoherent, or direct-energy, detection (principally an amplitude measurement) and ii) coherent detection (best for Doppler or phase-sensitive measurements). Coherent systems generally use optical heterodyne detection which, being more sensitive than direct detection, allows them to operate at much lower power, but at the expense of more complex transceiver requirements.
In both coherent and incoherent LIDAR, there are two types of pulse model: micropulse LIDAR systems and high-energy systems. Micropulse systems have developed as a result of the ever-increasing amount of computer power available, combined with advances in laser technology. They use considerably less energy in the laser, typically on the order of one microjoule, and are often "eye-safe," meaning they can be used without safety precautions. High-power systems are common in atmospheric research, where they are widely used for measuring many atmospheric parameters: the height, layering and densities of clouds; cloud particle properties (extinction coefficient, backscatter coefficient, depolarization); temperature; pressure; wind; humidity; and trace gas concentrations (ozone, methane, nitrous oxide, etc.).
Major components of LIDAR system:
1. Laser: 600-1000 nm lasers are most common for non-scientific applications. They are inexpensive, but since they can be focused and easily absorbed by the eye, the maximum power is limited by the need to make them eye-safe. A common alternative, 1550 nm lasers, are eye-safe at much higher power levels since this wavelength is not focused by the eye, but the detector technology is less advanced, so these wavelengths are generally used at longer ranges and lower accuracies. They are also used for military applications, as 1550 nm is not visible in night vision goggles, unlike the shorter 1000 nm infrared laser. Airborne topographic mapping LIDARs generally use 1064 nm diode-pumped YAG lasers, while bathymetric systems generally use 532 nm frequency-doubled diode-pumped YAG lasers, because 532 nm penetrates water with much less attenuation than does 1064 nm. Laser settings include the laser repetition rate (which controls the data collection speed). Pulse length is generally an attribute of the laser cavity length, the number of passes required through the gain material (YAG, YLF, etc.), and Q-switch speed. Better target resolution is achieved with shorter pulses, provided the LIDAR receiver detectors and electronics have sufficient bandwidth.
2. Scanner and optics: How quickly images can be developed is also affected by the speed at which they can be scanned into the system. There are several options for scanning the azimuth and elevation, including dual oscillating plane mirrors, a combination with a polygon mirror, or a dual-axis scanner. Optic choices affect the angular resolution and the range that can be detected. A hole mirror or a beam splitter are options for collecting a return signal.
3. Photodetector and receiver electronics: Two main photodetector technologies are used in LIDAR: solid-state photodetectors, such as silicon avalanche photodiodes, and photomultipliers. The sensitivity of the receiver is another parameter that has to be balanced in a LIDAR design.
4. Position and navigation systems: LIDAR sensors mounted on mobile platforms such as airplanes or satellites require instrumentation to determine the absolute position and orientation of the sensor. Such devices generally include a Global Positioning System receiver and an Inertial Measurement Unit (IMU).
Applications of LIDAR
[Figure: a LIDAR-equipped mobile robot uses its LIDAR to construct a map and avoid obstacles.]
1-Agriculture
Agricultural Research Service scientists have developed a way to combine LIDAR with yield rates on agricultural fields. This technology helps farmers direct their resources toward the high-yield sections of their land. LIDAR can also be used to help farmers determine in which areas of their fields to apply costly fertilizer: LIDAR can create a topographic map of the fields, revealing the slopes and sun exposure of the farmland. Researchers at the Agricultural Research Service blended this topographic information with the farmland's yield results from previous years. This technology is valuable to farmers because it indicates in which areas to apply expensive fertilizers to achieve the highest crop yield.
2-Archaeology
LIDAR has many applications in the field of archaeology, including aiding in the planning of field campaigns, mapping features beneath forest canopy, and providing an overview of broad, continuous features that may be indistinguishable on the ground. LIDAR can also provide archaeologists with the ability to create high-resolution digital elevation models of archaeological sites that can reveal micro-topography that is otherwise hidden by vegetation. LIDAR-derived products can be easily integrated into a Geographic Information System (GIS) for analysis and interpretation. With LIDAR, the ability to produce high-resolution datasets quickly and relatively cheaply is an advantage. Beyond efficiency, its ability to penetrate forest canopy has led to the discovery of features that were not distinguishable through traditional geospatial methods and are difficult to reach through field surveys.
3-Biology and conservation
LIDAR has also found many applications in forestry: canopy heights, biomass measurements, and leaf area can all be studied using airborne LIDAR systems. Similarly, LIDAR is used by many industries, including energy and railroad companies and departments of transportation, as a faster way of surveying. Topographic maps can also be generated readily from LIDAR, including for recreational uses such as the production of orienteering maps. In oceanography, LIDAR is used for the estimation of phytoplankton fluorescence and, more generally, biomass in the surface layers of the ocean. Another application is airborne LIDAR bathymetry of sea areas too shallow for hydrographic vessels.
4-Geology and soil science
High-resolution digital elevation maps generated by airborne and stationary
LIDAR have led to significant advances in geomorphology, the branch of
geoscience concerned with the origin and evolution of Earth's surface
topography. LIDAR's abilities to detect subtle topographic features such as river
terraces and river channel banks, measure the land surface elevation beneath the
vegetation canopy, better resolve spatial derivatives of elevation, and detect
elevation changes between repeat surveys have enabled many novel studies of
the physical and chemical processes that shape landscapes. Aircraft-based
LIDAR and GPS have evolved into an important tool for detecting faults and
measuring uplift. The output of the two technologies can produce extremely
accurate elevation models for terrain that can even measure ground elevation
through trees.
5-Hydrology
LIDAR offers a lot of information to the aquatic sciences. High-resolution
digital elevation maps generated by airborne and stationary LIDAR have led to
significant advances in the field of Hydrology.
6-Meteorology and atmospheric environment
The first LIDAR systems were used for studies of atmospheric composition, structure, clouds, and aerosols. Initially based on ruby lasers, LIDARs for meteorological applications were constructed shortly after the invention of the laser, and represent one of the first applications of laser technology.
Elastic backscatter LIDAR is the simplest type of LIDAR and is typically used for studies of aerosols and clouds. The backscattered wavelength is identical to the
transmitted wavelength, and the magnitude of the received signal at a given
range depends on the backscatter coefficient of scatterers at that range and the
extinction coefficients of the scatterers along the path to that range. The
extinction coefficient is typically the quantity of interest.
Differential Absorption LIDAR (DIAL) is used for range-resolved
measurements of a particular gas in the atmosphere, such as ozone, carbon
dioxide, or water vapor. The LIDAR transmits two wavelengths: an "on-line"
wavelength that is absorbed by the gas of interest and an off-line wavelength
that is not absorbed. The differential absorption between the two wavelengths is a measure of the concentration of the gas as a function of range. DIAL LIDARs are essentially dual-wavelength backscatter LIDARs (a worked sketch of the retrieval follows).
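In the standard DIAL retrieval, the mean gas number density in a range cell [r1, r2] comes from the ratio of on-line and off-line returns: N = ln[(P_off(r2) * P_on(r1)) / (P_on(r2) * P_off(r1))] / (2 * (r2 - r1) * (sigma_on - sigma_off)). The sketch below evaluates this with made-up powers and cross sections; real retrievals must also handle noise, receiver overlap and differential backscatter.

import numpy as np

def dial_number_density(p_on, p_off, r1, r2, sigma_on, sigma_off):
    """Mean gas number density (molecules/m^3) in the cell [r1, r2] from
    the classic DIAL equation, given received powers P at (r1, r2) for the
    on-line and off-line wavelengths and absorption cross sections (m^2)."""
    ratio = (p_off[1] * p_on[0]) / (p_on[1] * p_off[0])
    return np.log(ratio) / (2.0 * (r2 - r1) * (sigma_on - sigma_off))

# Hypothetical returns at ranges r1 = 1000 m and r2 = 1500 m
p_on = (1.00, 0.60)     # on-line power at (r1, r2), arbitrary units
p_off = (1.00, 0.80)    # off-line power at (r1, r2)
sigma_on, sigma_off = 5.0e-22, 0.5e-22   # cross sections, m^2

n = dial_number_density(p_on, p_off, 1000.0, 1500.0, sigma_on, sigma_off)
print(f"{n:.3e} molecules per m^3")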
Raman LIDAR is also used for measuring the concentration of atmospheric gases, and can retrieve aerosol parameters as well. Raman
LIDAR exploits inelastic scattering to single out the gas of interest from all
other atmospheric constituents. A small portion of the energy of the transmitted
light is deposited in the gas during the scattering process, which shifts the
scattered light to a longer wavelength by an amount that is unique to the species
of interest.
Doppler LIDAR is used to measure wind speed along the beam by measuring the frequency shift of the backscattered light. Scanning LIDARs have been used to measure atmospheric wind velocity in a large three-dimensional cone (a conversion sketch follows).
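The conversion is the usual Doppler relation for backscatter: the measured frequency shift is f_D = 2 * v_los / lambda, so the line-of-sight wind speed is v_los = f_D * lambda / 2. A minimal sketch, assuming a 1.5 um coherent wind LIDAR (a typical but assumed wavelength):

WAVELENGTH_M = 1.5e-6  # assumed operating wavelength of a coherent wind LIDAR

def los_wind_speed(doppler_shift_hz, wavelength_m=WAVELENGTH_M):
    """Line-of-sight wind speed (m/s) from the Doppler shift of the
    backscattered light: f_D = 2*v/lambda for a round trip."""
    return doppler_shift_hz * wavelength_m / 2.0

# Example: a 13.3 MHz shift corresponds to ~10 m/s along the beam
print(los_wind_speed(13.3e6))  # ~9.98 m/s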
7-Law enforcement
LIDAR speed guns are used by the police to measure the speed of vehicles for
speed limit enforcement purposes and offer a number of advantages over radar
speed guns.
8-Military
Few military applications are publicly known, as many are classified, but a considerable amount of research is under way on their use for imaging. Higher-resolution systems collect enough detail to identify targets such as tanks. Examples of military applications of LIDAR include the Airborne Laser Mine Detection System (ALMDS) for counter-mine warfare, by Arete Associates. Utilizing LIDAR, THz interferometry and wide-area Raman spectroscopy, it is possible to detect chemical, nuclear, or biological threats at a great distance.
In atmospheric physics, LIDAR is used as a remote detection instrument to measure densities of certain constituents of the middle and upper atmosphere, such as potassium, sodium, or molecular nitrogen and oxygen. These measurements can be used to calculate temperatures. LIDAR can also be used to measure wind speed and to provide information about the vertical distribution of aerosol particles. In nuclear fusion research facilities, LIDAR Thomson scattering is used to determine electron density and temperature profiles of the plasma.
9-Robotics
LIDAR technology is being used in robotics for perception of the environment as well as object classification. Refer to the Military section above for further examples.
10-Transportation
LIDAR has been used in Adaptive Cruise Control (ACC) systems for automobiles. These systems use a LIDAR device mounted on the front of the vehicle, such as on the bumper, to monitor the distance between the vehicle and any vehicle in front of it. In the event that the vehicle in front slows down or is too close, the ACC applies the brakes to slow the vehicle. When the road ahead is clear, the ACC allows the vehicle to accelerate to a speed preset by the driver (a toy sketch of this decision logic follows). Refer to the Military section above for further examples.
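As a toy illustration of that control logic (not any production ACC algorithm), the sketch below keeps a fixed time headway: it brakes when the LIDAR range implies less than the desired headway and otherwise accelerates toward the driver's set speed. All thresholds and gains are invented for the example.

def acc_command(range_m, ego_speed_mps, set_speed_mps,
                desired_headway_s=1.8, max_range_m=120.0):
    """Toy adaptive-cruise decision from a forward LIDAR range reading.
    Returns 'brake', 'hold', or 'accelerate'. Invented thresholds."""
    if range_m < max_range_m:                      # a vehicle is being tracked
        headway_s = range_m / max(ego_speed_mps, 0.1)
        if headway_s < desired_headway_s:
            return "brake"                         # too close: open the gap
        if ego_speed_mps < set_speed_mps:
            return "accelerate"                    # gap is fine, below set speed
        return "hold"
    # Road ahead clear: resume the driver's preset speed
    return "accelerate" if ego_speed_mps < set_speed_mps else "hold"

print(acc_command(range_m=30.0, ego_speed_mps=25.0, set_speed_mps=30.0))   # brake
print(acc_command(range_m=200.0, ego_speed_mps=25.0, set_speed_mps=30.0))  # accelerate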
