Remote-Sensing-Lecture-Notes
Remote Sensing
“The measurement or acquisition of information of some property of an
object or phenomenon, by a recording device that is not in physical or
intimate contact with the object or phenomenon under study”
Lack of contact with object
Sensors
o Collection of data about an object/area/phenomena
Image
Image interpretation
o Art and science
Simplified Information Flow
Sun -> atmosphere -> target -> atmosphere -> sensor -> image ->
interpretation -> application
Remote Sensing (tool)
Remote sensing is a tool or technique similar to mathematics. Using
sensors to measure the amount of electromagnetic radiation (EMR)
exiting an object or geographic area from a distance and then extracting
valuable information from the data using mathematically and statistically
based algorithms is a scientific activity. It functions in harmony with other
spatial data-collection techniques or tools of the mapping sciences,
including cartography and GIS.
In Situ vs. Remote Sensing
Both attempt to observe and/or measure objects and phenomena
In-situ
Physical contact (interaction)
Instruments for direct measure
Ground-reference vs. “Ground truth”
Sensing
Data is collected by sensor not in contact with the object
Passive-collection of reflected or emitted energy
Active-Generates signal and collects backscatter from interaction
with surface
Distance
No single standard
Platforms for sensors operate at multiple levels
Cranes, balloons, aircraft, satellites, UAVs
o Cranes- to observe a surface over time
o UAVs- control over where and when you take a picture, with
no risk to a pilot
Permits near-surface to global scale data collection
Data Collected
Image data
2-D (Spatial perspective)
You can tell when and where the image was taken
o most satellite images are acquired around 10 am because clouds
form later in the day from convection
Application Areas
Land use/land cover mapping
Photogrammetry: obtaining reliable measurements
Useful for city planning
Natural resource inventory and mapping
Measuring forests
Water quality monitoring
Physical/biological oceanographic mapping
Atmospheric monitoring
Tropical storm development
Many others
Advantages
Different perspective
Obtain data for large areas
In a single acquisition- efficient
Synoptic
Systematic
Obtain data for inaccessible areas
Of particular interest to the military
Doesn’t affect/interact with phenomena of interest
Disadvantages
Accuracy and consistency
Inconsistent- cloud cover
Inaccurate- platform is always moving
Artifacts (processing errors)
Scale related
Image is too coarse/detailed
Moving between scales: image + in situ data
High initial outlays for equipment and training
Not many people with the skill
Remote Sensing Process
1. Statement of the problem
Identification of data requirements
2. Data collection
3. Data analysis
Image processing
4. Presentation of information
Maps, charts, statistics, report, graphs, GIS layers
Problem Solving with Remote Sensing
User needs assessment
Remote sensing is a tool, not an end in itself
What are the aims/purpose/goals of the study?
Match technology with user needs
o Operational vs. state-of-the-art system
Established track record
Data accuracy
Data availability
User Needs Assessment- Data Requirements
IF RS is the right approach…
Space/time scales?
What type of sensor?
What type of platform?
What can the user afford?
Scale Considerations
Spatial/Temporal scales and User needs
Spatial scale
Spatial resolution
Temporal scale
Frequency of image acquisition- temporal resolution
o How often do you want that image taken?
What is the application?
Different processes operate at different scales…
o Pollution, Flooding, Fire
Want the images quickly
o Urban development, resource management
Spatial Resolution
Color
You can only see three colors, so a display can only show three colors
There are multiple bands, but you can only display three at a time
True color
Red is red, blue is blue, green is green
Green- reflects green
Resolution- *Four Components of Resolution*
1. Spatial- the size of the field-of-view, e.g. 10 x 10 m
images are a series of numbers
2. Temporal- how often the sensor acquires data, e.g. every 30 days
3. Spectral- the number and size of spectral regions the sensor records data
in, e.g. blue, green, red, near-infrared, thermal infrared, microwave (radar)
4. Radiometric- the sensitivity of detectors to small differences in
electromagnetic energy
Spatial Resolution
Indication of how well a sensor records spatial detail
Refers to the size of the smallest possible feature that can be detected as
distinct from its surroundings
Function of platform altitude and IFOV
IFOV (Instantaneous Field of View)
Size of the area that the sensor “sees” at a given moment
Ground Resolution Element (GRE)
Smallest area resolvable on the ground
GRE=IFOV*H
o H= platform altitude
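As a worked check of that relationship, here is a minimal sketch (Python; the IFOV and altitude values are illustrative, not from the notes):

```python
def ground_resolution_element(ifov_rad: float, altitude_m: float) -> float:
    """GRE = IFOV * H, with IFOV in radians and H in meters.

    Valid for small angles, where the arc length approximates
    the ground footprint directly below the sensor.
    """
    return ifov_rad * altitude_m

# Illustrative values: a 0.1-milliradian IFOV from 705 km altitude
# (roughly Landsat-like numbers, used here only as an example).
ifov = 0.1e-3          # radians
H = 705_000.0          # meters
print(f"GRE = {ground_resolution_element(ifov, H):.1f} m")  # ~70.5 m
```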
Pixel (picture element)
Image= 2-D (square) array of pixels
Pixel size=spatial resolution?
no
Spatial resolution
Spatial resolution ‘germane to task’
Resolution needed for effective detection/analysis of features
observed with sensors
For an object to be detected, its size should be at least 2x the pixel size
Spatial resolution is key to picking out one object from the other
Examples: Land cover mapping
Buildings vs. urban areas
Forest vs. trees
Grain and Extent
Grain (spatial resolution): smallest object distinguishable on image
(detail)
o Spatial resolution, similar to pixel size
o High resolution means great detail
Extent (area covered): area covered by an image
Trade-off (in general):
o Small grain size=small area covered (small extent)
The more detail you are able to see, the smaller the extent
o Large grain size=large area covered (large extent)
Temporal Resolution
The ability to obtain repeat coverage for an area
Timing is critical for some applications
Crop cycles (planting, maximum greenness, harvest)
Catastrophic events
Aircraft
Potentially high
Satellite
Fixed orbit, systematic collection, pointable sensors
You can go back in time
Limited because of clouds
Spectral Resolution
The number and dimension of the specific EMR wavelength regions to
which sensor is sensitive
Broadband: few, relatively broad bands
Hyper-spectral: many, relatively narrow bands
Radiometric Resolution
Ability of a sensor to distinguish between objects of similar reflectance
Measured in terms of the number of energy levels discriminated
2^n, where n=number of 'bits' (precision level)
example: 8-bit data = 2^8 = 256 levels of grey
256 levels =0-255 range
o including zero there are 256 levels
0=black, 255=white
Affects ability to measure properties of objects
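A one-line check of the 2^n relationship (Python; the bit depths shown are just the common cases mentioned in these notes):

```python
# Number of discrete grey levels for a given bit depth: levels = 2**n
for bits in (6, 8, 10, 12):
    levels = 2 ** bits
    print(f"{bits}-bit data: {levels} levels (DN range 0-{levels - 1})")
# 8-bit -> 256 levels (0-255); 12-bit -> 4096 levels (0-4095)
```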
Resolution
Trade-offs
Impossible to maximize all four elements
Meteorological satellites acquire image data daily, but at low spatial
and spectral resolutions
Landsat TM & ETM+ satellites acquire imagery at higher spatial
and spectral resolutions, but lower temporal resolution
Newer high spatial resolution sensors
Choice of each resolution element depends on goals and objectives
Image Interpretation
Image Interpretation
Act of examining images for the purpose of identifying and measuring
objects and phenomena, and judging their significance
Art vs. science
o Art- “judging their significance”
o Science- “measuring objects and phenomena”
Image Interpretation-Tasks
In order of increasing sophistication:
o Detection, Identification, Measurement, Problem-solving
Not necessarily performed sequentially or in all
Detection
Lowest order
Presence/absence of object or phenomena
Examples: buildings, water, roads and vegetation
Identification
More advanced than detection
Labeling or typing of the object/phenomena
Tends to occur simultaneously with detection
Examples: houses, pond, highway, grass/trees
Measurement
Quantification of objects/phenomena
Direct physical measurement from the imagery
Examples: inventories (count), Length, area and height of objects
Problem Solving
Most complex task
Uses information acquired in first three tasks to put objects in
assemblages or associations needed for higher-level identification
With experience, recognition becomes more automatic and tasks
become less distinct
Interpreter Requirements
Vision
Stereoscopic acuity
Color vision
Knowledge of application and environment
Pattern recognition skills
Intelligence and motivation
Performance Tests
Standardized tests for screening
Recognition of basic patterns and shape
See in stereo
Normal color vision
Interpretative Elements
Information contained in image data
Scale dependent and interrelated
Elements: Tone/color, texture, size, shape, pattern, shadow,
height, context
Tone/Color
The most basic elements of interpretation
Represents amount of energy returned from an object/location
Amount of energy returned depends on:
o Reflectance of object
o Sensitivity of sensor
Black and White (panchromatic), tone is greyness level ranging
from
o Black (no return) -> white (high return)
Color imagery- differences in color are based on energy returned in
specific wavelength bands
o Color is described in three dimensions:
Intensity- brightness
Hue- dominant wavelength, “the color”
Saturation- purity of color relative to grey
Texture
Characteristic placement and arrangement of repetitions in tone or
color
o Subtle changes in tone in close proximity
Visual impressions of “roughness” or “smoothness”
o Small tonal changes appear smooth (e.g. calm water)
o Abrupt/coarse tonal changes appear rough (e.g., built up
urban areas)
Scale dependent
o Aggregate of characteristics too small to be detected
individually (e.g., tree leaves)
o At higher resolution, objects may be discerned and
arrangement detected as pattern
Size
Size of object can be clue to identification
Measured in 2 Ways:
o Relative: compare objects within an image
e.g., distinguish apartment buildings from single-family
home, distinguish freeways from residential streets
o Absolute: determine actual dimensions of the feature if scale
of imagery is known OR if size of feature is known, the scale
of the image can be determined
Shape
Certain features have characteristic shapes
Top-down view provides different perspective than profile
Cultural features tend to have regular geometric shapes and distinct
boundaries
Natural features tend to have less regular geometric shapes and
fuzzy boundaries
Pattern
Arrangement of related spatial objects
Combines microimage elements of repetition for recognition of
characteristic phenomena
o Creates an object that is bigger than an individual pixel:
groups of pixels together form a recognizable
feature
Examples: drainage networks, orchards, housing patterns, road
networks
Shadow
On imagery indicates absence of direct illumination
Can aid analysis
o Enhances terrain relief, especially in low contrast scene
o Used to determine height of objects
o Can be used to show shape of objects
Hinders analysis if object of interest is obscured
Height
Similar to size and can be used in both relative and absolute sense
Oblique image data can provide relative measure of height, but
absolute measurements are difficult because scale varies
More rigorous determination of height can be made from stereo
imagery
Context
Highest level of cognition, integrates all other elements of
interpretation
o Site- location of an object in relation to its environment
o Association- location of object in proximity to other objects
Combination of individual objects typically found
together that allows an interpreter to make a higher
order assignment
Interpretation Process
Search procedure and interpretation keys
Image information
Location, date, sensor, spatial resolution
Collateral (ancillary) data
General to specific
Overall impression
Major geographic regions- natural vs. cultural
Approximate area of land cover/use
Grid system to describe location of features in image
Identify specific features
Examination of evidence within image to converge to interpretation
Human vs. Automated Approaches
Humans generally utilize higher-order elements of interpretation (e.g.
context, size, shape) more successfully than computers- qualitative
Computers can quantify multiple levels of image tone more accurately
than humans- quantitative
Computers are less biased, but may be less accurate because of limited
ability for higher order interpretations
Training and equipment
Hybrid approaches
Electro-Magnetic Radiation (EMR) Spectrum
EMR as Information Link
Link between surface and sensor
Energy Flow-Radiation
Energy transferred between objects in the form of electromagnetic
waves/particles (light)
Can occur in a vacuum
Two models-Wave and Particle
Wave Theory
Explains EMR transfer as a wave
Waves travel through space at the speed of light (3 x 10^8 m s^-1)
o Earth to the moon is 1.28 seconds
o Sun to Earth is 8.5 minutes
Electromagnetic Wave
Two components or fields
E=electrical wave
M=magnetic wave
Wave Properties
Amplitude
Height of the wave crest above the undisturbed position
Related to the amount of energy carried by the wave
Wavelength (lambda)
Distance between successive crests or troughs
Generally measured in micrometers (μm, 10^-6 meters) and
designated by lambda (λ)
Frequency
Number of wave forms passing through a given point per unit time
Measured as cycles/second (cycle s^-1) or Hertz (Hz)
o Low frequency- longer wavelengths
o High frequency- shorter wavelengths
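Wavelength and frequency are inversely related through c = λν. A minimal sketch (Python; the example frequencies are chosen for illustration):

```python
C = 3.0e8  # speed of light, m/s

def wavelength_m(frequency_hz: float) -> float:
    """lambda = c / nu: higher frequency -> shorter wavelength."""
    return C / frequency_hz

# Illustrative examples: green light vs. a C-band microwave frequency.
print(wavelength_m(5.45e14))  # ~5.5e-7 m (0.55 um, green light)
print(wavelength_m(5.3e9))    # ~0.057 m (5.7 cm, C-band microwave)
```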
Sum of Ea + ET + ER = Φiλ (the incident flux); normalized, Ea + ET + ER = 1
Ea, ET, ER vary with
o Target and wavelength
Energy absorbed + energy transmitted + energy reflected = 1
o Equals one because of conservation of energy
Nadir
Point directly below aircraft
For a true vertical image: nadir=PP (principal point)
A photo is considered vertical if tilted <=3° from nadir
Blocks of aerial photography compiled into an uncontrolled
photomosaic allow you to put the images together to create a
larger view and to ensure that you are covering all of the area
Height (H)
Altitude of the platform (and camera system) above the terrain
Focal Length (f)
Distance from focal point (lens) to film plane
o Focal length (f), together with flying height, determines image scale
Optical axis
o Line from the focal point to the center of scene
o For a perfectly vertical photo, the optical axis, principal point,
and nadir point will line up
Scale
Vertical Air Photo/Flat Terrain
Similar to map scale
o Length of feature on image : Length of feature on ground
o Representative fraction (RF)
Image and ground lengths must be in the same units
RF=1:10,000 OR 1/10,000
o Can be determined
Knowing actual length of feature visible in image OR
Knowing (H) and (f) AND using the concept of ‘similar
triangle’ (i.e., scale=f/H)
Note About ‘Height’
H= height above the terrain
o E.g., 200 meters above ground level (AGL)
If height is stated in terms of height above sea level (H’)
o Must know height of terrain (h) and adjust H accordingly
o Height above terrain: H=H’ - h
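A small sketch tying scale = f/H and H = H' - h together (Python; the camera and terrain values are hypothetical):

```python
def photo_scale(focal_length_m: float, flying_height_asl_m: float,
                terrain_elevation_m: float) -> float:
    """Scale = f / H, where H = H' - h (height above the terrain)."""
    H = flying_height_asl_m - terrain_elevation_m
    return focal_length_m / H

# Hypothetical example: 152 mm focal length, flying at 3,000 m above
# sea level over terrain at 500 m elevation.
s = photo_scale(0.152, 3000.0, 500.0)
print(f"Scale = 1:{round(1 / s):,}")  # 1:16,447
```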
Flat vs. Variable Terrain
Flat terrain: scale can be determined for entire image
Variable terrain: scale will vary across image
Depending on the amount of variation, an average terrain value
may be used
Image Displacement
Orthographic projection-map
True position
Everything is same throughout the image
Central projection-vertical air photo
Object shifted from true position-displaced
Nadir point-only true position
Away from nadir- camera increasingly ‘sees’ object sides
Relief
Differences in relative elevation of objects in photo
Significant source of image displacement
Relief Displacement
Objects will tend to lean outward, i.e. be radially displaced
The farther the object is from the principal point, the greater the radial
displacement
The farther the object is from the nadir, the farther it will lean.
Example, cooling towers towards the edge of photo show greater radial
displacement
Relief Displacement: Radial distance between an object’s image position
and its true plan (horizontal) position due to differences in object relief
Cause
All objects on a vertical air photo are positioned as though viewed
from the same point
o Camera increasingly ‘sees’ the side of an object the further it
is from nadir- objects appears to ‘lean’
Magnitude
Object height, distance from nadir point, and H
Displacement at nadir=0
Points higher than datum-lean outward
Points below datum-lean inward
Photogrammetry
Art and Science of making accurate measurements by means of aerial
photography
If scale is known or can be calculated
o You can calculate the length, area, perimeter, and height
Height of objects
1. Calculate object Height
Based on relief displacement (d), distance from principal point (r),
& height above ground level (H)
h/H=d/r
o h=(d x H)/r
o H= height of aircraft above ground
o h= height of building
o d= relief displacement measured on the image
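A minimal sketch of this height calculation (Python; the displacement, radial distance, and flying height are hypothetical):

```python
def object_height(d: float, r: float, flying_height_m: float) -> float:
    """h = (d * H) / r, from the similar-triangle relation h/H = d/r.

    d: relief displacement measured on the image
    r: radial distance from the principal point to the object's top
    (d and r must be in the same image units)
    """
    return (d * flying_height_m) / r

# Hypothetical: 2.0 mm displacement at 80.0 mm from the principal
# point, photographed from 1,220 m above ground level.
print(f"h = {object_height(2.0, 80.0, 1220.0):.1f} m")  # 30.5 m
```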
2. Height From Shadow Length
Object height (h) may be computed by measuring its shadow length
o Length of object’s shadow on a horizontal surface is
proportional to its height
o Shadows must fall on open level ground where they are
undistorted and easily measured
o Scale of imagery is known
o Shadow must be cast from true top of object
Based on shadow length (L)
o tan(θ) = opposite/adjacent = height (h) / shadow length (L)
h = L * tan(θ), where θ is the sun elevation angle
o You need to know the sun angle, scale of imagery, and you
need a shadow
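And the shadow-length version (Python; the sun angle and measured shadow length are hypothetical):

```python
import math

def height_from_shadow(shadow_length_m: float, sun_elev_deg: float) -> float:
    """h = L * tan(theta), theta = sun elevation angle above the horizon."""
    return shadow_length_m * math.tan(math.radians(sun_elev_deg))

# Hypothetical: a 40 m shadow (after applying image scale) with the
# sun 35 degrees above the horizon.
print(f"h = {height_from_shadow(40.0, 35.0):.1f} m")  # ~28.0 m
```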
Stereoscopic Viewing
Provides 3rd dimension to air photo interpretation
Identify 3-D form of an object
Stereopairs
Overlapping vertical photographs
Stereoscopes
Used to create synthetic visual response by forcing each eye to look
at different views of same terrain
Gives perception of depth (3-D)
Stereo Viewing
Parallax
Apparent change in relative positions of stationary objects
Caused by change in viewing position
Example- looking out car window (side)
Parallax- Air photo
Caused by taking photographs of the same object from different
positions => relative displacement
Relative Displacement
Forms the basis of stereoscopic depth perception and height measurement
Photographic Requirements
Photos must be taken at same altitude, along the same flight line
Overlap and Sidelap
Equipment for viewing in stereo
Stereoscopic Alignment
Alignment of air photos
Lined up along the flight line in order of PP, CPP, CPP, PP
(principal and conjugate principal points) in a straight line
Overlap toward center
Amount of separation varies depending on type of photos
In northern hemisphere, photos are arranged with shadows towards
the observer (south at top)
Stereoscopic Parallax Principle
Change in position of an image of an object from one photograph to the
next caused by the aircraft’s motion is the x-parallax.
All objects in the scene at exactly the same height will have an identical
amount of x parallax
x-parallax is directly related to the elevation of the point above the
terrain
o greater for high points than for low points
Parallax Height Measurement
Stereoscopic-based measurement (calculating height):
h_o = (H - h) x (dP / (P + dP))
o h_o = object height; H - h = flying height above the terrain;
dP = differential parallax; P = absolute stereoscopic parallax
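A sketch of the parallax formula as written above (Python; the parallax measurements are hypothetical):

```python
def height_from_parallax(flying_height_agl_m: float, dP: float, P: float) -> float:
    """h_o = (H - h) * dP / (P + dP), using the flying height above the
    local terrain (H - h), the differential parallax dP, and the absolute
    stereoscopic parallax P (dP and P measured in the same units)."""
    return flying_height_agl_m * dP / (P + dP)

# Hypothetical: 1,200 m above terrain, P = 90.0 mm, dP = 2.3 mm.
print(f"h = {height_from_parallax(1200.0, 2.3, 90.0):.1f} m")  # ~29.9 m
```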
Remote Sensing Platforms
Types of Platforms
Ground Based
Hand-held/cranes
Airborne
Stationary
o Captive/tethered balloons
Aircraft
o Manned and unmanned
Satellite
Lighter-than-air
Free floating balloons
o Stable, but restricted by meteorological influences
o Used to acquire meteorological/atmospheric data
Blimps/dirigibles
o Major role-news media/advertisers
Helicopters
Can pin-point locations
Lack stability (vibration)
Aircraft
Platform most often used to acquire aerial imagery
Requirements:
o Requisite speed
o High rate of climb
o Stability in flight
o Unobstructed view for navigation and identification of
landmarks
o Range commensurate with size of project
o Ceiling higher than highest altitude specified
o Capable of remaining in air long enough to take advantage of
suitable photographic time
o Can accommodate equipment
Low altitude Aircraft
o Generally operate below 20,000 ft.
o Most widely used are single engine or light twin engine
o Imagery can be obtained by shooting out the window or
placing camera mount on window or base of aircraft
o Suitable for obtaining image data for small areas (large scale)
High Altitude Aircraft
o Operate above 20,000 ft
o Includes jet aircraft with good rate of climb, maximum speed
and high operating ceiling
o Stable
o Acquire imagery for large areas (smaller scale)
o E.g., NHAP, NAPP, AVIRIS
Advantages of Aircraft
o Acquire imagery under suitable weather conditions
o Control platform variables such as altitude
o Time of coverage can be controlled
o Easy to mobilize
Disadvantages of Aircraft
o Expensive- primarily cost of aircraft
o Less stable than spacecraft
Drift off course
Motion blurring
NAPP- National Aerial Photography Program
o Standardized set of cloud-free aerial photographs covering
the conterminous U.S. (not including Hawaii or Alaska)
Five-to seven year cycles since 1987
Altitude of 20,000 feet
o Most recent and consistent source of high-quality aerial
photography
o Each photo:
Is centered on one quarter of a 7.5-minute USGS
quadrangle
Covers approximately a 5.5 x 5.5 mile area
o Available in black and white (B/W) or color infrared (CIR)
Green, red, infrared because blue scatters, which will
blur the image.
Only the vegetation looks different
Spacecraft
Numerous programs
Manned and unmanned systems
Range
o Range for spacecraft is determined by orbit, which is fixed in
altitude and inclination
Sun synchronous-near polar; cross equator at
approximately same local time each day
Geostationary-fixed orbit over equator; primarily
meteorological systems
As the earth rotates, the satellite rotates with it.
Advantages of Spacecraft
o Stable
o Constant altitude and attitude
Attitude- up, down, back and forth, side to side
o Generally insignificant relief displacement
Insignificant because it is so high up
o Repetitive coverage
Useful for looking back in time
Disadvantages of Spacecraft
o Expensive to launch
o Support programs are expensive
o Cannot control time of imaging
o Atmospheric effects may be greater than for aircraft platform
It’s higher, so it has to go through more atmosphere
Photographic Concepts
Exposure: total amount of light striking the film and/or Charge-Coupled
Devices (CCD)
Shutter speed should be fast, but still let in enough light
Film/CCD exposure primarily a function of :
Scene brightness
Diameter of lens opening
Exposure time
Relative Aperture
Diameter of lens opening determined by aperture setting =f/stop
Defined as ratio of lens focal length (f) to lens opening diameter (d)
Focal length also affects how much light is coming through
Important for speed of the lens
Examples:
f=50mm, d=12.5mm => f/stop = 50/12.5 = f/4
f=50mm, d=25mm => f/stop =50/25 = f/2
F/stop, diameter and exposure
Focal length generally fixed, so f/stop varies as function of the diameter
of the aperture
Inverse relationship
smaller f/stop -> wider lens opening and higher exposure
larger f/stop -> smaller lens opening and less exposure
Shutter Speed
Interval of time the aperture is open
Slow shutter speed- blurry
Fast shutter speed- more clear
Expressed as, e.g., 1/500 sec.
To maintain constant exposure: a change in f/stop must be offset by a
corresponding change in shutter speed (e.g., a smaller aperture requires
a longer exposure time)
Field of View
Want one fixed field of view
Greater the camera lens angle of view, the greater amount of terrain
photographed at a constant altitude above ground level
Aerial Cameras –Frame Camera
High geometric image quality
Characteristics include:
Film rolls 100-500 ft in length
Large film format (9 in x 9 in, 23 cm x 23 cm)
Low f/stop- fast shutter speed
Film- can only be used to look at 3 portions of the electromagnetic
spectrum
o Two major types:
True color or color infrared
Remote Sensing Platforms II
Aerial Support Hardware
Used to improve quality of imagery by
Reducing effect of platform motion
Keeping attitude constant
Image motion compensator
Moves film in same direction as aircraft at speed proportional to
aircraft velocity
Gyro Stabilization
Stabilizes camera within plane to keep it pointing at nadir
Adjusts orientation of camera if attitude of plane shifts
Aerial Cameras- Digital
4-chip camera
Use 4 separate full-frame CCD’s
Each sensitive to different wavelength
Single chip camera
Uses single full-frame CCD
Filter is placed over each pixel to capture red/green/blue
information
Color Theory
Primary colors
Red, green, and blue
o Pairwise combinations make cyan, magenta, and yellow
Combination of all three colors make white
Color characteristics
Hue- dominant λ (color)
Saturation- purity of color
Lightness- light/dark
Additive process: based on ‘light’
TV, Computer Monitors
o Each color gun’s intensity in each pixel is modulated based on
the amount of primary color present in the scene
Subtractive process:
Based on ‘pigments/dyes’
Cyan, magenta, yellow
Filters
Most aerial photography is collected with filters on the camera lens
Block certain wavelengths of light from reaching film plane and exposing
film
Subtracts some of the light reflected from the scene
Example
Yellow filter: absorbs blue (haze)
o Passes green and red
Image Contrast Enhancement
Image Contrast Enhancement
Many materials, both natural and man-made, have similar reflectance
through the electromagnetic spectrum
Sensors
o Detectors must be able to sense a wide range of values-snow
to dark volcanic basalt- without becoming saturated
o For sensors with high radiometric resolution a decrease in
contrast may be required if the image DN range exceeds that
of the display
Utilize full range of video display capabilities
o Imagine an 8-bit display: 256 grey-scale levels
o Contrast enhancement is done for display purposes only
Does not affect actual pixel or DN values
Contrast Enhancement
Selection of a contrast enhancement algorithm depends on:
Sensor radiometric resolution
Nature of the original (raw) histogram
Elements of the scene of greatest interests to user
Linear Contrast Enhancement
Min-max Contrast Stretch
DNout = ((DNin - DNmin) / (DNmax - DNmin)) x Quant
Makes full use of range of output device
Quant is the range of brightness values that can be displayed (i.e., 255)
Works best with data that have Gaussian (normal) distribution
Expands the image DN range to fill the dynamic range of the display
device (e.g., 0-255)
Sensitive to outliers (single pixels that may be atypical and outside the
normal DN range)
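A minimal sketch of the min-max stretch above (Python with NumPy; the input DN range is made up):

```python
import numpy as np

def minmax_stretch(dn: np.ndarray, quant: int = 255) -> np.ndarray:
    """Linear min-max stretch: expand the image DN range to fill
    the 0..quant dynamic range of the display device."""
    dn = dn.astype(np.float64)
    dn_min, dn_max = dn.min(), dn.max()
    out = (dn - dn_min) / (dn_max - dn_min) * quant
    return out.round().astype(np.uint8)

# Made-up image whose DNs only span 40..120 out of 0..255.
img = np.array([[40, 60], [100, 120]])
print(minmax_stretch(img))
# [[  0  64]
#  [191 255]]
```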
Percentage Linear Stretch
Uses specified Min and Max values that lie in a certain percentage
of pixels from the mean of the histogram
A standard deviation from the histogram mean is often used –
standard deviation stretch
Nonlinear Contrast Enhancement
Histogram Equalization
Assign an approximately equal number of pixels to each of the user-
specified gray scale classes
Greatest contrast is applied to the most populated range of DNs, so the
least contrast is applied to the tails of the histogram.
o Determine number of output classes and approximate
cumulative percent
o Determine frequency of each unique DN value
o Determine cumulative probability for each DN value
o Using cumulative probability of the DN, assign it to output
whose cumulative percent most closely matches the
cumulative probability
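Those steps map directly onto a cumulative-distribution computation. A compact sketch (Python with NumPy, assuming 8-bit data; not tied to any particular package's implementation):

```python
import numpy as np

def histogram_equalize(dn: np.ndarray, levels: int = 256) -> np.ndarray:
    """Histogram equalization for 8-bit data: map each DN through its
    cumulative probability so grey levels carry roughly equal pixel counts."""
    hist = np.bincount(dn.ravel(), minlength=levels)
    cdf = hist.cumsum() / dn.size          # cumulative probability per DN
    return (cdf[dn] * (levels - 1)).astype(np.uint8)

# Made-up low-contrast image concentrated in a narrow DN range.
img = np.array([[100, 100, 101], [101, 102, 103]], dtype=np.uint8)
print(histogram_equalize(img))  # values spread across 0-255
```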
Gaussian
Transforms histogram to a Gaussian (bell-shaped) distribution
High and low ends of the distribution tend to be strongly enhanced
Intermediate DN values change relatively little
Log Stretch
Maximize contrast in dark part of the histogram
Inverse Log Stretch
Maximize contrast in brightest part of the histogram
Multi Spectral Systems
Terminology
Pixel
One array of numbers per band
Abbreviation of “picture element”
Smallest 2-D unit of an image
Value
o “Brightness value” (BV)
o “Digital Number” (DN)
Location (x,y)
o Column # (x)
o Row # (y)
Quantization
Conversion of electrical signal to digital number
Radiometric resolution of signal
Typically in range of 8-12 bits
8-bit data range from 0-255
12-bit data range from 0-4095
ASPRS Guide to Land Imaging Systems
Civil land imaging satellites- resolution of 59 meters or finer
Optical large number in orbit, over 50 countries
o Two major resolution groups
20 high resolution systems (0.5 to 1.8 meter)
24 mid resolution systems (2.0 to 39 meter)
Radar about 10 in orbit, 18 countries
Coverage capabilities
Hi-res swaths- 8 to 28 km
Mid-res swaths – 70 to 185 km
A few privately funded systems in orbit
US and Israeli
o Hi-res military market
o RapidEye of Germany
Broad area of applications with a 5 micro-satellite
constellation
Planned European satellites
“Dual Purpose”-data will serve both military and civil users.
o How tasking will be shared has not been revealed
Popular Data in the U.S.
Landsat
Multi-spectral Scanner (MSS)
Thematic Mapper (TM)
Enhanced Thematic Mapper (ETM+)
Operational Land Imager (OLI)
SPOT
ASTER- Advanced Spaceborne Thermal Emission and Reflection Radiometer
MODIS- MODerate resolution Imaging Spectroradiometer
Quickbird
Ikonos
Landsat System
Landsat Program
Longest running program
Multi-spectral image data from space
Focus on land resource applications
Satellite
Weight 5000 lbs.
Length 14 ft.
Width 9 ft.
Orbit
Sun-synchronous, near polar
~705 km altitude
9:42 equator crossing
o images are acquired in the morning, before convective clouds build up
Each orbit ~99 minutes
14 orbits per day
Repeat coverage-every 16 days
Landsat Worldwide Reference System
Location over earth catalogued by WRS path/row
Each scene covers 185 km (wide) by 170 km (long)
Sensors
Return Beam Vidicon (RBV): Landsat 1-3
Multispectral Scanner (MSS): Landsat 1-5
Thematic Mapper (TM): Landsat 4&5
Enhanced Thematic Mapper + (ETM+): Landsat 6&7
OLI-Operational Land Imager – Landsat 8
LDCM Landsat Data Continuity mission- launched February 11,
2013
Landsat- Multispectral Scanner (MSS)
Longest collection of data by Landsat (1-5)
Optical-mechanical scanning system
Oscillating mirror scans across scene with width of 185km
Light from scene is reflected through focus optics
Filters separate beam into separate wavelength bands
Spectral/Radiometric Properties
Spectral resolution
o 0.5-0.6μm green
o 0.6-0.7μm red
o 0.7-0.8μm near infrared
o 0.8-1.1μm near infrared
Radiometric resolution
o Landsat 1-3: 6 bit
o Landsat 4-5: 8 bit
IFOV is 79 meters
Pixel size: 56 x 79 m
Pixel size and spatial resolution are not the same
Sampling time
Temporal Resolution
Landsat 1-3: 18 days
Landsat 4-5: 16 days
Routine operation ceased in 1992
Landsat- Thematic mapper (TM)
Introduced on Landsat 4 (1982)
Improvement over MSS:
Spectral-extended spectral region- visible, NIR, mid-IR and thermal
o 7 bands vs. 5
Spatial- 30m vs. 79m (120m for thermal)
Radiometric- 8-bit vs. 6-bit
Temporal- 16 day (Landsat 1-3, 18 day)
Landsat TM 4 & 5
Sensor characteristics
Type of detector
(Different material for different wavelengths)
VIS-NIR
o Silicon
MIR
o Indium antimonide
TIR
o Mercury-cadmium-telluride
Spectral sensitivity
1. 0.45-0.52 μm |Blue| water penetration, cultural features, smoke
plumes, atmospheric haze
2. 0.52-0.63 μm |Green| Measure peak green reflectance, cultural
features
3. 0.63-0.69 μm |Red| Chlorophyll absorption region, plant species
differentiation, reduced atmospheric effects
4. 0.76-0.9 μm |NIR| Vegetation type, biomass, land/water
boundary discrimination
5. 1.55-1.75 μm |MIR| moisture content, snow vs. cloud
discrimination
7. 2.08-2.35 μm |MIR| Vegetation moisture content, rocks and
mineral types
6. 10.4-12.5 μm |TIR| Thermal mapping and vegetation stress, soil
moisture
Landsat 7- Enhanced Thematic Mapper +
ETM+ introduced on Landsat 7 (1999)
Similar to Landsat TM 4 & 5
Optical bands (1-5 & 7)
30m spatial resolution (bands 1-5 & 7)
Temporal resolution (16 day)
Radiometric resolution – 8 bit
Thermal band 6 improved from 120m to 60m
New 15m Panchromatic band added (band 8)
0.52 to 0.9 μm
Scan line corrector failed on May 31, 2003
Landsat 8- OLI
Launched February 2013
Operational Land Imager
Geometric Corrections
where does the image show up?
Image Restoration
Remove/correct imagery for internally and externally caused distortions
and degradations
Preprocessing to avoid effects of distortions in later processing
results
Two major types
Radiometric
o Errors in the digital number values
Internal and external
Geometric (x,y coordinates)
o Distortions in the image
Internal geometric errors
Nonsystematic
Geometric Corrections
Process of Rectification, georeferencing, and registration
Removal of systematic distortions
o Requires model of systematic distortions
o Requires platform ephemeris data (altitude and attitude)
Usually done by vendors
After Systematic Corrections TM data may be off from 5
to 10 pixels due to topography
o Still not planimetric
Removal of unsystematic distortions
Systematic Distortions
Earth rotation
Scan skew
Platform velocity
Perspective
In general systematic distortions:
Corrected using data from platform ephemeris and knowledge of
internal sensor distortion
o Information from the sensor itself
Commercial vendors
Earth Rotation
Fixed orbits
Earth rotates underneath
Skews the geometry of the imagery collected
o Effect is greatest at the equator, where the surface rotation speed is highest
The sensor takes time to scan/dwell, and during that time the earth rotates
By the end of the frame, the ground point is farther west than when imaging began
Bottom of images must be offset to the west
Systematic Geometric Distortions
Scan Time Skew
Time to scan a swath, but sensor is moving forward
o Scan line is advanced compared to beginning of the scan
Unsystematic Distortions
Altitude
Topography
Attitude
Aircraft/spacecraft movement
o Images get distorted when the aircraft or spacecraft moves
Requires rectification using ground control points to remove
Platform Velocity, attitude, or altitude variation
Velocity and altitude
o Affects scale
Attitude causes distortion
Compression
Expansion
Unsystematic Corrections
Image to Map rectification
Ground reference data (GPS or a map)
Image to Image registration
Time series
Corrections of Unsystematic errors
Steps (i.e., image to map coordinates, image to image)
o Select Ground Control Points (GCP)
GCP-location on the surface of the Earth (e.g., a road
intersection) that can be identified on the imagery and
located accurately on a map
Well-defined, spatially small
Well-distributed across image
Entire scene
Number depends on the order of fit used
o Compute and test coordinate transformation
o Generate new output image
Selecting GCP’s
On the ground
Have the image in hand
Differentially Corrected GPS
Problems/limitations
o Clouds affect GPS
o Expensive to keep people on the ground
o Find the locations (one of the hardest)
Need the locations to be spread out
Geometric Correction
Compute and test coordinate transformation
Order of Fit
o GCP’s matching :
Where it actually is and where it thinks it is
RMSE(Root Mean Square Error)
o Difference between input (source) location of a GCP and
retransformed location for the same GCP
Typically want less than half a pixel
Linear Transforms (1st Order)
Create a whole new image
Create a new grid
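A sketch of what a first-order (affine) fit and its RMSE look like numerically (Python with NumPy; the GCP coordinates are invented for illustration):

```python
import numpy as np

# Hypothetical GCPs: image (col, row) vs. map (x, y) coordinates.
src = np.array([[10, 10], [200, 15], [25, 180], [210, 190]], float)
dst = np.array([[500100, 4200350], [500950, 4200330],
                [500160, 4199600], [500990, 4199560]], float)

# First-order (affine) transform: [x, y] = [col, row, 1] @ coeffs
A = np.hstack([src, np.ones((len(src), 1))])
coeffs, *_ = np.linalg.lstsq(A, dst, rcond=None)

# RMSE: distance between the transformed and reference GCP locations.
pred = A @ coeffs
rmse = np.sqrt(np.mean(np.sum((pred - dst) ** 2, axis=1)))
print(f"RMSE = {rmse:.2f} map units")  # typically want < half a pixel
```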
Generating New Output Image
Calculate Pixel Values in New Image
Intensity Interpolation
1) Nearest Neighbor
2) Bilinear Interpolation
3) Cubic Convolution
1) Nearest Neighbor Resampling
Retains original values
Suitable for thematic data
Easiest
May create a stair-step appearance and discontinuous linear features
2) Bilinear Interpolation
Calculates new pixel value by interpolating brightness values in two
orthogonal directions, weighted by distance
Smoother, more spatially accurate
Used when changing cell size
Cannot be used with thematic data
3) Cubic Convolution: similar to bilinear interpolation but uses the 16 closest pixel values
Best match to original image statistics
o Better to resample just once: the more times you resample, the
more you alter the original image
Depending on data may sharpen or smooth noise
Most computationally intensive
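Of the three, nearest neighbor is the simplest to sketch (Python with NumPy; the toy class map is invented to show why original values are preserved):

```python
import numpy as np

def nearest_neighbor_resample(img: np.ndarray, out_shape: tuple) -> np.ndarray:
    """Resample by copying the nearest input pixel value; no new
    values are created, so class/thematic data remain valid."""
    rows = np.linspace(0, img.shape[0] - 1, out_shape[0]).round().astype(int)
    cols = np.linspace(0, img.shape[1] - 1, out_shape[1]).round().astype(int)
    return img[np.ix_(rows, cols)]

# Made-up 2x2 class map resampled to 4x4: values repeat, never average.
classes = np.array([[1, 2], [3, 4]], dtype=np.uint8)
print(nearest_neighbor_resample(classes, (4, 4)))
```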
GIS-RS Integration
Introduction
Objective
To summarize the main components and functional attributes of
GIS
Explain how remotely sensed data may be integrated in a GIS or
GIS data may be applied to assist with image processing operations
GIS
A system to encode, store, manage, manipulate, transform,
analyze, and display spatial data from the real world for a specific
application
o Very poor with time
Inter-relationships
Remote sensing- GIS- Cartography
GIS Components and Functionality
GIS data capture
Data entry: digitizing, scanning, GPS data collection, and derived
products (remote sensing)
Geographic Data
o X,Y location coordinates
o Z- non-location attributes
o 4 types of data
points, lines, polygons, and surfaces
Data structure
o Vector
Cartesian Coordinates
Topological
o Raster
Traditional raster (grid)
Point-pixel
Not the size of point, but the size of the
pixel
Line- string of pixels connected together
Polygon- contiguous group of the same type of
pixels
Two Ways to Visualize Data of the World
Raster- Grid
“Pixels”
a location and value
satellite images and aerial photos are already in this format
Vector-line
Points, lines, and polygons
“features” (house, lake, etc.)
o Attributes
Size, type, length, etc.
GIS Raster datasets
Traditional grid/matrix based
Interchangeable with remotely sensed data
o Standard image data structure (e.g., GeoTIFF)
Image with X,Y locations
Examples
o USGS LULC (Land-use and land cover) @ 1:250,000
o USGS DEM (Digital Elevation Model)
Conversions between types
Vector to Raster
o Most straightforward and repeatable
o If the pixel size is small, the transformation can look very similar
to the original
The bigger the pixels, the poorer the result looks
Raster to Vector
o Can be problematic
o Grid size
Integration of Remotely Sensed Data and GIS Data
Data need to be geometrically rectified and registered to a suitable
projection and coordinate system
Time as an attribute
Single date, multi-temporal or change image
Useful in change detection
Compatible File Formats
Vector raster
Generating GIS Layers with Remote Sensing
Generating GIS Layers with RS/IP
“heads-up digitizing”
o Majority Filter
Changes the minority color into the majority color
Can’t have mixed polygons
Classification results
Updating GIS layers with RS/IP
Currency
o Provides coverage over large spatial areas quickly
Editing
o Use old map and update that with new remotely sensed data
Merging GIS and Image Processing
GIS data in support of image correction
Radiometric
o Can be used to find invariant features
Geometric
o Can use GCP’s from a GIS to register image and GIS layer
GIS data in support of image classification
Visual integration via overlay
Incorporating GIS Data into a Classification
Purpose- to improve classification results
To aid in cluster busting (i.e., confused classes)
Types of data
Elevation
o Slope, aspect
Geology
Soils
Political boundaries
Vegetation map
Incorporating GIS data in Class
Possible uses of data-
Delimiting study area (e.g., park boundary)
Potential training sites (i.e., supervised classification)
Geographical stratification
o Elevation, slope, soil type, vegetation type
As an additional band
Reference data set for error analysis
DEM Overlay
Correct topography
Differences in height
Raster (grid)
GIS Operations with Grid based data
Boolean Model overlay
o Adding numbers together
o Numbers- land cover type
Hope that something numerically comes out correct
Area Calculations Overlay
o Count pixels per class and multiply by the pixel area (resolution x resolution)
Search Radius Aggregations
GIS Operations with Remotely Sensed Data
Remotely sensed data processed or analyzed by:
Attribute Re-coding
o Changing the values of the data (i.e., thematic outputs)
Changing the number to whatever you want it to be
For comparing different data
The number represents a theme or class
Re-scaling
o Changing the resolution (degrading, MMU)
MMU- Minimum Mapping Unit
Smallest feature that will be represented on a map
Weighting
o Making values more or less important
GIS Algebra
o Treats map values as variables that can be transformed or
combined to create new layers
Focal, zonal, regional operations
Compute each location's new value as a function of the existing
values within a specified distance
Majority Filters, Clump, Sieve
o Majority Filter
Majority is used because average classified data does
not mean anything
o Clump
Looks in all directions around pixels of a class (e.g., red)
and groups contiguous pixels (the first group is clump 1)
Starts in the upper-left-hand corner and works row by row
A separate group of the same class (red) becomes a
different clump
This is why you recode back to classes afterwards
Clump numbers run from black to white because numbering
starts from the upper rows
Better suited for conversion to vector data
o Sieve
MMU- smallest unit you can map
Regions smaller than the MMU lose their classification
(become 0/unclassified)
Then they are incorporated into the pixels around them
Overlay Models
Compare data sets
Biophysical Remote Sensing Vegetation
Biophysical remote Sensing
Application of physical principles and methods to biological problems
Estimating or inferring information about:
Environmental structures (e.g., plants, landforms);
Processes (e.g. evapotranspiration, plant-growth, CO2 flux)
o Related to photosynthesis
9 Fundamental variables
o (x,y) location, topographic-bathymetric (x,y,z) elevation,
color spectral signature of features, soil moisture, surface
temperature, surface roughness, vegetation – chlorophyll
absorption (photosynthesis), biomass (how much vegetation),
and moisture content
Spectral Signature of Vegetation
70% of the Earth’s land surface is covered by vegetation
When sunlight strikes plant leaves, chlorophyll strongly absorbs visible
light (from 0.4 to 0.7 μm) for use in photosynthesis
The cell structure of the leaves strongly reflects near-infrared light
(from 0.7 to 1.1 μm)
The more leaves a plant has, the more these wavelengths of light
are affected
Use this information for modeling
Yellow and red indicate vegetation stress; red indicates more stress
than yellow, and brown means dead.
Using this Information
Variations in Red absorption and NIR reflectance
Indicator of health of vegetation
Use this information to characterize vegetation over time and space
Spectral vegetation indices
o Dimensionless, radiometric measures that function as
indicators of relative abundance and activity of green
vegetation
Spectral Vegetation Indices
Reduces the spectral signature to a single, quantitative, value
Generally based on red-NIR bands and differences in response
Most ratio types are related, (same information can be derived)
Simple Ratio (NIR/RED)
Use NIR and Red because it is higher in the NIR and absorbing in
the red
High value => lots of vegetation
Very low Value => water
o Absorbs NIR
Varies with
o Types of vegetation
o Amount of vegetation
o Radiometric resolution
NDVI mostly widely used
(NIR-Red)/(NIR+Red)
gives a standardized value that is independent of
radiometric resolution
> 0.1 indicates vegetation & < 0.1 indicates no vegetation
o Vegetation values are positive; a negative NDVI means red
reflectance is greater than NIR (e.g., water)
Displayed in black and white because it is just 1 band
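Both indices reduce to one line each. A sketch (Python with NumPy; the reflectance values are invented):

```python
import numpy as np

# Invented red and NIR reflectance for three pixels:
# dense vegetation, bare soil, water.
red = np.array([0.05, 0.20, 0.10])
nir = np.array([0.50, 0.25, 0.05])

simple_ratio = nir / red              # high for vegetation
ndvi = (nir - red) / (nir + red)      # bounded to [-1, 1]

print(simple_ratio)  # [10.    1.25  0.5 ]
print(ndvi)          # [ 0.818  0.111 -0.333]  water comes out negative
```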
Others
Relationship of SVI’s to Biophysical Parameters
Relative measure of abundance, health, stress
Veg/no veg masks
Percent cover
Empirically related to biophysical variables:
Biomass
Leaf-area-index (total leaf area, one side/unit ground area)
o All leaves that project into a column of space about an area
on the ground are measured
PAR-Photosynthetically Active Radiation
MODIS/Terra Leaf Area Index
Not very detailed but takes images everyday
Max range
o 0-10
Related to total biomass of vegetation
Influenced by the amount of reflecting soil between plants
Using Biophysical Data
Must be preprocessed
Geometric and Radiometrically corrected
Continuous surfaces
Everything has a temperature, so thermal data exist everywhere
Radiance/Reflectance values used directly as input to biophysical models
Multiple scales (leaf, plant, forest)
Biophysical Applications
Global change issues
Landscape pattern and process
Distribution and areal extent of terrestrial ecosystems
Measure biomass density of terrestrial ecosystems
Vegetation change
Measure change in distribution, abundance and diversity of
vegetation
Biogeochemical cycling processes
Cycling of carbon
Quantify knowledge of production and decomposition
*Calculate GPP (Gross Primary Production)*
Temporal Characteristics of Vegetation
Timing is very important when investigating vegetation
Phenological (growth) cycle
Crop cycles
o Different crops grow at different times
Seasonal
AVHRR NDVI composites
More generally, seasonal change appears each year with the “greening”
The leafing of trees results in whole regions becoming dominated by
active vegetation
Change Detection Mapping
Multi-Temporal Analysis
Time can provide additional information
Change detection- why useful?
Allows you to go back in time
Cities, agriculture, forest, water
Generally refers to change in land cover over time
Use it to monitor crops throughout the year
Can be applied to other phenomena
Steps- Change Detection
1. Define study area
2. Determine temporal scale for change
3. Select appropriate classification system
4. Minimize effect of environmental considerations
5. Acquire image and ancillary data
6. Preprocess data- geometric and radiometric registration
Important because images have to be similar
If images aren't lined up, features show up twice
7. Select change detection algorithm
8. Compute area and type of change
9. Assess accuracy
Change Detection Considerations
Step 3: Select classification scheme
Classes compatible with remote sensed data
Standardized scheme
Example: USGS Anderson Land Use/land Cover
o USGS – Level 1 Categories
Suitable for use with coarse resolution satellite imagery-
MODIS, AVHRR
Step 4: Resolution Characteristics
Should be consistent across all dates of imagery
o Want to maintain resolutions
o Change detection measures changes on the ground, they are
not errors in the data
Temporal
o Anniversary date imagery
Limits effects of illumination differences
Limits effects of seasonal and phenological differences
Spatial
o Same pixel size
o Accurate spatial registration (RMSE <=0.5 pixel)
Spectral
o Same bands or best approximation
Radiometric
o Same radiometric resolution
Step 5: Minimize environmental impacts:
Atmosphere:
o Cloud cover
o Haze, thin clouds, humidity can alter spectral signatures
o Anniversary date imagery can minimize seasonal weather
variation
Soil Moisture
o Conditions should be same for both dates
o May be necessary to review rainfall levels, especially prior to
collection
o Imagery may be stratified to adjust for localized effects
Phenology
o Terrestrial and aquatic ecosystems
o Man-made development cycles
Urban-Suburban
o States: undeveloped -> landscaping
o Some classes may be spectrally similar
Change Detection Algorithms
Band overlay using images from 2 dates
Place each image in different color plane of image display
Provides a visual display of changes
o Find areas that have changed
Limitations
o Not quantitative
o No “from-to” change class information
Good for calculating the area of change
Multi-date Composite Image
Rectify images
Create single data set (composite)
Detect change
o Classification of all bands
o Areas of change will form their own spectral classes
PCA (Principal Component Analysis)
Regression Line
1 less than total input bands
Each component is orthogonal to the previous one
May be difficult to label change classes
Can provide quantitative values of area changed
Change Detection Algorithms –Image Algebra
Band Ratioing
Ratio same spectral band for 2 different image dates
o Values near one indicate no change
Values well below or above one indicate change
Image Differencing
Subtract same spectral band for 2 different dates of images
o Values near zero indicate no change; values well above or below zero indicate change
Both techniques require setting change/no change thresholds
Doesn’t provide information on type of change
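Both image-algebra methods and their thresholding step can be sketched together (Python with NumPy; the two "dates" and the threshold are invented):

```python
import numpy as np

# Invented co-registered DN values (same band, two dates).
date1 = np.array([[50, 60], [80, 120]], dtype=np.int16)
date2 = np.array([[52, 95], [78, 60]], dtype=np.int16)

diff = date2 - date1    # values near 0 -> no change
ratio = date2 / date1   # values near 1 -> no change

# Analyst-chosen threshold (here: |difference| > 20 DN flags change).
change_mask = np.abs(diff) > 20
print(diff)
print(change_mask)  # flags the pixels that changed, but not the type
```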
Change Detection Algorithms
Post- classification comparison
Classify images from both dates
Change is determined from classified images
o Calculate number of pixels that changed and get the area of
change
Provides “from-to” information
Dependent on accuracy of classification and any errors will impact
change detection
o Both pre and post
Ancillary Data as Source for Date 1
Use existing land cover data in place of remotely sensed image for
one date
Classify image for second date and compare
Depends on quality of classification and ancillary data
Provides “from-to” classification information
Manual on-screen digitizing
Use standard photo interpretation techniques
Use linked images
Analyst digitizes changes on screen
Ex: open source mapping
Write Function Memory Insertion
Qualify the area of change
Values that changed will come out in red or blue, and values with
no change will come out in purple
Gives you a visual area of change
Accuracy of all other things
“from-to” and quantifies the area
Thermal Infrared (TIR) Remote Sensing
Introduction
Thermal infrared energy is emitted from all objects that have a
temperature greater than absolute zero
Everything we encounter emits thermal infrared electromagnetic
radiation
We sense thermal energy primarily through touch
Reflective infrared (0.7-3.0 μm)
Thermal infrared energy (3.0-14 μm)
Detectors that are sensitive to thermal infrared radiation
Current detector – mercury-doped germanium (Ge: Hg), indium
antimonide (InSb)
Cooling
Kinetic Heat, Temperature, Radiant Energy and Radiant Flux
Kinetic heat
Energy of particles of molecular matter in random motion
o Measured using a thermometer (in contact with the object)
Object’s internal kinetic heat is also converted to radiant energy
Radiant flux (Φ)
Electromagnetic radiation exiting an object
Concentration of the amount of radiant flux exiting (emitted from)
an object is its radiant temperature (Trad)
Generally there is a high, positive correlation between the true kinetic
temperature of an object (Tkin) and the amount of radiant flux radiated
from the object (Trad)
Radiometers placed some distance from the object to measure its
radiant temperature
This is the basis of thermal infrared remote sensing
The relationship is not perfect, Trad always being slightly less than
the Tkin of the object.
o How different they are depends on the emissivity
Different objects have different emissivity
Emissivity
Emissivity, ε, is the ratio between the radiant flux exiting a real-world
selective radiating body (Fr) and that of a blackbody at the same temperature
(Fb):
ε = Fr/Fb
o what the object actually emits relative to a blackbody
Blackbody is theoretical, so emissivity is from 0 to 1
All selectively radiating bodies have emissivities ranging from 0 to <1
Values can fluctuate depending upon the wavelengths of energy
being considered
o Graybody- outputs a constant emissivity that is less than one
at all wavelengths
Because its emissivity is constant, it is classed as a graybody
The Emissivity of an object may be influenced by a number factors,
including:
Color
Surface roughness
Moisture content
Compaction
Field-of-view
Wavelength
Viewing angle
Metal objects tend to have lower emissivity and have very different
radiometric and kinetic temperatures
Radiation Properties of a Surface
From the principle of the conservation of energy
Reflectance
Absorptance
Transmittance
Reflectance + Absorptance + transmittance = 1
Kirchhoff’s Radiation Law
In the infrared portion of the spectrum the spectral emissivity of an
object generally equals its spectral absorptance, i.e. ε(λ) ~ α(λ)
“good absorbers are good emitters and good reflectors are poor
emitters”
Most real-world materials are usually opaque to thermal radiation
(i.e., no radiant flux exits from the other side of the object)
Therefore, we may assume transmittance τ(λ)=0. Substituting emissivity
for absorptance and removing transmittance from the equation yields:
1 = ε(λ) + ρ(λ)
Using this simple relationship
Because the terrain does not lose any incident energy to transmittance
All of the energy leaving the object must be accounted for by the
inverse relationship between reflectance ρ(λ) and emissivity
ε(λ).
If reflectivity increases then emissivity must decrease. If emissivity
increases then reflectivity must decrease
Relationship of Kinetic and Radiant Temperature
Using Kirchhoff’s Radiation Law and the Stefan-Boltzmann Law
Thermal infrared remote sensing systems generally record the apparent
radiant temperature, Trad, of the terrain rather than the true kinetic
temperature, Tkin
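A commonly used form of this relationship is Trad = ε^(1/4) x Tkin. A quick numeric check (Python; the emissivity value is chosen for illustration):

```python
def radiant_temperature(t_kin_kelvin: float, emissivity: float) -> float:
    """Trad = emissivity**(1/4) * Tkin (from the Stefan-Boltzmann law).
    Since emissivity < 1, Trad is always slightly less than Tkin."""
    return (emissivity ** 0.25) * t_kin_kelvin

# Illustrative: a surface at 300 K with an emissivity of 0.95.
print(f"{radiant_temperature(300.0, 0.95):.1f} K")  # ~296.2 K
```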
Influence of Emissivity
TIR Radiance At Sensor
Signal Received Impacted by
Energy Emitted by the surface
o Amount of energy emitted functions as emissivity and kinetic
temperature
Energy Reflected off the surface
Energy Emitted by the atmosphere (path)
Atmospheric Transmittance
Sensor field-of-view
Atmospheric Effects
Gases and Aerosols
Can emit or absorb energy
Thermal sensors can be biased by as much as 2°C when acquired at
altitudes as low as 300m
Cloud Effects
Generally blocks thermal radiation
o Large, continuous clouds
o Thin clouds (cirrus)
o Patch clouds (cumulus elements)
Thermal Infrared Atmospheric Windows
Reflective infrared region from 0.7-3.0 μm and the thermal infrared
region from 3-14 μm
Regions that pass energy are called atmospheric windows. Regions that
absorb most of the infrared energy are called absorption bands
Basics of TIR Image Interpretation
In general, warmer objects appear brighter
Except meteorological Satellites
Remember-Skin Temperature
Contact with the object
Can sense at night or during the day
Does not require solar energy
Generally, night preferred
Daytime shading
o Topography
o Vegetation
Sensors
Examples of current satellite sensors aloft
NOAA AVHRR- June 1979 to present
o LAC- 1.1 x 1.1 km (Local Area Coverage)
o GAC- 4.4 km (Global Area Coverage)
o Thermal Bands- 3.55-3.93 μm, 10.3-11.3 μm, 11.5-12.5 μm
NOAA Geostationary Operational Environmental satellite (GOES)
o 8 km every 30 minutes
o monitors weather
Moderate Resolution Imaging Spectroradiometer (MODIS)
o 1km spatial resolution
o 17 bands in the mid to thermal infrared
o Daily coverage
Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER)
o 30-90 m resolution, Bands 6 MIR, 5 TIR
o surface temperature product
Interpreting TIR Imagery
Thermal Properties
Timing of image acquisition is important
o Things warm up and cool down throughout the day;
temperature changes constantly
Some materials respond to changes in temperature more rapidly
than others
o Water vs. rocks
Water doesn't heat up or cool down as quickly
Night time and daytime collection allows for determining different
surface materials
Applications of TIR Image Data
Marine Science/Climatology
Sea surface temperature mapping
Sea surface circulation
Fisheries
Fire Mapping
Smoke plumes consist of ash particles and other combustion
products so fine that they are penetrated by the relatively long TIR
wavelengths
Fractional Vegetation Cover
Cold background due to permafrost with a warmer vegetation
canopy
Plant is warmer because it is trying to warm itself
Radar
Radar
Radio Detection and Ranging
o Active microwave
Operating in radio waves
Long wavelengths and not much energy, so they have
to create their own energy (active energy)
Active Systems
Create their own electromagnetic energy
o Long wavelength (3-25 cm)
o Transmit from the sensor to the terrain
o Interacts with the terrain producing a backscatter of energy
Interacts with cloud and rain
Some get affected by cloud and rain, which is
used for weather satellites
Migration of birds
o Energy received back at the sensor is recorded by the remote
sensing receiver
All weather capable
RADAR-Wavelengths
Pulse of electromagnetic radiation sent out by the transmitter through the
antenna is of a specific wavelength and duration (i.e., it has a pulse
length measured in microseconds, μsec).
Much longer than visible, near-infrared, mid-infrared, or thermal infrared
energy usually measured in centimeters rather than micrometers
Radar wavelength names (e.g., K, Ka, Ku, X, C, S, L, and P)
Wavelength Frequency Relationship
Longer wavelengths can penetrate materials (e.g., clouds, vegetation
canopy) that scatter or block shorter wavelengths
RADAR Systems
Doppler Radar
Weather
Doppler frequency shifts are a function of the relative velocities of a
sensing system and a reflector
Plan-Position Indicator Radar (PPI)
Air traffic control
o WWII airplanes
Side Looking Airborne Radar (SLAR)
1950’s
Fly alongside a country and “see in”
Synthetic Aperture Radar (SAR)
Aperture means antennae
Most common today
By synthetically creating a long antennae it improves the spatial
resolution
Uses
Declassified in the 1970’s
o Still learning about it
Mapping cloud covered areas- Panama, Amazon
Ocean-wind, ice, wave Land-minerals, floods, snowmelt
o Water has big impact on radars because it changes wave
properties
o Ice changes density
Side Looking Airborne radar (SLAR)
Use SLAR as an introduction
Very similar to Satellite and aircraft SAR systems
Components of a SLAR system
o Pulse Generator
o Transmitter
o Duplexer
Sends and receives the energy
o Antenna
o Receiver
o Hard Drive for data storage
Radar Nomenclature
Nadir
Azimuth flight direction
Range or look direction
o Where it is sending its beam
Range (near and far)
o Point closest and farthest
Depression angle
Incidence angle
o Angle at which it hits something
o Depends on terrain and changes the incidence angle
Altitude above-ground-level
Radar Terms
Radar- a lot of variables and changes throughout the scene
Azimuth direction
Line of flight of the aircraft
Range direction
Direction of radar illumination that is at right angles to the direction
of the aircraft/spacecraft
o Significant impact on feature interpretation
Depression Angle
The angle between the horizontal plane and extending out from the
aircraft fuselage and the electromagnetic pulse of energy from the
antenna to a specific point on the ground
o Near Range depression angle
Closest point to the aircraft
o Far Range depression angle
Furthest distance that the beam is sent away from the
aircraft
Incident Angle
The angle between the radar pulse of electromagnetic energy and a
line perpendicular to the Earth’s surface where it makes contact
o Flat terrain: incident angle = complement of the depression
angle (the two angles sum to 90 degrees; e.g., a 60-degree
depression angle gives a 30-degree incident angle)
Radar Logic
Sends out a beam of energy
Short burst of microseconds (10^-6 seconds)
Interacts with object
Receives energy back
By electronically measuring the return time of signal echoes, the
range, or distance, between the transmitter and the objects may be
determined
o SR = ct/2
SR = slant range (direct distance between transmitter and
object)
c = speed of light (3 × 10^8 m/sec)
t = time between pulse transmission and echo return
Factor of 2 accounts for the two-way (send and receive) travel
Time to return is used to construct distance in the image
Strength of signal received back is related to microwave reflectivity
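A minimal Python sketch of the slant-range calculation above; the
60-microsecond echo delay is a made-up example value.

C = 3.0e8  # speed of light, m/s

def slant_range_m(echo_delay_s):
    # SR = c * t / 2; the 2 accounts for the round trip out and back
    return C * echo_delay_s / 2.0

# Made-up example: an echo returning 60 microseconds after transmission
print(slant_range_m(60e-6))  # 9000.0 m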
Spatial Resolution
Ground Resolution cell size
Controlled by two different resolutions
o Range resolution (across-track direction)
The finer the range resolution, the better
o Azimuth resolution (along-track direction)
Range resolution is proportional to the length of the microwave
pulse
o The shorter the pulse length, the finer the range resolution
o Pulse length is a function of the speed of light (c) multiplied
by the duration of the transmission (τ)
o Range resolution also depends on the depression angle (which
varies with how far from the aircraft the object is)
Range Resolution
Range resolution (Rr) at any point between the near- and far-range of the
illuminated strip can be computed if the depression angle (γ) of the
sensor at that location and the pulse length (τ) are known:
Rr = (τ × c) / (2 cos γ)
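A minimal Python sketch of this formula; the pulse length and depression
angle are made-up example values.

import math

C = 3.0e8  # speed of light, m/s

def range_resolution_m(pulse_length_s, depression_angle_deg):
    # Rr = (tau * c) / (2 * cos(gamma)), from the formula above
    gamma = math.radians(depression_angle_deg)
    return (pulse_length_s * C) / (2.0 * math.cos(gamma))

# Made-up example: 0.1-microsecond pulse at a 45-degree depression angle
print(round(range_resolution_m(0.1e-6, 45.0), 1))  # 21.2 m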
Azimuth Resolution
Azimuth resolution (Ra) is determined by computing the width of the
terrain strip that is illuminated by the radar beam
Real aperture active microwave radars produce a lobe-shaped beam
which is narrower in the near-range and spreads out in the far-
range
The beam width is inversely proportional to antenna length (L)
The longer the radar antenna, the narrower the beam width and the
finer the azimuth resolution
The azimuth resolution (Ra) can be calculated from the wavelength,
antenna length, and slant-range distance:
Ra = (S × λ) / L
o Where S is the slant-range distance (i.e., distance from the
aircraft)
o L is the antenna length
o λ is the wavelength
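A minimal Python sketch of this formula; the slant range, wavelength, and
antenna length are made-up example values.

def azimuth_resolution_m(slant_range_m, wavelength_m, antenna_length_m):
    # Ra = (S * lambda) / L, from the formula above
    return slant_range_m * wavelength_m / antenna_length_m

# Made-up example: X-band (3 cm) system, 5 m antenna, 8 km slant range
print(azimuth_resolution_m(8000.0, 0.03, 5.0))  # 48.0 m
# Doubling the antenna length halves Ra, which is why SAR synthesizes
# a long antenna to improve azimuth resolution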
Influences on the RADAR return
Terrain
Polarization Effects
Surface Roughness Characteristics
Diffuse Reflector
Specular Reflector
Corner Reflector
Electrical Characteristics
Water
Radar System characteristics
Features
Vegetation, soils, water, ice, urban
Terrain Surface Influences on Radar Return
Geometric Characteristics
Radar relief displacement is caused by changes in elevation
The higher an object is, the closer it is to the radar antenna
o Energy striking this object is received back at the sensor
sooner
o Leads to distortions in the imagery
Foreshortening
Layover
Shadowing
Foreshortening
All terrain that has a slope inclined toward the radar will appear
compressed or foreshortened relative to slopes inclined away from the
sensor
Slopes look shorter than they actually are
Affected by:
Object height: the greater the height of the object, the greater the
foreshortening
Depression angle: the greater the depression angle, the greater the
foreshortening
Location of objects in the across-track range: features in the near-
range portion of the swath are generally foreshortened more than
identical features in the far-range
In the near-range, features appear to have steeper slopes than
they actually do
In the far-range, slopes appear shallower than they actually are
Foreshortening both shortens and brightens the slope
Layover
Image layover is an extreme case of image foreshortening. It occurs
when the incident angle is smaller than the foreslope
The top of the feature is recorded before its base, so the slope
cannot be interpreted
This distortion cannot be corrected even when the surface topography is
known
Shadowing
Radar Shadow
A backslope is in radar shadow when its slope angle is steeper than the
depression angle
Grazing illumination: when the backslope equals the depression angle,
the backslope is just barely illuminated by the incident energy
The backslope is fully illuminated when its slope angle is less than the
depression angle
Terrain features (e.g., mountains) with identical heights and fore- and
backslopes may be recorded with entirely different shadows, depending
on where they are in the across-track direction
A feature that casts an extensive shadow in the far-range might have
its backslope completely illuminated in the near-range
Radar shadows occur only in the across-track dimension. Therefore, the
orientation of shadows in a radar image provides information about the
look direction and the location of the near- and far-range
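A minimal Python sketch encoding the layover and shadow rules from the last
two subsections; the angle values are made-up examples.

def foreslope_effect(incident_angle_deg, foreslope_deg):
    # Layover when the incident angle is smaller than the foreslope;
    # otherwise the slope is foreshortened (compressed and brightened)
    if incident_angle_deg < foreslope_deg:
        return "layover"
    return "foreshortening"

def backslope_illumination(depression_angle_deg, backslope_deg):
    # Shadow when the backslope is steeper than the depression angle;
    # grazing when equal; fully illuminated when shallower
    if backslope_deg > depression_angle_deg:
        return "radar shadow"
    if backslope_deg == depression_angle_deg:
        return "grazing illumination"
    return "fully illuminated"

print(foreslope_effect(30.0, 45.0))        # layover
print(backslope_illumination(40.0, 55.0))  # radar shadow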
Polarized Energy
Can send and receive in the same or different polarizations (e.g.,
like-polarized HH; cross-polarized HV)
Objects on the ground modify the polarized energy they reflect
Terrain usually returns energy in the same polarization
In vegetation, multiple reflections (“volume scattering”) depolarize the
incident energy, so it comes back with varied polarization
Polarization
Cinder Cone
Basalt flow
Imaged at the same time with two different polarizations
Differences are due to the direct reflection of blocks that are large
relative to the wavelength
Different objects show up differently in each polarization
Surface Roughness
Terrain property that most strongly influences the strength of the radar
backscatter
Surface texture characteristics
Microscale roughness is usually measured in centimeters (i.e., the
height of stones, size of leaves, or length of branches in a tree)
o Rough
o Intermediate
o Smooth
Smooth surface
h < 0.17 cm
o Sends back very little backscatter toward the antenna: dark in
the image
Intermediate surface
h = 0.17 to 0.96 cm
o Grey in the image
Rough surface
h > 0.96 cm
o Brighter in the image
Impacts vary with wavelength and depression angle
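The 0.17 cm and 0.96 cm breakpoints above match the modified Rayleigh
roughness criterion for an X-band (3 cm) system at a 45-degree depression
angle; a Python sketch assuming that criterion applies:

import math

def classify_roughness(h_cm, wavelength_cm, depression_angle_deg):
    # Modified Rayleigh criterion (assumed here; the notes give only
    # the breakpoints):
    #   smooth if h < wavelength / (25 * sin(gamma))
    #   rough  if h > wavelength / (4.4 * sin(gamma))
    s = math.sin(math.radians(depression_angle_deg))
    if h_cm < wavelength_cm / (25.0 * s):
        return "smooth (dark)"
    if h_cm > wavelength_cm / (4.4 * s):
        return "rough (bright)"
    return "intermediate (grey)"

# X-band (3 cm) at 45 degrees reproduces the 0.17/0.96 cm breakpoints above
print(classify_roughness(0.5, 3.0, 45.0))  # intermediate (grey)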
Reflectors
Diffuse Reflector
Rough surfaces
o Scatter incident energy in all directions and return a
significant portion of the energy to the radar antenna
Specular Reflector
Smooth surfaces
o Reflect most of the energy away from the sensor, resulting in
a very low return signal
Corner Reflector
Very bright response
o Returns the energy incident upon it directly back to the sensor,
appearing much brighter than the surrounding area
o Buildings, bridges, metal objects
o Used for geometric rectification of radar imagery
Bright and very obvious in the image
Electrical Characteristics
Different terrain types conduct the microwave energy from radar
sensors to different degrees
Complex dielectric constant
Ability of a material to conduct electrical energy
Dry surfaces (soil, rock)
o Dielectric constant of 3 to 8 in the microwave portion of the
spectrum
Water
o Dielectric constant of approximately 80
The most significant parameter affecting a material’s
dielectric constant is its moisture content
Moist soils reflect more radar energy than dry soils
Bare ground: return indicates soil moisture
Vegetation moisture also influences the return
Ocean: high dielectric constant, so most energy is
reflected
Penetration
Moist surfaces: only a few centimeters; most energy is reflected
Dry surfaces: penetration depth roughly equal to the wavelength of the
radar system
Vegetation Response
Plant canopies can be thought of as a seasonally dynamic, three-
dimensional, water-bearing structure consisting of foliage components
(leaves) and woody components (stems, trunk, stalk, branches)
Active microwave can penetrate the canopy to varying depths,
depending on:
Frequency
Polarization
Incident angle
Received signal varies
How far the signal penetrates the canopy (crown, trunk)
Depolarized?
Interacts with the soil surface?
o Surface and canopy
Types of Active Microwave Surface and Volume Scattering that Take Place
in a Hypothetical Pine Forest Stand
Surface Scattering
o Energy interacts with the leaves and stems
Volume Scattering:
o Scattering from leaves, trunks, branches, etc.
o Depolarizes signal
Ground surface scattering
o Interactions with the ground surface
o Moisture content
Canopy Penetration
The longer the wavelength, the greater the penetration into a canopy
Short wavelengths
Surface scattering
o X band
Longer wavelengths
Surface and volume scattering
o C-Band
Longest Wavelengths
Surface, volume, and ground
o L-band
o P-band, greatest penetration
Urban Area Response
Urban areas are typically light-toned in active microwave imagery because
of their many corner reflectors
Cardinal effect
o Urban features are often laid out along the cardinal
directions of a compass, causing significantly larger
returns when they are illuminated at an angle
orthogonal to their orientation
Water and Ice Response to Active Microwave Remote Sensing
Smooth open water areas act as specular reflectors
Yield no return (i.e., the radar signal bounces away from the sensor)
Rough water surfaces have varying responses depending on wave
height, surface roughness, and sensor wavelength
Oil smooths the surface and increases specular reflection (useful for
oil spill detection)
Digital Elevation Models: Interferometric Radar
Based on analysis of the phase of the radar signals as received by two
antennas located at different positions in space
Radar signals returning from a point on the earth’s surface will
travel two different slant ranges to the two antennas
The difference in path lengths means each antenna captures a
different portion of the signal’s phase
If the geometry of the interferometric baseline (i.e., the separation
between antennas A1 and A2) is known with a high degree of accuracy,
this phase difference can be used to compute an interferogram
The output is an interferogram
Displays the phase-difference values for each pixel as acquired by
the two antennas
This series of stripes, or fringes, represents differences in surface
height and sensor position
Once the sensor-position component is removed, each fringe corresponds
to a particular elevation range
From this interferogram, a DEM is developed
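A minimal Python sketch of the phase-difference computation the
interferogram is built from, assuming each antenna records a phase of
(2π/wavelength) × range along its own slant range. The geometry and values
are made up, and the conversion from fringes to elevation (which requires
the baseline geometry) is omitted.

import math

def phase_difference_rad(r1_m, r2_m, wavelength_m):
    # Phase difference from the two slant ranges, wrapped to [0, 2*pi)
    phi = (2.0 * math.pi / wavelength_m) * (r2_m - r1_m)
    return phi % (2.0 * math.pi)

# Made-up geometry: a 1.4 cm path-length difference at C-band (5.6 cm)
print(round(phase_difference_rad(1000.000, 1000.014, 0.056), 3))  # ~1.571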
Shuttle Radar Topography Mission (SRTM)
Joint project: NASA and NIMA, the National Imagery and Mapping Agency
(now NGA)
Single shuttle mission in February 2000, lasting 11 days
Covered 99% of the land area between 60° N and 56° S
About 80% of Earth’s total land area, home to 95% of the world’s
population
Used two antennas
60 m apart
Primary antenna in the shuttle transmitted and received; the
secondary antenna only received
Collected 12 terabytes of data: roughly 15,000 CD-ROMs
Spatial Resolution
30 m in the US (publicly released)
90 m elsewhere
Horizontal and Vertical Accuracy
Horizontal: 20 m; Vertical: 16 m
Available for free from CGIAR
Final Exam 09/02/2015
Given a problem, explain how you would solve it: what data to collect
(e.g., land cover)
How you would solve the problem using remote sensing
What data, how you would process them, analyze them, and what
products you would have at the end
Primary topics:
Data acquisition: which data to choose
Image processing steps: descriptions and how you would do them
Problem solving using remote sensing
Change detection: resolutions of the sensors, years they were
available, type of change detection you could do, which bands you can
use, what you need to measure, classifying the image
Can use image algebra
What classes you need, what classifications, etc.
Ex: In 1985 a bridge was built over a river in Bangladesh. What is the
economic impact of building that bridge?
Give background
Use Landsat data
Make sure it is geometrically rectified
Classify: decide what classes
Change detection over time (see the sketch below)
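A minimal Python sketch of image-algebra change detection for a problem
like this, using tiny made-up arrays in place of real co-registered Landsat
bands; the threshold would be tuned empirically in practice.

import numpy as np

# Two tiny made-up arrays standing in for the same geometrically
# rectified, co-registered Landsat band on two dates
before = np.array([[0.30, 0.32],
                   [0.55, 0.60]])  # e.g., pre-bridge reflectance
after = np.array([[0.31, 0.10],
                  [0.54, 0.20]])   # post-bridge reflectance

diff = after - before              # image algebra: simple differencing
threshold = 0.15                   # change threshold (made-up value)
change_mask = np.abs(diff) > threshold
print(change_mask)                 # True where land cover likely changed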