
GIS and Remote Sensing:

Preliminary Concepts

Lecture 1
Brainstorming
 Why do GIS and RS matter?
 Explain the importance of GIS and RS in spatial analysis
and modeling.
 Describe the applications of GIS and RS.
1.1. Why Does GIS Matter?
• Human beings have faced many challenges in recent
decades due to climate change risks and other non-climatic
factors, such as:
 GHG emissions,
 Change in land use/land cover (deforestation),
 Soil degradation,
 Decline in agricultural productivity,
 Biodiversity loss,
 Ecosystem services depletion,
 Groundwater contamination, and many others.
 All of these concerns have geographic components that can
be arranged in a meaningful way. How?
Why…
 One of the fundamental challenges in planning and making
decisions about these concerns is a lack of understanding of
how they are arranged in space and how they change over time
(spatiotemporal variability).
◦ Spatiotemporal patterns of phenomena and their drivers have
different characteristics at different locations (spatial
dimension), and they change over time (temporal dimension).
 GIS thus provides a spatial framework and points the way to
planning and decision-making for such geographical questions.
1.2. Applications of GIS and RS
 The integration of GIS with remote sensing data has been
used for a wide range of applications, including:
◦ Landscape characterization (soil units, land use and
land cover classes, morphometric parameters);
◦ Vegetation monitoring (forest, grassland, shrubland
vegetation);
◦ Soil-landscape modeling (soil depth, carbon storage, …);
◦ Watershed hydrology (watershed delineation, stream
network, flow direction, flow accumulation, soil loss
rate);
Cont…
◦ Environmental risk assessment (soil salinity, invasive
species, point and non-point pollutants, groundwater
contamination, …);
◦ Vulnerability and disaster risk management (landslide
and flood vulnerability, drought, …);
◦ Agro-meteorological applications (e.g., crop growing
conditions, crop yield forecasting).
Cont…
 Land Use Planning and Management
 Crime Mapping and Analysis
 Solid Waste Management
 Urban Infrastructure and Utilities
 Urban Transportation
 Spatial Planning
1.3. Remote Sensing (RS)
 Define RS.
 How does it work?
 What are passive and active sensors?
 What are the essential elements of RS?
 Importance of RS.
1.3. Remote Sensing Method
 Traditional method (direct method)
◦ It is time consuming and often too expensive.
◦ It does not provide spatially explicit information.
 It is not effective for covering large areas or for acquiring
some kinds of data (e.g., vegetation cover).
 Remote sensing (indirect method)
◦ It is less expensive and provides information about
inaccessible areas and near-real-time data.
• Remote sensing systems: passive and active systems
◦ Passive remote sensing systems are based on electromagnetic
wave theory.
◦ Since objects (such as vegetation) have unique spectral
features (reflectance or emission regions), they can be
identified from remote sensing imagery according to their
unique spectral characteristics.
Types of Sensors…
RS…
1.4. Passive Remote Sensing
 Electromagnetic Radiation and Principles
◦ Electromagnetic Spectrum (EMS)
◦ Important Wavelength Regions
 Challenges of Radiation
◦ Scattering and absorption
◦ Atmospheric Transmittance windows
 Spectral Reflectance Signature
◦ Broad Bands
◦ Narrow Bands
EMS…
 The EMS is the total range of wavelengths (figure below).
 RS operates in several regions of the EMS.
 The optical part of the EMS refers to that part of the EM
spectrum in which the optical phenomena of reflection and
refraction can be used to focus the radiation.
 The optical range extends from X-rays (0.02 µm) through
the visible part of the EM spectrum up to and including the
far-infrared (1000 µm).
 The ultraviolet (UV) portion of the spectrum has the shortest
wavelengths that are of practical use for remote sensing.
EMS…
Cont…
• There are three windows in the thermal infrared region:
two narrow windows around 3 and 5 µm, and a third,
relatively broad window extending from approximately 8 to
14 µm.
 Because of the presence of atmospheric moisture, strong
absorption bands are found at longer wavelengths. There is
hardly any transmission of energy in the region from 22 µm
to 1 mm. The more or less transparent region beyond 1 mm
is the microwave region.
Atmospheric transmission expressed as a percentage.
Spectral Reflectance Curves:
Broad bands for Vegetation, soils and water
Spectral Reflectance Curves:
Narrow bands for similar species
1.5. Platforms
 What are platforms and sensors? Elucidate.
1.5. Platform and Imaging Sensor
 Remote Sensing (Satellite Platform and Imaging Sensor)
◦ Landsat: [Landsat MSS, TM, ETM+, and OLI]: 1-8 series
◦ SPOT [SPOT HRV, SPOT Pan, SPOT VGT] 1-6
 Multispectral and panchromatic modes
◦ Terra [MODIS, ASTER]
◦ NOAA: [NOAA-AVHRR]
◦ IKONOS :[ Multispectral and panchromatic modes]
◦ EO-1 Hyperion: [hyperspectral]
 Parameters of imaging sensors
◦ Spectral resolution
◦ Radiometric resolution
◦ Spatial resolution
◦ Temporal resolution
1.6. Digital Image Preprocessing
 Explain image preprocessing and postprocessing.
 Explain the image platform characteristics.
1.6. Digital Image Pre-processing
 Remotely sensed data that are captured and received by
different imaging sensors on various satellite platforms
contain some errors and deficiencies.
◦ Geometric and Radiometric errors
 The correction of deficiencies and removal of flaws prior to
the main data analysis and information extraction is termed
pre-processing.
 In general, these methods include:
◦ Image restoration (geometric and radiometric
correction methods)
Cont…
 Radiometric restoration:
◦ is concerned with the fidelity of the readings of electromagnetic
energy by the sensor.
◦ is concerned with issues such as atmospheric haze, sensor
calibration, topographic influences on illumination, system
noise, and so on.
 Geometric restorations:
◦ are concerned with the spatial fidelity of images.
◦ are less of a burden to the general data user because most geometric
restorations are already completed by the imagery distributor.
 All of these rectifications can be achieved using existing
IDRISI Image Processing modules.
 However, IDRISI also provides a series of imagery-specific
modules that streamline many of these operations into a
single-click solution.
Cont…
 Radiometric Restoration
a. Sensor Calibration
- concerned with ensuring uniformity of output across the
face of the image, and across time.
b. Radiance Calibration
- Pixel values in satellite imagery typically express the
amount of radiant energy received at the sensor as
uncalibrated relative values simply called Digital
Numbers (DN).
- Sometimes these DNs are referred to as brightness values.
- For many (perhaps most) applications in remote sensing
(such as classification of a single-date image using
supervised classification), it is not necessary to
convert these values.
Cont…
 However, conversion of DN to absolute radiance values or
relative surface reflectance values is a necessary
procedure for comparative analysis of several images taken
by different sensors (for example, Landsat-2 versus
Landsat-5), since each sensor has its own calibration
parameters used in recording the DN values.
c. Destriping
─ It restores striping or banding: systematic noise in an
image that results from variation in the response of the
individual detectors used for a particular band.
─ It involves calculating the mean (or median) and standard
deviation for the entire image and then for each detector
separately, and rescaling each detector's output to match
the image-wide statistics.
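The destriping procedure can be sketched as follows, assuming an interleaved-detector row layout (as in whisk-broom sensors such as Landsat MSS/TM, where detector d records rows d, d + n_detectors, d + 2·n_detectors, …); the function name and layout are illustrative, not from any particular package:

```python
import numpy as np

def destripe(image, n_detectors=16):
    """Destripe by matching each detector's statistics to the whole image."""
    out = image.astype(float).copy()
    global_mean, global_std = image.mean(), image.std()
    for d in range(n_detectors):
        rows = out[d::n_detectors, :]          # all rows seen by detector d
        m, s = rows.mean(), rows.std()
        # Linear gain/offset so this detector's mean and standard
        # deviation match the image-wide statistics
        out[d::n_detectors, :] = (rows - m) * (global_std / (s if s else 1)) + global_mean
    return out
```

After correction, every detector's rows share the same mean and spread, which removes the banding while preserving the within-row image structure.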
Pre-processing……..
 Radiometric Normalization
 Remote sensing data can be influenced by a number of
factors, such as atmospheric absorption and scattering.
 Radiometric correction is particularly important when
there is a need to detect genuine land cover change as
revealed by changes in spectral surface reflectance from
multi-date satellite images (change detection analysis).
 Two approaches: absolute and relative correction.
 The absolute approach requires knowledge of the sensor's
spectral profile and of the atmospheric properties at the
time of image acquisition for atmospheric correction and
sensor calibration.
Pre-processing……
i. Absolute Radiometric Correction
 Convert radiance to reflectance.
◦ While radiance is the variable recorded by Landsat sensors,
converting this quantity into reflectance enables better
comparison among different images.
◦ This removes differences caused by the position of the sun
and the differing amounts of energy output by the sun in
each band.
 The conversion can be derived using the following
expression:
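The expression itself does not survive in this copy of the slides; the standard top-of-atmosphere (TOA) reflectance conversion commonly used for Landsat-class sensors is:

```latex
\rho_\lambda = \frac{\pi \, L_\lambda \, d^2}{ESUN_\lambda \, \cos\theta_s}
```

where $L_\lambda$ is the at-sensor spectral radiance, $d$ the Earth-Sun distance in astronomical units, $ESUN_\lambda$ the band's mean solar exoatmospheric irradiance, and $\theta_s$ the solar zenith angle.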
Pre-processing……..
 One of the most significant radiometric processing steps is
the conversion of digital numbers to radiance and
reflectance.
 The conversion of DN values to radiance can be calculated
for each band.
 However, the absolute approach is often impractical, since
for most historical satellites these data are not available.
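A sketch of this band-wise linear rescaling from DN to radiance; the calibration constants below are purely illustrative (real Lmin/Lmax values come from the sensor's metadata):

```python
def dn_to_radiance(dn, lmin, lmax, qcalmin=0, qcalmax=255):
    """Linear DN-to-radiance rescaling: L = gain * (DN - Qcalmin) + Lmin."""
    gain = (lmax - lmin) / (qcalmax - qcalmin)
    return gain * (dn - qcalmin) + lmin

# Illustrative constants only -- not from any specific band or sensor
radiance = dn_to_radiance(128, lmin=-1.5, lmax=193.0, qcalmax=255)
```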
Pre-processing……..
ii. Relative Radiometric Normalization
◦ This approach is often preferred since it bypasses the
shortcomings of the absolute approach.
◦ Among the various relative radiometric normalization
methods, linear regression based on a radiometric control
set (RCS) has been commonly used in change detection
studies.
◦ The underlying assumption of this method is that images of
the same area taken at different times always contain some
landscape features whose reflectance is nearly constant
over time.
◦ These objects are then selected to serve as the RCS.
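A minimal sketch of RCS-based linear normalization, assuming NumPy arrays and a boolean mask marking the control pixels (the function name and interface are illustrative):

```python
import numpy as np

def relative_normalize(subject_band, reference_band, rcs_mask):
    """Normalize one date's band to a reference date using an RCS.

    rcs_mask marks pixels assumed radiometrically stable between the
    two dates (the radiometric control set). A linear least-squares
    fit reference = a * subject + b over those pixels is then applied
    to the whole subject band.
    """
    x = subject_band[rcs_mask].ravel()
    y = reference_band[rcs_mask].ravel()
    a, b = np.polyfit(x, y, 1)          # gain and offset
    return a * subject_band + b
```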
MSS and TM Band Combinations
 3,2,1 (natural color)
 4,3,2 (standard false color / color infrared)
 5,4,3
OLI Band Combination...
1.7. Image Classification
 Explain the types of image classification.
1.7. Image Classification
 Generally, the goal of image classification is to produce
discrete categories of cover/vegetation types rather than to
extract continuous variables.
 It is the process of developing interpreted maps from
remotely sensed images.
 Traditional image classification approaches
◦ Supervised and unsupervised (rule-based)
 Advanced image classification methods
◦ Decision tree (tree-based)
◦ Artificial neural network (ANN)
◦ Support vector machine (SVM)
◦ Random forest (RF)
Data Formats
 There are two major spatial data formats:
◦ Vector
◦ Raster
◦ Data conversion is also possible (vector to raster or
vice versa).
Image Classification….
• Supervised Classification
• In this approach, the analyst selects training areas (pixels
that represent patterns the analyst recognizes or can
identify with help from other sources, such as aerial
photos, ground truth data, or maps).
 The analyst defines training areas by identifying regions on
the image that can be clearly matched to areas of known
identity.
 Although this approach uses specific areas of known identity,
in which the analyst can detect serious errors in
classification by examining the training data, the analyst
imposes a classification structure upon the data.
Image Classification….
 Supervised Classification
 It has three basic steps:
◦ Training Stage
◦ Classification stage
◦ Output stage
• Training stage: Supervised classification requires that the user
select training areas for use as the basis for classification.
 Training data should be
 Both representative and complete
 All spectral classes constituting each information class must be
adequately represented in the training set statistics used to classify
an image
 e.g. water (turbid or clear)
 e.g. crop (date, type, soil moisture, …)
 It is common to acquire data from 100+ training areas to represent
the spectral variability
Cont…
 The steps for supervised classification may be summarized
as follows:
1. Identify training sites.
2. Digitize polygons around each training site, assigning a
unique identifier to each cover type.
3. Create spectral signatures for each of the cover types.
4. Classify the image.
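Step 3 above — building a mean spectral signature for each digitized training class — can be sketched like this (the array shapes and function name are assumptions for illustration):

```python
import numpy as np

def class_signatures(image, training_labels):
    """Mean spectral signature per training class.

    image: (rows, cols, n_bands) array of band values;
    training_labels: (rows, cols) int array with 0 = unlabelled
    and 1..k = the digitized training classes.
    """
    sigs = {}
    for c in np.unique(training_labels):
        if c == 0:
            continue                     # skip unlabelled pixels
        # Mean band values over all pixels of this training class
        sigs[int(c)] = image[training_labels == c].mean(axis=0)
    return sigs
```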
Image Classification….
 Classification Stages
 Information Classes vs. Spectral Classes
◦ Information classes are categorical, such as crop type, forest type,
tree species, different geologic units or rock types, etc.
◦ Spectral classes are groups of pixels that are uniform (or near-
similar) with respect to their brightness values in the different
spectral channels of the data.
 Classification Algorithm
◦ Parallelepiped,
◦ Minimum Distance,
◦ Mahalanobis Distance,
◦ Maximum Likelihood,
Image Classification….
 Supervised Classifiers
◦ Parallelepiped
 Uses a simple decision rule in which the decision boundaries
form an n-dimensional parallelepiped in the image data space.
 The dimensions of the parallelepiped are defined by a
standard deviation threshold from the mean of each selected
class.
◦ Maximum Likelihood
 Assumes that the statistics for each class in each band are
normally distributed.
 Each pixel is assigned to the class that has the highest probability
◦ Minimum Distance
 Uses the mean vectors of each ROI and calculates the Euclidean
distance from each unknown pixel to the mean vector for each
class
◦ Mahalanobis Distance
 A direction sensitive distance classifier that uses statistics for each
class
 Assumes all class covariances are equal and therefore is a faster
method
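As a concrete illustration, the minimum-distance rule above can be written in a few lines (a sketch, not any package's API):

```python
import numpy as np

def minimum_distance_classify(pixels, class_means):
    """Assign each pixel to the class with the nearest mean vector.

    pixels: (n_pixels, n_bands); class_means: (n_classes, n_bands).
    Returns an (n_pixels,) array of class indices.
    """
    # Broadcast to a (n_pixels, n_classes) matrix of Euclidean distances
    d = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
    return d.argmin(axis=1)
```

Mahalanobis distance would replace the Euclidean norm with a covariance-weighted one, and maximum likelihood would replace the distance with a per-class Gaussian probability.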
Image Classification….
Unsupervised Image Classification
 Unsupervised classification categorizes pixels without
instructions from the operator; i.e., it does not start with
a predetermined set of classes as in supervised
classification.
 Unsupervised classification techniques share a common
intent: to uncover the major land cover classes that exist
in the image without prior knowledge of what they might be.
 Generically, such procedures fall into the realm of cluster
analysis, since they search for clusters of pixels with
similar reflectance characteristics in a multi-band image.
 The analyst then determines the identity of each class by a
combination of experience and ground truth (i.e., visiting
the study area and observing the actual cover types).
Image Classification….
 Unsupervised Classifiers: K-Means or IsoData
◦ K-Means
 Uses a cluster analysis approach which requires the
analyst to select the number of clusters to be located in
the data, arbitrarily locates this number of cluster centers,
then iteratively repositions them until optimal spectral
separability is achieved
 Experiment with different numbers of classes, change
thresholds, standard deviations, and maximum distance
error values to determine their effect on the classification.
◦ Isodata
 Calculates class means evenly distributed in the data
space and then iteratively clusters the remaining pixels
using minimum distance techniques.
 Each iteration recalculates means and reclassifies pixels
with respect to the new means.
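The assign-and-reposition loop shared by K-means and Isodata can be sketched in a few lines (illustrative code, not ENVI's or IDRISI's implementation; a deterministic start stands in for the "arbitrarily located" initial centers):

```python
import numpy as np

def kmeans_cluster(pixels, k, n_iter=20):
    """Bare-bones K-means on an (n_pixels, n_bands) spectral array."""
    # Deterministic start: k evenly spaced pixels serve as initial centers
    idx = np.linspace(0, len(pixels) - 1, k).astype(int)
    centers = pixels[idx].astype(float)
    for _ in range(n_iter):
        # Assign each pixel to its nearest center (Euclidean distance)
        d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Reposition each center at the mean of its cluster
        for c in range(k):
            if np.any(labels == c):
                centers[c] = pixels[labels == c].mean(axis=0)
    return labels, centers
```

Isodata extends this loop by also splitting clusters with large standard deviations and merging clusters whose centers fall closer than a distance threshold.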
Image Classification…..
 Major drawbacks of traditional methods
◦ They assume pure pixels. In reality, pixels are composed
of heterogeneous materials.
◦ Since these methods rely on the pixel as the smallest
entity, the integrating measurement of a given surface
portion by a pixel regularly leads to unacceptable
averaging, because it ignores within-pixel heterogeneity.
◦ This is particularly true for highly heterogeneous
landscape patterns, where patch sizes are frequently
smaller than the footprint of a pixel (e.g., 900 m² for a
30 m Landsat pixel).
1.8. Accuracy Assessment
 Explain accuracy assessment.
1.8 Accuracy Assessment/Validation
 Once the classified image is produced, its quality is
assessed by comparing it to reference data (ground truth).
 This requires the selection of a sampling technique,
generation of an error matrix, and calculation of error
parameters, which include:
◦ Error of omission (Type II error) = samples that are
omitted in the interpretation result; NIIR / column
total × 100.
◦ Error of commission (Type I error) = number of incorrectly
classified samples; NICS / row total × 100.
◦ User accuracy = correctly classified pixels in each class
divided by the row total.
◦ Producer accuracy = correctly classified pixels in each
class of the truth data divided by the column total.
◦ Overall accuracy = the sum of the diagonal cells in the
error matrix divided by the total number of checked
samples.
Classification Accuracy
 How do we tell if classification is any good?
◦ Classification error matrix (aka confusion matrix or
contingency table)
◦ Need “truth” data – sample pixels of known classes
 How many pixels of KNOWN class X are incorrectly
classified as anything other than X (errors of omission)?
So-called Type 2 error, or false negative
 Divide correctly classified pixels in each class of truth
data by COLUMN totals (Producer’s Accuracy)
 How many pixels are incorrectly classified as class X
when they should be some other known class (errors of
commission)? So-called Type 1 error, or false positive
 Divide correctly classified pixels in each class by ROW
totals (User’s Accuracy)
Accuracy …
 Error of omission (EO) refers to those sample points that are
omitted in the interpretation results.
◦ Error of omission starts from the reference data and
relates to the columns of the error matrix.
 Error of commission (EC) starts from the interpretation
results and relates to the rows of the error matrix.
◦ The error of commission indicates incorrectly classified
samples.
 User accuracy (UA) is the probability that a pixel labeled as
a certain class on the map actually belongs to that class on
the ground.
 Producer accuracy (PA) is the probability that a reference
sample of a certain class is correctly labeled as that class
on the map.
◦ Producer accuracy is the corollary of omission error,
while user accuracy is the corollary of commission error.
 Overall accuracy (OA) is the number of correctly classified
pixels (i.e., the sum of the diagonal cells in the error
matrix) divided by the total number of checked samples. Thus,
OA gives one figure for the classified result as a whole.
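These measures fall directly out of the error matrix. A small sketch, assuming rows hold the classified (map) classes and columns the reference classes (the function name is illustrative):

```python
import numpy as np

def accuracy_report(matrix):
    """User's, producer's, and overall accuracy from an error matrix
    (rows = classified map classes, columns = reference classes)."""
    m = np.asarray(matrix, dtype=float)
    diag = np.diag(m)                  # correctly classified counts
    user = diag / m.sum(axis=1)        # correct / row totals
    producer = diag / m.sum(axis=0)    # correct / column totals
    overall = diag.sum() / m.sum()
    return user, producer, overall
```

With the figures from the example slide — 35 agreements for class A against a classified (row) total of 61 and a reference (column) total of 53 — user's accuracy for A is 35/61 ≈ 57% and producer's accuracy is 35/53 ≈ 66%.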
Error Matrix (confusion matrix or contingency matrix)
Note: a. Forest; b. Built-up; c. Agriculture; d. Open/Bare
Cont…
 All non-diagonal elements of the matrix represent errors of
omission or commission.
 From the table you can read, for example, that 53 cases of A
were found in the real world ('reference') while the
classification result yields 61 cases of A; in 35 cases they
agree.
1.9. Change Detection and Modeling
 Discuss change detection and modeling.
1.9. Change Detection Analysis
 Another important aspect of remote sensing associated with
image classification is the ability to perform change
detection analysis across different periods.
 Since most environmental phenomena, such as vegetation and
climate, constantly change over time, it is important that
these changes are accurately detected and accounted for.
 This is essential for understanding the dynamics of the
changes that have occurred over the years and for using this
information to predict future trends and scenarios for the
area.
 Humans have been modifying land to obtain food and other
essentials for thousands of years; however, current rates and
intensities are causing unprecedented changes in ecosystems
and environmental processes at local, regional, and global
scales.
Methods of Change Detection
 The different procedures proposed in the literature for LULC
change detection are all based on one of the following two
general approaches:
1. A two-point timescale (bi-temporal change detection):
a. post-classification comparison,
b. univariate image differencing/ratioing,
c. change vector analysis, etc.
2. A multi-point timescale (temporal trajectory analysis).
Summary
Major Discussion Points (Class presentation)
1. Discuss change detection and modeling. (Dereje)
2. Explain the importance of GIS and RS in spatial analysis
and modeling. (Dino)
3. Explain the importance of RS. (Dino)
4. Explain image preprocessing and postprocessing. (Dereje)
5. Explain the image platform characteristics. (Abdunasir)
6. Explain the types of image classification. (Abdunasir)
