Chapter Three: Image Enhancement
Introduction
Natural images can be degraded by:
◦ Lighting conditions,
◦ Sensor resolution and quality,
◦ Limitations or noise of the optical system.
Cont…
The principal objective of image enhancement is to process a given
image so that the result is more suitable than the original image for a
specific application.
Enhancement:
◦ Does not increase the inherent information content of the data,
◦ But increases the dynamic range of the chosen features so that they
can be detected more easily.
Image enhancement sharpens image features such as edges,
boundaries, and contrast.
Enhancement-Challenge
Image Enhancement Methods
Image enhancement methods can be:
◦ Spatial Domain Methods: techniques based on direct manipulation of
pixels in an image.
Spatial Domain Methods
Spatial domain refers to the image plane itself, and image processing
methods are based on direct manipulation of pixels in an image.
Spatial Domain Methods
Two principal categories of spatial processing are:
◦ Intensity transformations, and
◦ Spatial filtering.
Basics of Intensity Transformations
Basics of Intensity Transformations
According to the operations applied to the image pixels, spatial processing
can be further divided into two categories:
◦ Point operations
◦ Mask (spatial) operations (including linear and non-linear operations).
Point Operations
Point operations modify the pixel values without changing the size,
geometry, or local structure of the image.
Each new pixel value b = I'(u, v) depends exclusively on the previous value
a = I(u, v) at the same position.
The original pixel values a are mapped to the new values b by some given
function f:
b = f(a),  i.e.  I'(u, v) ← f(I(u, v))
Cont…
The new pixel intensity depends on:
◦ The pixel's previous intensity I(u, v)
◦ The mapping function f( )
It is independent of:
◦ The pixel's location (u, v)
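A minimal sketch of a homogeneous point operation in Python/NumPy (the function name and the 8-bit image assumption are illustrative, not from the slides): every pixel is passed through the same mapping function f, independent of its position.

```python
import numpy as np

def point_operation(image, f):
    """Apply a homogeneous point operation b = f(a) to every pixel."""
    # Each output pixel depends only on the input pixel at the same
    # position (u, v); the mapping f is applied to all pixels alike.
    out = f(image.astype(np.float64))
    # Clip to the valid 8-bit range and restore the integer type.
    return np.clip(out, 0, 255).astype(np.uint8)

# Example: raise the brightness of every pixel by 50 (values above 255 clip).
# brighter = point_operation(img, lambda a: a + 50)
```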
Cont…
The new grey level (color) value in a spatial location (m,n) in the
resulting image depends only on:
◦ The grey level (color) in the same spatial location (m,n) in the original
image.
Cont…
Such an operation is called "global" or "homogeneous".
Examples of homogeneous point operations (simple intensity
transformations) include, among others:
◦ Thresholding
◦ Grey-level transformations
Log transformation
Power-law transformation
Identity function
Piecewise linear transformation
◦ etc.
Point Operation: Examples
Point Operations
Thresholding
◦ Is a limiting case of contrast stretching; it produces a two-level (binary)
image.
◦ Converts a grey-level image into a binary image (binarization); a short
code sketch follows.
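A minimal thresholding sketch, assuming an 8-bit grayscale NumPy array and an illustrative threshold T = 128 (not a value fixed by the slides):

```python
import numpy as np

def threshold(image, T=128):
    """Binarize: pixels >= T become 255 (white), all others become 0 (black)."""
    return np.where(image >= T, 255, 0).astype(np.uint8)
```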
Some Basic Intensity Transformation Functions
Three basic types of intensity transformation functions are used frequently
in image processing; the three most common grey-level transformations are:
◦ Linear functions:
Negative transformation
Identity transformation
◦ Logarithmic functions:
Log transformation
Inverse-log transformation
◦ Power-law functions:
nth power transformation
nth root transformation
Cont…
Reading assignment: the inverse-log and power-law (nth power and nth root)
transformation functions listed above.
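Hedged sketches of the three transformation families above (negative, log, and power-law), assuming an 8-bit image with L = 256 grey levels; the scaling constants are common illustrative choices, not prescribed by the slides.

```python
import numpy as np

def negative(image, L=256):
    """Negative transformation: s = (L - 1) - r."""
    return ((L - 1) - image).astype(np.uint8)

def log_transform(image, L=256):
    """Log transformation: s = c * log(1 + r), with c chosen so L-1 maps to L-1."""
    c = (L - 1) / np.log(L)
    return (c * np.log1p(image.astype(np.float64))).astype(np.uint8)

def power_law(image, gamma, L=256):
    """Power-law (gamma) transformation: s = (L - 1) * (r / (L - 1)) ** gamma."""
    r = image.astype(np.float64) / (L - 1)
    return ((L - 1) * r ** gamma).astype(np.uint8)
```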
Contrast Stretching
Contrast Stretching
Applying the transformation to every pixel of f to generate the
corresponding pixels of g produces an image of higher contrast than the
original by:
◦ Darkening the levels below m in the original image, and
◦ Brightening the levels above m in the original image.
Figure: Contrast stretching. (a) Piecewise linear transformation function.
(b) A low-contrast electron microscope image of pollen, magnified 700
times. (c) Result of contrast stretching. (d) Result of thresholding.
Enhancement Techniques: Contrast Stretching
• Examples of image enhancement operations:
- Noise removal
- Geometric distortion correction
- Edge enhancement
- Contrast enhancement
- Image zooming
- Image subtraction
Enhancement Techniques: Contrast Stretching
Improves the contrast in an image by ‘stretching’ the range of intensity
values it contains to span a desired range of values.
One simple approach:
◦ Multiplying each input pixel intensity value by a constant scalar (a
minimal sketch of a full min-max stretch follows).
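A minimal sketch of linear contrast stretching for an 8-bit grayscale NumPy array; rather than a single constant scalar, it maps the image's own [min, max] range onto a desired output range (the function name and defaults are assumptions):

```python
import numpy as np

def stretch_contrast(image, new_min=0, new_max=255):
    """Linearly map [image.min(), image.max()] onto [new_min, new_max]."""
    img = image.astype(np.float64)
    old_min, old_max = img.min(), img.max()
    if old_max == old_min:                  # flat image: nothing to stretch
        return np.full_like(image, new_min)
    scale = (new_max - new_min) / (old_max - old_min)
    return (new_min + (img - old_min) * scale).astype(np.uint8)
```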
Histogram Processing
Histograms are graphical representations of the frequency distribution of
pixel intensities in an image: they plot how many times (how frequently)
each intensity value occurs in the image.
• In a dark image, the most populated histogram bins are concentrated on
the lower (dark) end of the intensity scale.
• In a light image, the most populated bins are biased toward the higher
end of the scale.
Histogram Processing
Each single histogram entry is defined as:
h(i) = the number of pixels in I with the intensity value i, for all 0 ≤ i < K.
More formally stated:
h(i) = card{ (u, v) | I(u, v) = i }
Therefore, h(0) is the number of pixels with the value 0, h(1) the number
of pixels with the value 1, and so forth.
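A minimal sketch of computing the histogram h(i) for an 8-bit image (K = 256 intensity levels), assuming a NumPy array input:

```python
import numpy as np

def histogram(image, K=256):
    """Return h, where h[i] counts the pixels in image with intensity value i."""
    h = np.zeros(K, dtype=np.int64)
    for value in image.ravel():   # visit every pixel and count it into its bin
        h[value] += 1
    return h

# Equivalent vectorized form: np.bincount(image.ravel(), minlength=K)
```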
Histogram Processing
Histogram Processing
Different images can have the same histogram.
The images below have the same histogram.
Histogram Processing
Interpreting Histograms
◦ A histogram can reveal problems that originate during image acquisition,
in particular problems involving contrast and dynamic range.
Histogram Processing
Interpreting Histograms
◦ In image acquisition, high-value peaks in the histogram are indicative of
an improperly exposed image.
Histogram Processing
Interpreting Histograms
◦ Contrast: the range of intensity values effectively used within a given
image, i.e., the difference between the image's maximum and minimum
pixel values.
◦ The contrast of a grayscale image indicates how easily objects in the
image can be distinguished:
High-contrast image: many distinct intensity values.
Low-contrast image: uses only a few intensity values.
Good contrast: widely spread intensity values, i.e., a large difference
between the minimum and maximum intensity values.
Histogram Processing
Interpreting Histograms
◦ Dynamic range: the number of distinct pixel values in an image; it refers
to the range of intensity values actually present in the image.
◦ An image with a high dynamic range suffers less quality degradation
during image processing and compression.
◦ It is not practical to increase the dynamic range after image acquisition:
Interpolation or other processing techniques can stretch or manipulate
the intensity values within the existing dynamic range,
but these methods do not actually increase the amount of information or
detail captured by the sensor;
instead, they redistribute or interpolate existing data to enhance the
appearance of contrast or detail.
Histogram Processing
High dynamic range is always beneficial for subsequent image processing or
archiving.
Histogram Processing
Image defects
◦ Histograms can be used to detect a wide range of image defects that
originate either during image acquisition or as the result of later image
processing.
Saturation
◦ Ideally the contrast range of a sensor, such as that used in a camera,
should be greater than the range of the intensity of the light that it
receives from a scene.
Histogram Processing
Color image histograms
Histogram Processing
In General:
Histogram processing is a technique in image processing that involves
manipulating the histogram of an image.
The histogram of an image represents the distribution of pixel intensities,
with the x-axis representing the intensity values and the y-axis
representing the frequency of occurrence of each intensity value in the
image.
Histogram processing techniques are used to enhance the contrast,
brightness, and overall appearance of images by redistributing pixel
intensities in the histogram.
Histogram Equalization
One of the most commonly used methods for contrast enhancement and for
improving the overall appearance of images.
Histogram equalization generates an image with an approximately uniform
(flat) histogram by finding a suitable grey-scale transformation function.
Histogram Equalization
Spreading out the frequencies in an image (equalizing the image) is a
simple way to improve dark or washed-out images.
It can be expressed as a transformation of the histogram: for an image
with L intensity levels and M × N pixels,
s_k = T(r_k) = (L − 1) · Σ_{j=0..k} p_r(r_j),   k = 0, 1, …, L − 1,
where p_r(r_j) = n_j / (M·N) is the normalized histogram (n_j = number of
pixels with intensity r_j).
Histogram Equalization
Histogram Equalization
Based on information that can be extracted directly from a given image,
without the need for any parameter specifications
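A minimal histogram-equalization sketch for an 8-bit image, using only the image's own cumulative histogram (no user parameters), following the standard s_k = (L − 1)·CDF(r_k) mapping rather than any specific code from the slides:

```python
import numpy as np

def equalize_histogram(image, L=256):
    """Map intensities through the scaled cumulative histogram (CDF)."""
    h = np.bincount(image.ravel(), minlength=L)          # histogram h(i)
    cdf = np.cumsum(h) / image.size                      # cumulative distribution
    mapping = np.round((L - 1) * cdf).astype(np.uint8)   # s_k = (L-1) * CDF(r_k)
    return mapping[image]                                # apply as a lookup table
```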
Histogram Equalization
Image enhancement: Spatial Filtering
Spatial filtering modifies an image by replacing the value of each pixel by
a function of the values of the pixel and its neighbors.
Filtering in Spatial Domain
Applications:
◦ Noise Reduction: Filtering out noise from images to improve their quality
and clarity.
◦ Sharpening: Enhancing the edges and details in images to make them
appear sharper and more defined.
◦ Smoothing: Blurring or smoothing images to reduce noise or unwanted
detail.
◦ Edge Detection: Highlighting edges and boundaries between objects in
images
Filtering types
◦ Linear filtering
◦ Non-linear filtering
Linear Spatial Filtering
Linear filters apply a linear combination of pixel values within a
neighborhood to compute the output value for each pixel.
Examples include smoothing filters (such as Gaussian blur) and
edge-enhancement filters (such as the Sobel operator).
Linear filters are typically used for tasks like noise reduction, blurring,
sharpening, and edge detection.
A linear spatial filter performs a sum-of-products operation between an
image f and a filter kernel w.
The kernel is an array:
◦ Whose size defines the neighborhood of operation,
◦ Whose coefficients determine the nature of the filter
Other terms used to refer to a spatial filter kernel are mask, template,
and window.
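A minimal sketch of the sum-of-products operation a linear spatial filter performs (correlation of an image f with a kernel w), assuming zero padding at the image borders; production code would normally call an optimized library routine instead of these explicit loops.

```python
import numpy as np

def linear_filter(f, w):
    """Filter image f with kernel w: each output pixel is a sum of products."""
    m, n = w.shape
    pad_y, pad_x = m // 2, n // 2
    padded = np.pad(f.astype(np.float64), ((pad_y, pad_y), (pad_x, pad_x)))
    out = np.zeros(f.shape, dtype=np.float64)
    for y in range(f.shape[0]):
        for x in range(f.shape[1]):
            region = padded[y:y + m, x:x + n]   # neighborhood under the kernel
            out[y, x] = np.sum(region * w)      # sum of products
    return out
```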
Smoothing (Lowpass) Spatial Filters
Smoothing (also called averaging) spatial filters are used to reduce sharp
transitions in intensity.
Smoothing is used to reduce irrelevant detail in an image
◦ Irrelevant refers to pixel regions that are small with respect to the size of
the filter kernel.
Smoothing filters are used in combination with other techniques for
image enhancement
• Fundamentally, an averaging filter is a low-pass filter.
Cont…
Averaging filter:
◦ Basic idea: replace each pixel by the average of the pixels in a square
window surrounding the pixel.
◦ General case: for an n × n averaging filter, every kernel coefficient is
1/n², so the output pixel is the mean of the n × n neighborhood.
◦ Trade-off between noise removal and detail preservation:
A larger window can remove noise more effectively, but also blurs
details and edges.
◦ Example: a 3 × 3 averaging filter has all nine coefficients equal to 1/9.
It extends the idea of a "moving average" to images (see the sketch below).
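A hedged sketch of a 3 × 3 averaging kernel, reusing the linear_filter sketch above (any equivalent convolution routine would do):

```python
import numpy as np

# 3x3 averaging kernel: all coefficients equal 1/9, so each output pixel is
# the mean of the pixel and its 8 neighbors.
avg3 = np.full((3, 3), 1.0 / 9.0)

# smoothed = linear_filter(img, avg3)   # a larger kernel removes more noise
#                                       # but blurs edges more strongly
```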
Example: 3x3 average
Cont…
Weighted Averaging Filter
◦ Instead of averaging all the pixel values in the window equally, give the
closer pixels a higher weighting and the far-away pixels a lower weighting.
Example
Cont…
Box Filter
The simplest separable lowpass filter kernel is the box kernel, whose
coefficients all have the same value.
The name "box kernel" comes from a constant kernel resembling a box.
An m × n box filter is an array of 1's, multiplied by a normalizing constant
whose value is 1 divided by the sum of the coefficients (i.e., 1/mn).
This normalization ensures that:
◦ The average value of an area of constant intensity equals that intensity
in the filtered image, and
◦ No bias is introduced during filtering.
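A minimal sketch of building an m × n box kernel and of exploiting its separability: filtering with a 1 × n row of 1/n followed by an m × 1 column of 1/m gives the same result as the full 2-D box kernel (the helper names reuse the earlier linear_filter sketch and are assumptions).

```python
import numpy as np

def box_kernel(m, n):
    """m x n box kernel: all ones, normalized so the coefficients sum to 1."""
    return np.ones((m, n)) / (m * n)

# Separable application (same result as linear_filter(img, box_kernel(m, n))):
# tmp = linear_filter(img, np.ones((1, n)) / n)   # average along rows
# out = linear_filter(tmp, np.ones((m, 1)) / m)   # then average along columns
```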
Cont…
Box filters are suitable for quick experimentation, and they often yield
smoothing results that are visually acceptable.
◦ However, box filters have limitations that make them poor choices in
many applications.
◦ One such limitation is that box filters favor blurring along
perpendicular directions.
Gaussian Filter for smoothing
What if we want the nearest neighboring pixels to have the most influence
on the output?
Figure: a small example image (a 10 × 10 grid of pixel values) shown next
to the 3 × 3 weighted kernel
1 2 1
2 4 2
1 2 1
(coefficients sum to 16).
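A minimal sketch of the 3 × 3 weighted (Gaussian-like) kernel shown in the figure, normalized by 16 so its coefficients sum to 1; the center pixel receives the largest weight, as asked for above.

```python
import numpy as np

# 3x3 weighted-average kernel: nearest neighbors influence the output most.
gauss3 = np.array([[1, 2, 1],
                   [2, 4, 2],
                   [1, 2, 1]], dtype=np.float64) / 16.0

# smoothed = linear_filter(img, gauss3)   # reuses the earlier filtering sketch
```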
Order-Statistic (Nonlinear) Filters
Order-statistic filters are nonlinear spatial filters whose response is
based on ordering (ranking) the pixels contained in the region
encompassed by the filter.
Smoothing is achieved by replacing the value of the center pixel with the
value determined by the ranking result.
The best-known filter in this category is the median filter, which replaces
the value of the center pixel by the median of the intensity values in the
neighborhood of that pixel.
Median filters provide excellent noise reduction for certain types of
random noise:
◦ impulse noise (sometimes called salt-and-pepper noise).
Cont…
Median Filter: replaces each pixel's value with the median value of its
neighboring pixels.
◦ Effective for removing salt-and-pepper noise while preserving edges
(a short sketch follows).
Max and Min Filters: replace each pixel's value with the maximum or
minimum value in its neighborhood.
◦ Useful for morphological operations like dilation and erosion.
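A minimal median-filter sketch, assuming an odd neighborhood size and replicated border pixels; the max and min filters follow the same pattern with np.max / np.min in place of np.median.

```python
import numpy as np

def median_filter(image, size=3):
    """Replace each pixel by the median of its size x size neighborhood."""
    pad = size // 2
    padded = np.pad(image, pad, mode='edge')    # replicate the border pixels
    out = np.empty_like(image)
    for y in range(image.shape[0]):
        for x in range(image.shape[1]):
            window = padded[y:y + size, x:x + size]
            out[y, x] = np.median(window)       # rank the pixels, keep the middle one
    return out
```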
Median Filter
Median filter
Figure: an image corrupted by salt-and-pepper noise and the
median-filtered result.
Sharpening (Highpass) Spatial Filters
A high-pass filter can be used to make an image appear sharper.
These filters emphasize fine details in the image.
High-pass filtering works in exactly the same way as low-pass filtering; it
just uses a different convolution kernel.
Image sharpening:
◦ Sharpening highlights transitions in intensity.
◦ It enhances line structures and other fine details in an image.
◦ Thus, the enhanced image contains the original image with the line
structures and edges emphasized.
◦ Sharpening is often referred to as highpass filtering, since high
frequencies (which are responsible for fine details) are passed, while low
frequencies are attenuated or rejected.
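A hedged sharpening sketch using a Laplacian kernel, one common highpass choice (the slides do not fix a particular kernel); the highpass response is added back to the original image to emphasize edges. It reuses the earlier linear_filter sketch.

```python
import numpy as np

# A common 3x3 Laplacian (highpass) kernel: it responds to intensity transitions.
laplacian = np.array([[ 0, -1,  0],
                      [-1,  4, -1],
                      [ 0, -1,  0]], dtype=np.float64)

def sharpen(image, strength=1.0):
    """Add the highpass (Laplacian) response back to the image to sharpen it."""
    highpass = linear_filter(image, laplacian)   # linear_filter: earlier sketch
    out = image.astype(np.float64) + strength * highpass
    return np.clip(out, 0, 255).astype(np.uint8)
```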
Sharpening (Highpass) Spatial Filters
Combining Spatial Enhancement Methods
Reading Assignment