Lecture 1
Introduction & Fundamentals
Presented By:
Diwakar Yagyasen
Sr. Lecturer
CS&E, BBDNITM, Lucknow
What is an image?
►a representation, likeness, or imitation of an
object or thing
► a vivid or graphic description
► something introduced to represent
something else
2
DIGITAL IMAGE
3
DIGITAL IMAGE
4
FROM ANALOG TO DIGITAL
Imaging systems → sample and quantize → digital computer → storage buffer (disk) → display → on-line output
5
Sampling
(The same image sampled on a 256×256 grid and on a 64×64 grid.)
6
Quantisation – 8 bits
0 255 255 0 0
0 0 255 0 0
0 0 255 0 0
0 0 255 0 0
8
Quantisation cont.
9
Coloured Image
10
Intensity (Gray-Level) Image
11
Binary Image
12
Image Processing
manipulation of multidimensional signals
image (photo): f(x, y)
video: f(x, y, t)
CT, MRI: f(x, y, z, t)
13
► What is Digital Image Processing?
Digital Image
— a two-dimensional function f(x, y), where x and y are spatial coordinates
— the amplitude of f is called the intensity or gray level at the point (x, y)
Pixel
— the elements of a digital image
14
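A minimal illustration of this idea (assuming Python with NumPy, which is not part of the original slides): a digital image is simply a 2-D array of intensity values, and a pixel is one element of that array.

import numpy as np

# A 3x5 gray-scale image: f(x, y) holds the intensity (gray level) at (x, y).
f = np.array([[  0,  64, 128, 192, 255],
              [ 10,  70, 130, 200, 250],
              [  5,  60, 120, 180, 240]], dtype=np.uint8)

print(f.shape)   # (3, 5): 3 rows by 5 columns
print(f[1, 2])   # intensity of the pixel at coordinates (1, 2) -> 130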
Origins of Digital Image Processing
16
Sources for Images
► Electromagnetic (EM) energy spectrum
► Acoustic
► Ultrasonic
► Electronic
► Synthetic images produced by computer
17
Electromagnetic (EM) energy spectrum
Major uses
Gamma-ray imaging: nuclear medicine and astronomical observations
X-rays: medical diagnostics, industry, and astronomy, etc.
Ultraviolet: lithography, industrial inspection, microscopy, lasers, biological imaging,
and astronomical observations
Visible and infrared bands: light microscopy, astronomy, remote sensing, industry,
and law enforcement
Microwave band: radar
Radio band: medicine (such as MRI) and astronomy
18
Examples: Gamma-Ray Imaging
19
Examples: X-Ray Imaging
20
Examples: Ultraviolet Imaging
21
Examples: Light Microscopy Imaging
22
Examples: Visual and Infrared Imaging
23
Examples: Visual and Infrared Imaging
24
Examples: Infrared Satellite Imaging
(Infrared satellite images of the USA, 1993 and 2003.)
25
Examples: Infrared Satellite Imaging
26
Examples: Automated Visual Inspection
27
Examples: Automated Visual Inspection
Results of automated reading of the plate content by the system
28
Example of Radar Image
29
Satellite image
Volcano Kamchatka Peninsula, Russia
30
Satellite image
Volcano in Alaska
31
Medical Images:
MRI of normal brain
32
Medical Images:
X-ray knee
33
Medical Images: Ultrasound
Five-month Foetus (lungs, liver and bowel)
34
Astronomical images
35
Examples: MRI (Radio Band)
36
Examples: Ultrasound Imaging
37
Fundamental Steps in DIP
Annotations from the block diagram:
► Improving the appearance
► Result is more suitable than the original
► Extracting image components
► Partition an image into its constituent parts or objects
► Represent image for computer processing
38
Light and EM Spectrum
λ = c / ν,   E = h ν,   where h is Planck's constant.
39
Light and EM Spectrum
40
Light and EM Spectrum
► Monochromatic light: void of color
Intensity is the only attribute, from black to white
Monochromatic images are referred to as gray-scale
images
41
Digital Image Fundamentals
► HUMAN Vision
42
Image Acquisition
Transform illumination energy into digital images
43
Image Acquisition Using a Single Sensor
44
Image Acquisition Using Sensor Strips
45
Image Acquisition Process
46
A Simple Image Formation Model
f(x, y) = i(x, y) · r(x, y), where i is the illumination and r the reflectance
47
Some Typical Ranges of illumination
► Illumination
Lumen — A unit of light flow or luminous flux
Lumen per square meter (lm/m2) — The metric unit of measure
for illuminance of a surface
On a cloudy day, the sun may produce less than 10,000 lm/m2 of
illumination on the surface of the Earth
48
Some Typical Ranges of Reflectance
► Reflectance
Image Sampling and Quantization
► Sampling: digitizing the coordinate values
► Quantization: digitizing the amplitude values
50
Image Sampling and Quantization
51
Representing Digital Images
52
Representing Digital Images
53
Representing Digital Images
54
Representing Digital Images
b=M×N×k
55
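As a quick check of the storage formula b = M × N × k, here is a short sketch (Python assumed; the function name and example sizes are illustrative, not from the slides):

# Number of bits needed to store an M x N image with k bits per pixel.
def image_storage_bits(M: int, N: int, k: int) -> int:
    return M * N * k

# Example: a 1024 x 1024 image with 8-bit pixels.
bits = image_storage_bits(1024, 1024, 8)
print(bits, "bits =", bits // 8, "bytes")   # 8388608 bits = 1048576 bytes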
Representing Digital Images
56
Spatial and Intensity Resolution
► Spatial resolution
— A measure of the smallest discernible detail in an image
— commonly expressed in line pairs per unit distance, dots (pixels) per
unit distance, or dots per inch (dpi)
► Intensity resolution
— The smallest discernible change in intensity level
— commonly expressed as a number of bits: 8 bits, 12 bits, 16 bits, etc.
57
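A minimal sketch of reducing intensity resolution (NumPy assumed; the function name and test image are illustrative): requantizing an 8-bit image to k bits coarsens the smallest discernible change in intensity.

import numpy as np

def requantize(img_8bit: np.ndarray, k: int) -> np.ndarray:
    """Reduce an 8-bit gray-scale image to k bits per pixel (k <= 8)."""
    step = 256 // (2 ** k)
    # Map each pixel to the low end of its quantization bin.
    return (img_8bit // step) * step

img = np.arange(0, 256, dtype=np.uint8).reshape(16, 16)   # synthetic intensity ramp
print(np.unique(requantize(img, 3)).size)                 # 8 distinct levels for k = 3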
Spatial and Intensity Resolution
58
Spatial and Intensity Resolution
59
Spatial and Intensity Resolution
60
Image Interpolation
http://www.dpreview.com/learn/?/key=interpolation
61
Image Interpolation:
Nearest Neighbor Interpolation
f1(x2, y2) = f(round(x2), round(y2)) = f(x1, y1)
f1(x3, y3) = f(round(x3), round(y3)) = f(x1, y1)
62
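A sketch of nearest-neighbor interpolation applied to image resizing (NumPy assumed; function and variable names are illustrative, not from the slides): each output pixel takes the value of the closest input pixel, i.e. f(round(x), round(y)).

import numpy as np

def resize_nearest(img: np.ndarray, new_h: int, new_w: int) -> np.ndarray:
    h, w = img.shape
    # For every output coordinate, pick the nearest source coordinate.
    rows = np.clip(np.round(np.arange(new_h) * h / new_h).astype(int), 0, h - 1)
    cols = np.clip(np.round(np.arange(new_w) * w / new_w).astype(int), 0, w - 1)
    return img[rows[:, None], cols[None, :]]

img = np.arange(16, dtype=np.uint8).reshape(4, 4)
print(resize_nearest(img, 8, 8))   # each input pixel is repeated in the enlarged image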
Image Interpolation:
Bilinear Interpolation
f2(x, y) = (1 - a)(1 - b) f(l, k) + a (1 - b) f(l + 1, k)
         + (1 - a) b f(l, k + 1) + a b f(l + 1, k + 1),
where l = floor(x), k = floor(y), a = x - l, b = y - k.
63
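A sketch of the bilinear formula above evaluated at a single point (x, y), with l = floor(x), k = floor(y), a = x - l, b = y - k (NumPy assumed; names are illustrative):

import numpy as np

def bilinear(img: np.ndarray, x: float, y: float) -> float:
    """Bilinear interpolation of img at the (non-integer) location (x, y)."""
    l, k = int(np.floor(x)), int(np.floor(y))
    a, b = x - l, y - k
    # Clamp so the four neighbors stay inside the image.
    l1, k1 = min(l + 1, img.shape[0] - 1), min(k + 1, img.shape[1] - 1)
    return ((1 - a) * (1 - b) * img[l, k] + a * (1 - b) * img[l1, k]
            + (1 - a) * b * img[l, k1] + a * b * img[l1, k1])

img = np.array([[10.0, 20.0], [30.0, 40.0]])
print(bilinear(img, 0.5, 0.5))   # 25.0, the average of the four neighbors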
Image Interpolation:
Bicubic Interpolation
► The intensity value assigned to point (x,y) is obtained by
the following equation
f3(x, y) = Σ_{i=0..3} Σ_{j=0..3} a_ij · x^i · y^j
64
Examples: Interpolation
65
Examples: Interpolation
66
Examples: Interpolation
67
Examples: Interpolation
68
Examples: Interpolation
69
Examples: Interpolation
70
Examples: Interpolation
71
Examples: Interpolation
72
Basic Relationships Between Pixels
► Neighborhood
► Adjacency
► Connectivity
► Paths
73
Basic Relationships Between Pixels
74
Basic Relationships Between Pixels
► Adjacency
Let V be the set of intensity values used to define adjacency.
4-adjacency: two pixels p and q with values from V are 4-adjacent if q is in the set N4(p).
8-adjacency: two pixels p and q with values from V are 8-adjacent if q is in the set N8(p).
75
Basic Relationships Between Pixels
► Adjacency
Let V be the set of intensity values
m-adjacency: two pixels p and q with values from V are m-adjacent if
(i) q is in the set N4(p), or
(ii) q is in the set ND(p) and the set N4(p) ∩ N4(q) has no pixels whose
values are from V.
76
Basic Relationships Between Pixels
► Path
A (digital) path (or curve) from pixel p with coordinates (x0, y0) to pixel
q with coordinates (xn, yn) is a sequence of distinct pixels with
coordinates (x0, y0), (x1, y1), ..., (xn, yn), where pixels (xi, yi) and
(xi-1, yi-1) are adjacent for 1 ≤ i ≤ n.
We can define 4-, 8-, and m-paths based on the type of adjacency
used.
77
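A small sketch of these definitions (Python; function names are illustrative and V = {1, 2} follows the examples on the next slides): it checks whether a sequence of pixel coordinates forms a valid 4-path; an 8-path check would only change the adjacency test.

def is_4_adjacent(p, q):
    """p, q are (row, col) tuples; 4-adjacent means they differ by 1 in exactly one coordinate."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1]) == 1

def is_4_path(img, coords, V):
    """True if coords is a 4-path: every value is in V and consecutive pixels are 4-adjacent."""
    if any(img[r][c] not in V for r, c in coords):
        return False
    return all(is_4_adjacent(coords[i - 1], coords[i]) for i in range(1, len(coords)))

img = [[0, 1, 1],
       [0, 2, 0],
       [0, 0, 1]]
print(is_4_path(img, [(0, 1), (0, 2)], V={1, 2}))   # True
print(is_4_path(img, [(0, 2), (1, 1)], V={1, 2}))   # False: diagonal neighbors are not 4-adjacent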
Examples: Adjacency and Path
V = {1, 2}
0 1 1
0 2 0
0 0 1
78
Examples: Adjacency and Path
V = {1, 2}
0 1 1
0 2 0
0 0 1
8-adjacent
79
Examples: Adjacency and Path
V = {1, 2}
0 1 1
0 2 0
0 0 1
8-adjacent m-adjacent
80
Examples: Adjacency and Path
V = {1, 2}
Pixel coordinates:
(1,1) (1,2) (1,3)
(2,1) (2,2) (2,3)
(3,1) (3,2) (3,3)
Pixel values:
0 1 1
0 2 0
0 0 1
8-adjacent m-adjacent
8-paths from (1,3) to (3,3):
(i) (1,3), (1,2), (2,2), (3,3)
(ii) (1,3), (2,2), (3,3)
The m-path from (1,3) to (3,3):
(1,3), (1,2), (2,2), (3,3)
81
Basic Relationships Between Pixels
► Connected in S
Let S represent a subset of pixels in an image. Two pixels
p with coordinates (x0, y0) and q with coordinates (xn, yn)
are said to be connected in S if there exists a path between them
consisting entirely of pixels in S.
82
Basic Relationships Between Pixels
83
Basic Relationships Between Pixels
The boundary of the region R is the set of pixels in the region that
have one or more neighbors that are not in R.
If R happens to be an entire image, then its boundary is defined as the
set of pixels in the first and last rows and columns of the image.
84
Question 1
Region 1:
1 1 1
1 0 1
0 1 0
Region 2:
0 0 1
1 1 1
1 1 1
85
Question 2
Part 1:
1 1 1
1 0 1
0 1 0
Part 2:
0 0 1
1 1 1
1 1 1
86
► In the following arrangement of pixels, the two
regions (of 1s) are disjoint (if 4-adjacency is used)
Region 1:
1 1 1
1 0 1
0 1 0
Region 2:
0 0 1
1 1 1
1 1 1
87
► In the following arrangement of pixels, the two
regions (of 1s) are disjoint (if 4-adjacency is used)
foreground:
1 1 1
1 0 1
0 1 0
background:
0 0 1
1 1 1
1 1 1
88
Question 3
0 0 0 0 0
0 1 1 0 0
0 1 1 0 0
0 1 1 1 0
0 1 1 1 0
0 0 0 0 0
89
Question 4
0 0 0 0 0
0 1 1 0 0
0 1 1 0 0
0 1 1 1 0
0 1 1 1 0
0 0 0 0 0
90
Distance Measures
For pixels p, q, and z with coordinates (x, y), (s, t), and (v, w), D is a distance function (metric) if:
a. D(p, q) ≥ 0, with D(p, q) = 0 iff p = q
b. D(p, q) = D(q, p)
c. D(p, z) ≤ D(p, q) + D(q, z)
91
Distance Measures
a. Euclidean distance:
De(p, q) = [(x - s)^2 + (y - t)^2]^(1/2), where p = (x, y) and q = (s, t)
92
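A sketch computing the Euclidean distance De(p, q) directly from the definition above (Python; the function name is illustrative):

import math

def euclidean_distance(p, q):
    """De(p, q) = sqrt((x - s)^2 + (y - t)^2) for p = (x, y), q = (s, t)."""
    (x, y), (s, t) = p, q
    return math.sqrt((x - s) ** 2 + (y - t) ** 2)

print(euclidean_distance((0, 0), (3, 4)))   # 5.0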
Question 5
0 0 0 0 0
0 0 1 1 0
0 1 1 0 0
0 1 0 0 0
0 0 0 0 0
0 0 0 0 0
93
Question 6
0 0 0 0 0
0 0 1 1 0
0 1 1 0 0
0 1 0 0 0
0 0 0 0 0
0 0 0 0 0
94
Question 7
0 0 0 0 0
0 0 1 1 0
0 1 1 0 0
0 1 0 0 0
0 0 0 0 0
0 0 0 0 0
95
Question 8
0 0 0 0 0
0 0 1 1 0
0 0 1 0 0
0 1 0 0 0
0 0 0 0 0
0 0 0 0 0
96
Introduction to Mathematical Operations in
DIP
► Array vs. Matrix Operation
97
Introduction to Mathematical Operations in
DIP
► Linear vs. Nonlinear Operation
H[f(x, y)] = g(x, y)
H[a_i f_i(x, y) + a_j f_j(x, y)]
= H[a_i f_i(x, y)] + H[a_j f_j(x, y)]        (additivity)
= a_i H[f_i(x, y)] + a_j H[f_j(x, y)]        (homogeneity)
= a_i g_i(x, y) + a_j g_j(x, y)
H is said to be a linear operator;
H is said to be a nonlinear operator if it does not meet the
above qualification.
98
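A quick numeric check of the additivity/homogeneity condition (a sketch; the choice of the sum and max operators as examples is an assumption, not from the slides, and NumPy is assumed): summing all pixels is a linear operator, while taking the maximum is not.

import numpy as np

f1 = np.array([[0, 2], [4, 6]], dtype=float)
f2 = np.array([[6, 4], [2, 0]], dtype=float)
a1, a2 = 2.0, 3.0

# H = sum of all pixels: H(a1*f1 + a2*f2) equals a1*H(f1) + a2*H(f2)  -> linear
print(np.sum(a1 * f1 + a2 * f2), a1 * np.sum(f1) + a2 * np.sum(f2))   # 60.0 60.0

# H = max over all pixels: the two sides differ in general  -> nonlinear
print(np.max(a1 * f1 + a2 * f2), a1 * np.max(f1) + a2 * np.max(f2))   # 18.0 30.0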
Arithmetic Operations
99
Example: Addition of Noisy Images for Noise Reduction
100
Example: Addition of Noisy Images for Noise Reduction
Let g_i(x, y) = f(x, y) + n_i(x, y), where the noise terms n_i(x, y) have zero mean
and are uncorrelated from image to image.

Averaged image:
g_bar(x, y) = (1/K) Σ_{i=1..K} g_i(x, y)

Expected value of the average:
E{g_bar(x, y)} = (1/K) Σ_{i=1..K} E{f(x, y) + n_i(x, y)} = f(x, y)

Variance of the average:
σ²_g_bar(x, y) = (1/K²) Σ_{i=1..K} σ²_n_i(x, y) = (1/K) σ²_n(x, y)

As K increases, the variability of the pixel values of g_bar decreases, so the
averaged image approaches f(x, y).
101
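A sketch of the result above (NumPy assumed; the flat test image and noise level are synthetic): averaging K independently corrupted copies of the same scene reduces the noise standard deviation by roughly sqrt(K).

import numpy as np

rng = np.random.default_rng(0)
f = np.full((64, 64), 100.0)                 # noise-free image f(x, y)
K, sigma = 64, 20.0

# g_i(x, y) = f(x, y) + n_i(x, y), with zero-mean Gaussian noise n_i
noisy = [f + rng.normal(0.0, sigma, f.shape) for _ in range(K)]
g_bar = np.mean(noisy, axis=0)               # averaged image

print(np.std(noisy[0] - f))                  # about 20: noise in a single image
print(np.std(g_bar - f))                     # about 20 / sqrt(64) = 2.5 after averaging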
Example: Addition of Noisy Images for Noise Reduction
102
103
An Example of Image Subtraction: Mask Mode Radiography
104
105
An Example of Image Multiplication
106
Set and Logical Operations
107
Set and Logical Operations
► Let A be the set of elements of a gray-scale image
The elements of A are triplets of the form (x, y, z), where
x and y are spatial coordinates and z denotes the intensity
at the point (x, y):
A = {(x, y, z) | z = f(x, y)}
► The complement of A is denoted Ac
Ac = {(x, y, K - z) | (x, y, z) ∈ A},
where K = 2^k - 1 and k is the number of intensity bits used to represent z
108
Set and Logical Operations
► The union of two gray-scale images (sets) A and B is
defined as the set
A ∪ B = {max_z(a, b) | a ∈ A, b ∈ B}
109
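A sketch of the gray-scale union defined above, implemented as an elementwise maximum (NumPy assumed; the two small images are illustrative):

import numpy as np

A = np.array([[10, 200], [50, 90]], dtype=np.uint8)
B = np.array([[30, 100], [60, 80]], dtype=np.uint8)

# Union of two gray-scale images: pointwise maximum of corresponding intensities.
union = np.maximum(A, B)
print(union)    # [[ 30 200]
                #  [ 60  90]]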
Set and Logical Operations
110
Set and Logical Operations
111
Spatial Operations
► Single-pixel operations
Alter the values of an image's pixels one at a time, based only on their intensity:
s = T(z)
(see the sketch below for an example)
112
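A minimal sketch of a single-pixel operation (an assumed example, not necessarily the one on the original slide; NumPy assumed): the negative transformation T(z) = 255 - z applied independently to every pixel of an 8-bit image.

import numpy as np

def negative(img_8bit: np.ndarray) -> np.ndarray:
    """Single-pixel operation s = T(z) = 255 - z applied to an 8-bit image."""
    return 255 - img_8bit

img = np.array([[0, 64], [128, 255]], dtype=np.uint8)
print(negative(img))    # [[255 191]
                        #  [127   0]]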
Spatial Operations
► Neighborhood operations
113
Spatial Operations
► Neighborhood operations
114
Geometric Spatial Transformations
► Spatial transformation of coordinates: (x, y) = T{(v, w)}
► Intensity interpolation that assigns intensity values to the spatially
transformed pixels
► Affine transform
[x  y  1] = [v  w  1] · T,  where
T = [ t11  t12  0
      t21  t22  0
      t31  t32  1 ]
115
116
Intensity Assignment
► Forward Mapping
(x, y) = T{(v, w)}
It’s possible that two or more pixels can be transformed to the same
location in the output image.
► Inverse Mapping
(v, w) = T^(-1){(x, y)}
The nearest input pixels are used to determine the intensity of the output
pixel value.
Inverse mappings are more efficient to implement than forward
mappings.
117
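A sketch of inverse mapping for a geometric transform (NumPy assumed; the rotation angle, test image, and function name are illustrative): each output location (x, y) is mapped back through T^(-1) into the input image, and the nearest input pixel supplies the intensity.

import numpy as np

def rotate_inverse_nearest(img: np.ndarray, theta: float) -> np.ndarray:
    """Rotate img by theta radians about its center using inverse mapping + nearest neighbor."""
    h, w = img.shape
    out = np.zeros_like(img)
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    cos_t, sin_t = np.cos(theta), np.sin(theta)
    ys, xs = np.mgrid[0:h, 0:w]                       # output coordinates
    # Inverse mapping: find the source (row, col) that lands on each output pixel.
    src_row = cos_t * (ys - cy) + sin_t * (xs - cx) + cy
    src_col = -sin_t * (ys - cy) + cos_t * (xs - cx) + cx
    ri, ci = np.round(src_row).astype(int), np.round(src_col).astype(int)
    valid = (ri >= 0) & (ri < h) & (ci >= 0) & (ci < w)
    out[valid] = img[ri[valid], ci[valid]]            # nearest-neighbor intensity assignment
    return out

img = np.eye(5, dtype=np.uint8) * 255
print(rotate_inverse_nearest(img, np.pi / 2))         # the diagonal line appears rotated by 90 degrees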
Example: Image Rotation and Intensity
Interpolation
118
Image Registration
119
Image Registration
x = c1·v + c2·w + c3·v·w + c4
y = c5·v + c6·w + c7·v·w + c8
120
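A sketch of estimating the eight registration coefficients c1..c8 from four pairs of tie points by least squares (NumPy assumed; the tie-point values are made up for illustration):

import numpy as np

# Four tie points: (v, w) in the reference image and the matching (x, y) in the input image.
ref = np.array([(0.0, 0.0), (0.0, 10.0), (10.0, 0.0), (10.0, 10.0)])   # (v, w)
inp = np.array([(1.0, 2.0), (1.5, 12.0), (11.0, 2.5), (12.0, 13.0)])   # (x, y)

# Each tie point gives x = c1*v + c2*w + c3*v*w + c4 and y = c5*v + c6*w + c7*v*w + c8.
A = np.column_stack([ref[:, 0], ref[:, 1], ref[:, 0] * ref[:, 1], np.ones(len(ref))])
cx, *_ = np.linalg.lstsq(A, inp[:, 0], rcond=None)    # c1, c2, c3, c4
cy, *_ = np.linalg.lstsq(A, inp[:, 1], rcond=None)    # c5, c6, c7, c8
print(cx, cy)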
Image Registration
121
Image Transform
f(x, y) = Σ_{u=0..M-1} Σ_{v=0..N-1} T(u, v) s(x, y, u, v)
123
Image Transform
124
Example: Image Denoising by Using DCT Transform
125
Forward Transform Kernel
T(u, v) = Σ_{x=0..M-1} Σ_{y=0..N-1} f(x, y) r(x, y, u, v)

With the Fourier kernel r(x, y, u, v) = e^(-j2π(ux/M + vy/N)), this becomes

T(u, v) = Σ_{x=0..M-1} Σ_{y=0..N-1} f(x, y) e^(-j2π(ux/M + vy/N))

and the corresponding inverse transform is

f(x, y) = (1/(M·N)) Σ_{u=0..M-1} Σ_{v=0..N-1} T(u, v) e^(j2π(ux/M + vy/N))
128
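A sketch verifying that the Fourier kernel above matches NumPy's 2-D DFT (an assumed check, not part of the slides): summing f(x, y) e^(-j2π(ux/M + vy/N)) over all pixels reproduces np.fft.fft2.

import numpy as np

M, N = 4, 4
rng = np.random.default_rng(1)
f = rng.random((M, N))

x = np.arange(M).reshape(M, 1, 1, 1)
y = np.arange(N).reshape(1, N, 1, 1)
u = np.arange(M).reshape(1, 1, M, 1)
v = np.arange(N).reshape(1, 1, 1, N)

# T(u, v) = sum_x sum_y f(x, y) * exp(-j*2*pi*(u*x/M + v*y/N))
kernel = np.exp(-2j * np.pi * (u * x / M + v * y / N))
T = np.sum(f[:, :, None, None] * kernel, axis=(0, 1))

print(np.allclose(T, np.fft.fft2(f)))   # True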
Probabilistic Methods
Let z_i, i = 0, 1, 2, ..., L-1, denote the values of all possible intensities
in an M × N digital image. The probability, p(z_k), of intensity level
z_k occurring in a given image is estimated as
p(z_k) = n_k / (M·N),
where n_k is the number of times that intensity z_k occurs in the image.
These probabilities satisfy
Σ_{k=0..L-1} p(z_k) = 1
130
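A sketch estimating p(z_k) = n_k / (M·N) from an image histogram and checking that the probabilities sum to 1 (NumPy assumed; the test image is synthetic):

import numpy as np

rng = np.random.default_rng(2)
img = rng.integers(0, 256, size=(64, 64))        # synthetic 8-bit image, M = N = 64
L = 256

counts = np.bincount(img.ravel(), minlength=L)   # n_k: number of pixels with intensity z_k
p = counts / img.size                            # p(z_k) = n_k / (M * N)

print(p.sum())                                   # 1.0
print(p.argmax(), p.max())                       # most frequent intensity and its probability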
Example: Comparison of Standard Deviation
Values
131
Homework
http://cramer.cs.nmt.edu/~ip/assignments.html
132