A Novel Face Recognition Measure using Normalized Unmatched Points
Aditya Nigam
Supervisor: Dr. Phalguni Gupta
Department of Computer Science and Engineering
Indian Institute of Technology Kanpur
May 13, 2009
Table of Contents
1. Problem Definition: Statement and Motivation
2. Related Work: HD and its Variants
3. Proposed NUP Measure: Pre-Processing; Defining the NUP Measure
4. Efficient Computation of NUP: Algorithm; Analysis
5. Experimental Results and Analysis: Setup; Results
6. Conclusion
Aditya Nigam (M.Tech. CSE) Normalized Unmatched Points May 13, 2009 2 / 58
Problem Definition
Face image acquisition under identical physical conditions is not always possible.
Different face recognition algorithms perform poorly in typical varying environments.
Variations in illumination, pose, lighting conditions, expression, background and scale cause large changes in pixel intensities, and hence the performance of many algorithms is severely affected.
We therefore require an algorithm that is robust to small amounts of such variation.
Motivation
Edge images are less affected by illumination variation, but they do not carry the overall facial appearance; they contain primarily the structure of the face.
Gray images cannot be used directly, as they are affected by illumination variation.
The NUP measure can compare gray images and is found to be robust to slight variations in pose, expression and illumination.
Hausdorff Distance (HD)
The conventional Hausdorff distance is a dissimilarity measure between two sets of points.
Let A = {a_1, a_2, ..., a_m} and B = {b_1, b_2, ..., b_n} be two sets of points. The undirected Hausdorff distance [8] between A and B is defined as:

HD(A, B) = HD(B, A) = max(hd(A, B), hd(B, A))

where hd(A, B) is the directed Hausdorff distance, defined by:

Directed hd
hd(A, B) = max_{a∈A} min_{b∈B} ||a − b||

and ||·|| is the norm of the vector.
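As an illustrative sketch (not code from the thesis), the directed and undirected distances above can be computed straight from their definitions; the function names and sample point sets are hypothetical:

```python
import math

def hd(A, B):
    """Directed Hausdorff distance: the largest of the
    nearest-neighbor distances from each point of A into B."""
    return max(min(math.dist(a, b) for b in B) for a in A)

def HD(A, B):
    """Undirected Hausdorff distance: the larger of the two
    directed distances."""
    return max(hd(A, B), hd(B, A))

# Example: hd is asymmetric, HD symmetrizes it.
A = [(0, 0), (1, 0)]
B = [(0, 0), (4, 0)]
print(hd(A, B), hd(B, A), HD(A, B))
```

Note that hd(A, B) and hd(B, A) generally differ, which is why the undirected form takes their maximum.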
HD Example and Correspondence

Figure: Example of hd(A, B) for SET A = {1, 2, 3} and SET B = {a, b}. The pairwise distances are 1-a: 10, 1-b: 14, 2-a: 8, 2-b: 10, 3-a: 12, 3-b: 15. Taking the minimum over B for each point of A gives the correspondences 1→a (10), 2→a (8) and 3→a (12); the maximum of these minima, 12 for pair 3-a, is hd(A, B). This is the worst correspondence, between the most dissimilar points.
PHD
The HD measure does not work well when some part of the object is occluded or missing.
For partial matching, the partial Hausdorff distance (PHD) was introduced.
The undirected PHD is defined as:

PHD(A, B) = PHD(B, A) = max(phd(A, B), phd(B, A))

where phd(A, B) is the directed PHD, defined by:

Directed phd
phd(A, B) = K-th max_{a∈A} min_{b∈B} ||a − b||

Both HD and PHD work on edge maps and can tolerate a small amount of local and non-rigid distortion.
MHD
MHD [15] was introduced; it uses averaging, a linear function, which makes it less sensitive to noise.
The undirected MHD is defined as:

MHD(A, B) = MHD(B, A) = max(mhd(A, B), mhd(B, A))

where mhd(A, B) is the directed MHD, defined by:

Directed mhd
mhd(A, B) = (1/N_a) Σ_{a∈A} min_{b∈B} ||a − b||

where N_a is the number of points in set A.
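A minimal sketch of the two directed variants above, again computed directly from the definitions (function names are illustrative, not from the thesis):

```python
import math

def min_dists(A, B):
    # Nearest-neighbor distance from each point of A into B.
    return [min(math.dist(a, b) for b in B) for a in A]

def phd(A, B, K):
    """Directed partial Hausdorff distance: the K-th largest
    (1-indexed) of the nearest-neighbor distances, so outliers
    beyond rank K are ignored."""
    return sorted(min_dists(A, B), reverse=True)[K - 1]

def mhd(A, B):
    """Directed modified Hausdorff distance: the average of the
    nearest-neighbor distances."""
    d = min_dists(A, B)
    return sum(d) / len(d)
```

With K = 1, phd reduces to the plain directed Hausdorff distance; larger K discards the worst correspondences, which is what gives PHD its tolerance to occlusion.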
M2HD
MHD is improved to M2HD [10] by adding three more parameters:

Parameters
Neighborhood function (N_a^B): the neighborhood of the point a in set B.
Indicator variable (I): I = 1 if a's corresponding point lies in N_a^B, else I = 0.
Associated penalty (P): if I = 0, penalize with this penalty.

The directed M2HD is defined as:

Directed m2hd
m2hd(A, B) = (1/N_a) Σ_{a∈A} d(a, B)

where d(a, B) is defined as:

d(a, B) = max[ I · min_{b∈N_a^B} ||a − b||, (1 − I) · P ]
SWHD and SW2HD
For better discriminative power, the HD and M2HD measures were improved by assigning a weight to every point according to its spatial information.
Crucial facial feature points, such as the eyes and mouth, are approximated by rectangular windows and are given more importance than other points.
The directed SWHD and SW2HD [11] are defined as:

Directed swhd and sw2hd
swhd(A, B) = max_{a∈A} [ w(b) min_{b∈B} ||a − b|| ]

sw2hd(A, B) = (1/N_a) Σ_{a∈A} [ w(b) min_{b∈B} ||a − b|| ]
Spatial Weighing Function
where w(x) is defined as:

Weighing Function
w(x) = 1,  x ∈ important facial region
w(x) = W,  x ∈ unimportant facial region
w(x) = 0,  x ∈ background region

Figure: Spatial Weighing Function
SEWHD and SEW2HD
A rough estimation of facial features cannot fully reflect the exact structure of the human face.
Eigenfaces appear as light and dark areas arranged in a specific pattern: in regions where the difference among the training images is large, the corresponding regions of the eigenfaces have large magnitude.

Eigen Weighing
Eigenfaces can be used as a weighing function because they represent the most significant variations in the set of training face images.
Eigen Faces
Figure: Eigenfaces
Defining SEWHD and SEW2HD
The proposed SEWHD and SEW2HD [12] are defined as:

Directed sewhd and sew2hd
sewhd(A, B) = max_{a∈A} [ w_e(b) min_{b∈B} ||a − b|| ]

sew2hd(A, B) = (1/N_a) Σ_{a∈A} [ w_e(b) min_{b∈B} ||a − b|| ]

where w_e(x) is the eigen weight function generated by the first eigenvector.
H_g and H_pg
Edge images lose most of the important facial features, which are very useful for facial discrimination.
The H_g and H_pg measures [13] work on quantized images and are found to be robust to slight variations in pose, expression and illumination.

Quantized Images
Images quantized with n ≥ 5 retain the perceptual appearance and the intrinsic facial feature information that resides in the gray values (as shown in the figure below).

Figure: Quantized faces
Defining H_g and H_pg
H_g and H_pg are defined on quantized gray images as:

Directed h_g and h_pg
h_g(A, B) = max_{i=0..2^n−1} max_{a∈A_i} d(a, B_i)

h_pg(A, B) = K-th max_{i=0..2^n−1, a∈A_i} d(a, B_i)

where d(a, B_i) is defined as:

d(a, B_i) = min_{b∈B_i} ||a − b||,  if B_i is non-empty
d(a, B_i) = L,  otherwise

A_i and B_i are the sets of pixels in quantized images A and B having quantized gray value i.
NUP Measure
NUP
The NUP measure can be applied to gt-transformed images obtained from gray-scale facial images.
The NUP measure is similar to the HD-based measures but is computationally less expensive and more accurate.
NUP also shows robustness against slight variations in pose, expression and illumination.
Transformation
A pixel's relative gray value within its neighborhood can be more stable than its own gray value.
The SK-transformation [14] provides some robustness against illumination variation and local non-rigid distortions by converting gray-scale images into transformed images that preserve the intensity distribution.
Every pixel is represented by an 8-element vector that stores the sign of the first-order derivative with respect to each of its 8 neighbors.

Property of SK-transformed images
The gray values of pixels change across different poses of the same subject, but their corresponding vectors do not change to a great extent.
Example

Figure: Example of the transformation. The 3×3 gray-value neighborhood

12  170  82
122 170 220
26  178 211

is mapped to the signs of the first-order derivative of each neighbor with respect to the center pixel:

-1   0  -1
-1   X   1
-1   1   1

which the slide lists as the 8-element transformed vector (-1, 1, 1, 1, -1, -1, -1, 0).

Problem
The above property holds when the gray values of neighboring pixels are not too close to each other.
Usually we have small variations in the gray values (e.g. in the background, in facial features, etc.), where the above property fails to hold.
Observation
Figure: Gray-value spectrum.
Gray levels are hardly distinguishable (similar) within a range of 5 units.
Improvement

Basic Comparator (for a gray level X, comparing a value Y):
X = Y  for Y = X
X < Y  for Y ∈ (X, 255]
X > Y  for Y ∈ [0, X)

gt-Comparator:
X = Y  for Y ∈ [X − gt, X + gt]
X < Y  for Y ∈ (X + gt, 255]
X > Y  for Y ∈ [0, X − gt)

Where
gt is the gray value tolerance, gt ≥ 0.
X is a gray level, not merely a number.
Gray level X is neither greater than gray level (X − 1) nor less than gray level (X + 1); ideally they should be considered similar.
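The gt-Comparator above can be sketched in a few lines of Python (a hypothetical helper, not code from the thesis); it reports how a gray level Y compares to a reference gray level X under tolerance gt:

```python
def gt_compare(x, y, gt=5):
    """gt-Comparator: compare gray level y against reference x.
    Returns '=' if y lies in [x - gt, x + gt],
            '<' if y lies in (x + gt, 255]  (x is less than y),
            '>' if y lies in [0, x - gt)    (x is greater than y)."""
    if abs(x - y) <= gt:
        return '='
    return '<' if y > x else '>'
```

With gt = 0 this degenerates to the basic comparator, where only an exact match counts as equal.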
Diagrammatically

Figure: Basic Comparator. Gray level X splits the spectrum into the region less than X, the single point equal to X, and the region greater than X.

Figure: gt-Comparator. Gray level X splits the spectrum into the region less than X (below X − gt), the region equal to X (from X − gt to X + gt), and the region greater than X (above X + gt).
gt-Transformation
Any pixel a is represented by an 8-element vector V(a) whose elements are drawn from the set {0, 1, 2}.
The decimal equivalent of V(a) is called the transformed value of pixel a, ranging from 0 to 6560 (= 3^8 − 1).

Stability
In a typical varying environment, the transformed value of a pixel remains more stable than its corresponding gray value.
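A sketch of how such a transformed value could be computed for one pixel from its 3×3 neighborhood, assuming the gt-Comparator encoding used later (less than → 0, equal → 1, greater than → 2) and an illustrative row-major digit order; the function name and digit order are assumptions, not fixed by the slides:

```python
def gt_transform_value(window, gt=5):
    """Decimal transformed value (0..6560) of the center pixel of a
    3x3 gray-value window. Each of the 8 neighbors yields one ternary
    digit: 0 if it is below center - gt, 1 if within +/- gt of the
    center, 2 if above center + gt."""
    c = window[1][1]
    digits = []
    for i in range(3):
        for j in range(3):
            if i == 1 and j == 1:
                continue  # skip the center pixel itself
            n = window[i][j]
            if n < c - gt:
                digits.append(0)
            elif n > c + gt:
                digits.append(2)
            else:
                digits.append(1)
    # Decimal equivalent of the 8-digit base-3 vector V(a).
    tval = 0
    for d in digits:
        tval = tval * 3 + d
    return tval
```

A perfectly flat neighborhood gives the all-ones vector (11111111 in base 3, i.e. 3280), and a center darker than all of its neighbors gives the all-twos vector 6560, the maximum transformed value.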
gt-Transformed Images
Encoding
Less than (<): RED, i.e. [0]; Equal to (=): BLUE, i.e. [1]; Greater than (>): GREEN, i.e. [2].
Figure: gt-Transformed images
Notations

Parameter: Description
A | B: the corresponding gt-transformed images of size (r − 2) × (c − 2); boundary pixels are ignored.
N_a^B: neighborhood of pixel a in image B.
V(a): the 8-element vector at pixel a.
tval_a: the decimal equivalent of V(a), i.e. the transformed value of pixel a.
NUP(A, B): undirected Normalized Unmatched Points measure between A and B.
nup(A, B): directed Normalized Unmatched Points measure, when A is compared with B.
p: order of the norm.
N_a: total number of pixels in image A.
N_AB^U: total number of unmatched pixels of A, when A is compared with B.
Compare(A, B): compares image A to image B, and returns N_AB^U.
Match(a, B): matches a pixel a with B, and returns 1 if matched or 0 if unmatched.
Defining N_a^B
Neighborhood of pixel a in image B
Pixels within a distance of d√2 from pixel a are considered to be in its neighborhood.

Neighborhood
N_a^B = { b ∈ B : ||a − b|| ≤ d√2 }
Defining Compare(A, B) and N_AB^U
Compare(A, B) compares two gt-transformed images A and B.
It returns N_AB^U (the total number of unmatched pixels of A, when A is compared with B), defined as:

Unmatched Points
N_AB^U = Σ_{a∈A} (1 − Match(a, B))
Defining Match(a, B)
Match(a, B) matches a pixel a with a gt-transformed image B.
It returns 1 if there is a pixel within the neighborhood of a in image B having the same gt-transformed value (i.e. matched); otherwise it returns 0 (i.e. unmatched).
Match(a, B) can be defined as:

Matching
Match(a, B) = 1, if ∃ b ∈ N_a^B such that V(a) = V(b)  [i.e. matched]
Match(a, B) = 0, otherwise
Defining NUP(A, B) and nup(A, B)
NUP(A, B) is defined as:

Undirected NUP
NUP(A, B) = ||(nup(A, B), nup(B, A))||_p

where nup(A, B) is defined as:

Directed nup
nup(A, B) = N_AB^U / N_a

and ||·||_p is the p-th norm.
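Putting the last three definitions together, a naive sketch in Python, with gt-transformed images represented as 2-D lists of transformed values of equal size; all names are illustrative, and the brute-force neighborhood scan is the slow method that the efficient algorithm later replaces:

```python
import math

def match(i, j, A, B, d=1):
    """Match(a, B): 1 if some pixel b of B within distance d*sqrt(2)
    of position (i, j) has the same transformed value as A[i][j]."""
    t, R = A[i][j], math.ceil(d * math.sqrt(2))
    for bi in range(max(0, i - R), min(len(B), i + R + 1)):
        for bj in range(max(0, j - R), min(len(B[0]), j + R + 1)):
            if math.hypot(bi - i, bj - j) <= d * math.sqrt(2) and B[bi][bj] == t:
                return 1
    return 0

def nup(A, B, d=1):
    """Directed nup(A, B): fraction of pixels of A with no match in B."""
    Na = len(A) * len(A[0])
    unmatched = sum(1 - match(i, j, A, B, d)
                    for i in range(len(A)) for j in range(len(A[0])))
    return unmatched / Na

def NUP(A, B, d=1, p=20):
    """Undirected NUP: p-th norm of the two directed values."""
    return (nup(A, B, d) ** p + nup(B, A, d) ** p) ** (1 / p)
```

With a large p such as the value 20 used in this work, the p-norm behaves almost like a maximum of the two directed values, mirroring the max used by the HD family.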
Some Properties of NUP and nup
Properties
1. NUP(A, B) = NUP(B, A).
2. If nup(A, B) = K, then K · N_a pixels of A do not have any pixel with the same transformed value within their neighborhood in B.
3. NUP(A, B) and nup(A, B) are always positive and normalized between 0 and 1.
4. NUP(A, B) and nup(A, B) are parameterized by gt, d and p.
Efficient Match(a, B)
Computing NUP(A, B) using the naive method requires O(r^2 c^2) time, which is prohibitively expensive.
To perform the Match(a, B) operation efficiently, an array of pointers to linked lists, BLIST, is created.

BLIST
It has 3^8 elements such that, for all i ∈ [0, 3^8 − 1], the i-th element points to a linked list of the pixels having transformed value i [14].
Data Structure BLIST

Figure: Data Structure BLIST. An array indexed by transformed value i, from 0 (00000000 in base 3) to 6560 (22222222 in base 3); the i-th entry points to the linked list of pixels having transformed value i.
Time Complexity
Preprocessing
Gray-scale images of size r × c are transformed into gt-transformed images. This is done once, and a single scan of the whole image is sufficient.
Time complexity is O(rc).

Processing
Constructing the data structure BLIST requires O(rc) time.
The Match function involves a linear search of a linked list of pixels.
The time taken by Match depends on the length of the list; assume k is the length of the largest linked list.
To compute NUP(A, B), the Match(a, B) function has to be called 2rc times, so the time required to compute NUP is O(krc).
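A minimal Python sketch of the BLIST idea, using a hash map in place of the 3^8-entry array of linked lists (the names are illustrative, not from the thesis):

```python
import math
from collections import defaultdict

def build_blist(B):
    """Index image B by transformed value: each value maps to the
    list of pixel coordinates carrying it (a stand-in for the
    3^8-element array of linked lists)."""
    blist = defaultdict(list)
    for i, row in enumerate(B):
        for j, t in enumerate(row):
            blist[t].append((i, j))
    return blist

def match_fast(i, j, t, blist, d=1):
    """Match via BLIST: instead of scanning a whole neighborhood,
    scan only the pixels of B that share the transformed value t and
    check whether one lies within distance d*sqrt(2) of (i, j)."""
    return int(any(math.hypot(bi - i, bj - j) <= d * math.sqrt(2)
                   for bi, bj in blist[t]))
```

Building the index is a single O(rc) pass; each match then touches only one list, whose length bounds the per-pixel cost, matching the O(krc) analysis above.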
Setup
Figure: Images produced after various phases
Testing Strategy
The whole database is treated as the testing set; each image of the testing set is matched with all other images, excluding itself, and the top n best matches are reported.
A match is announced if and only if a subject's image is matched with another pose of the same subject.

Recognition Rate
Recognition rate = (Number of matches) / ((Total number of images) × n)
Parameterized Analysis
Parameters
The NUP measure is parameterized primarily by two parameters, gt and d; the third parameter, p (the order of the norm), is set to 20 for this work.
Gray value tolerance gt can vary within the range [0, 5].
Neighborhood parameter d can vary within the range [1, 15].

Databases vs Parameters
Db, Nor | S, P, T | Time | gt, d, RR% [top 1] | gt, d, RR% [top 5] | Varying
ORL, N | 40, 10, 400 | 1.8 | 5, 8, 99.75 | 5, 10, 90.15 | Poses and Expressions
YALE, Y | 15, 11, 165 | 1.2 | 1, 1, 92.75 | 0, 2, 85.57 | Illumination and Expressions
BERN, N | 30, 10, 300 | 1.6 | 5, 5, 98.66 | 5, 8, 75.80 | Poses and Expressions
CALTECH, Y | 17, 20, 340 | 1.6 | 1, 2, 98.23 | 0, 2, 95.64 | Poses and Illumination
IITK, N | 149, 10, 1490 | 4.6 | 5, 5, 99.73 | 4, 5, 99.58 | Poses and Scale
Table: Databases vs Parameters (S, P, T appear to denote subjects, poses per subject and total images; Nor, whether the database is normalized)
Big Illumination Variation [use gt = 0]
Effect of high gt values under heavy illumination variation:
With higher gt values, more and more elements of V(a) start acquiring the value 1.
This boosts the blue value of pixels in the gt-transformed images.
Directional lights and heavy variation in illumination conditions may lift the blue value further, to the extent that blue starts dominating the gt-transformed image.
ORL: Pose and Expression Variations
ORL: top 1 [gt = 5, d = 8, RR = 99.75%]
ORL: top 5 [gt = 5, d = 10, RR = 90.15%]
YALE: Illumination and Expression Variations
YALE: top 1 [gt = 1, d = 1, RR = 92.75%]
YALE: top 5 [gt = 0, d = 2, RR = 85.57%]
BERN: Big Pose and Expression Variations
BERN: top 1 [gt = 5, d = 5, RR = 98.66%]
BERN: top 5 [gt = 5, d = 8, RR = 75.80%]
CALTECH: Small Pose, Expression, Illumination and Background Variations
CALTECH: top 1 [gt = 1, d = 2, RR = 98.23%]
CALTECH: top 5 [gt = 0, d = 2, RR = 95.64%]
IITK: Very Small Expression and Pose Variations
IITK: top 1 [gt = 5, d = 5, RR = 99.73%]
IITK: top 5 [gt = 4, d = 5, RR = 99.58%]
Comparative Analysis
ORL and YALE

Distance Measure | ORL (%) | YALE (%)
PCA | 63 | 50
HD | 46 | 66
PHD | 72.08 (f = 0.85) | 84 (f = 0.7)
M2HD | 75 | 80
SWHD | 82 | 82
SW2HD | 88 | 83
SEWHD | 88 | 85
SEW2HD | 91 | 89
H_pg | 91.25 | 83.3 (f = 0.55)
NUP | 99.75 (gt = 5, d = 11) | 92.73 (gt = 0, d = 1)

Table: Comparative study (recognition rate, %) on ORL and YALE when considering the top 1 best match
Comparative Analysis
BERN

Test Faces | PHD (f = 0.85) | LEM | H_pg | NUP (gt = 5, d = 5)
Looks right/left | 74.17 | 74.17 | 95.83 | 99.00
Looks up | 43.33 | 70.00 | 90.00 | 99.00
Looks down | 61.66 | 70.00 | 68.33 | 98.00
Average | 58.75 | 72.09 | 87.50 | 98.66

Table: Comparative study (recognition rate, %) on the BERN database when considering the top 1 best match
Overall Analysis
Overall

Top-n | ORL | YALE | CALTECH | BERN | IITK
1 | 99.75 | 92.72 | 98.23 | 98.66 | 99.73
2 | 98.63 | 89.70 | 98.08 | 89.33 | 99.73
3 | 97.10 | 88.11 | 97.25 | 83.77 | 99.66
4 | 94.87 | 86.51 | 96.40 | 79.41 | 99.63
5 | 90.15 | 85.57 | 95.64 | 75.80 | 99.58
6 | 86.13 | 83.23 | 94.46 | 71.33 | 99.55
7 | 82.10 | 79.74 | 93.27 | 66.57 | 99.41
8 | 78.50 | 73.11 | 92.42 | 62.12 | 99.14
9 | 74.01 | 67.20 | 91.30 | 57.70 | 98.05

Table: Overall analysis (considering top-n best matches)
Overall Analysis
Figure: NUP measure on different face databases, top n best matches.
Future Work
In a constrained environment that is uniformly well illuminated, the NUP measure could also be used for video surveillance, scene segmentation in videos, face detection and face authentication.

Fast First-Level Scanner
For recognition in complex varying environments with big images, it can also be used as a fast first-level scanner, working on under-sampled images and providing assistance to the higher levels.

It can also be extended to other biometric traits, such as iris and ear.
Conclusion
The proposed Normalized Unmatched Points (NUP) measure is different from existing Hausdorff-distance-based methods, as it works on gt-transformed images.
It is computationally inexpensive and provides good performance.
The parameters gt, d and p are set taking into account the illumination variation and the nature of the images.

Discriminative Power
It has shown tolerance to varying poses, expressions and illumination conditions, and can achieve a higher recognition rate than HD, PHD, MHD, M2HD, SWHD, SW2HD, SEWHD, SEW2HD, H_g and H_pg.
[1] A. Samal and P. A. Iyengar, "Automatic recognition and analysis of human faces and facial expressions: a survey," Pattern Recognition, vol. 25, no. 1, pp. 65-77, 1992.
[2] R. Chellappa, C. L. Wilson and S. Sirohey, "Human and machine recognition of faces: a survey," Proc. IEEE, vol. 83, no. 5, pp. 705-740, 1995.
[3] M. Turk and A. Pentland, "Eigenfaces for recognition," Journal of Cognitive Neuroscience, March 1991.
[4] L. Wiskott, J.-M. Fellous, N. Kuiger and C. von der Malsburg, "Face recognition by elastic bunch graph matching," IEEE Trans. Pattern Anal. Mach. Intell., vol. 19, pp. 775-779.
[5] S. Lawrence, C. L. Giles, A. C. Tsoi and A. D. Back, "Face recognition: a convolutional neural network approach," IEEE Trans. Neural Networks, vol. 8, pp. 98-113, 1997.
[6] Guodong Guo, Stan Z. Li and Kapluk Chan, "Face recognition by support vector machines," Proc. Fourth IEEE Int. Conf. on Automatic Face and Gesture Recognition, 2000, pp. 196-201.
[7] F. S. Samaria, "Face recognition using Hidden Markov Models," PhD thesis, Trinity College, University of Cambridge, Cambridge, 1994.
[8] D. P. Huttenlocher, G. A. Klanderman and W. A. Rucklidge, "Comparing images using the Hausdorff distance," IEEE Trans. Pattern Anal. Mach. Intell., vol. 15, no. 9, pp. 850-863, Sep. 1993.
[9] W. J. Rucklidge, "Locating objects using the Hausdorff distance," ICCV '95: Proc. 5th Int. Conf. on Computer Vision, Washington, D.C., June 1995, pp. 457-464.
[10] B. Takacs, "Comparing face images using the modified Hausdorff distance," Pattern Recognition, vol. 31, no. 12, pp. 1873-1881, 1998.
[11] B. Guo, K.-M. Lam, K.-H. Lin and W.-C. Siu, "Human face recognition based on spatially weighted Hausdorff distance," Pattern Recognition Letters, vol. 24, pp. 499-507, Jan. 2003.
[12] K.-H. Lin, K.-M. Lam and W.-C. Siu, "Spatially eigen-weighted Hausdorff distances for human face recognition," Pattern Recognition, vol. 36, pp. 1827-1834, Aug. 2003.
[13] E. P. Vivek and N. Sudha, "Gray Hausdorff distance measure for comparing face images," IEEE Trans. Inf. Forensics and Security, vol. 1, no. 3, Sep. 2006.
[14] N. Sudha and Y. Wong, "Hausdorff distance for iris recognition," Proc. 22nd IEEE Int. Symp. on Intelligent Control (ISIC 2007), pp. 614-619, Singapore, October 2007.
[15] M. Dubuisson and A. K. Jain, "A modified Hausdorff distance for object matching," Proc. 12th Int. Conf. on Pattern Recognition (ICPR), Jerusalem, Israel, 1994.
[16] Y. Gao and M. K. Leung, "Face recognition using line edge map," IEEE Trans. Pattern Anal. Mach. Intell., vol. 24, pp. 764-779, Jun. 2002.
[17] The ORL Database of Faces [Online]. Available: http://www.uk.research.att.com/facedatabase.html
[18] The Yale University Face Database [Online]. Available: http://cvc.yale.edu/projects/yalefaces/yalefaces.html
[19] The Bern University Face Database [Online]. Available: ftp://ftp.iam.unibe.ch/pub/images/faceimages/
[20] The Caltech University Face Database [Online]. Available: http://www.vision.caltech.edu/html-files/archive.html
[21] David A. Forsyth and Jean Ponce, Computer Vision: A Modern Approach, Pearson Education, 2003.
[22] M.-H. Yang, D. J. Kriegman and N. Ahuja, "Detecting faces in images: a survey," IEEE Trans. Pattern Anal. Mach. Intell., vol. 24, no. 1, pp. 34-58, 2002.
[23] S. Z. Li and A. K. Jain, Handbook of Face Recognition, Springer-Verlag, 2005.
[24] Yuankui Hu and Zengfu Wang, "A similarity measure based on Hausdorff distance for human face recognition," 18th Int. Conf. on Pattern Recognition (ICPR '06), IEEE, 2006.
[25] Gary Bradski and Adrian Kaehler, Learning OpenCV: Computer Vision with the OpenCV Library [Online]. Available: http://www.amazon.com/Learning-OpenCV-Computer-Vision-Library/dp/0596516134