

0907542 Pattern Recognition (Spring 2010)

Midterm Exam
Name: ________________    Registration No.: ____________    Section No.: ________
==================================================================================
Instructions: Time: 60 min. Closed books & notes. No calculators or mobile phones. No questions are allowed. Show
your work clearly. Each problem is worth 10 marks.
==================================================================================
Q1. Consider a two-class two-dimensional classification task, where the feature vectors in each of the classes ω1 and
ω2 are distributed according to
p(\mathbf{x} \mid \omega_1) = \frac{1}{2\pi\sigma_1^2} \exp\left( -\frac{1}{2\sigma_1^2} (\mathbf{x}-\boldsymbol{\mu}_1)^T (\mathbf{x}-\boldsymbol{\mu}_1) \right)

p(\mathbf{x} \mid \omega_2) = \frac{1}{2\pi\sigma_2^2} \exp\left( -\frac{1}{2\sigma_2^2} (\mathbf{x}-\boldsymbol{\mu}_2)^T (\mathbf{x}-\boldsymbol{\mu}_2) \right)

with

\boldsymbol{\mu}_1 = (1, 1)^T, \quad \boldsymbol{\mu}_2 = (1.5, 1.5)^T, \quad \sigma_1^2 = \sigma_2^2 = 0.2
Assume that P(\omega_1) = P(\omega_2) and design a Bayesian classifier that minimizes the error probability.
Solution:
Decide \omega_1 if p(\mathbf{x} \mid \omega_1) P(\omega_1) > p(\mathbf{x} \mid \omega_2) P(\omega_2), else \omega_2. Since P(\omega_1) = P(\omega_2):

p(\mathbf{x} \mid \omega_1) \gtrless p(\mathbf{x} \mid \omega_2)

\frac{1}{2\pi\sigma_1^2} \exp\left( -\frac{1}{2\sigma_1^2} (\mathbf{x}-\boldsymbol{\mu}_1)^T (\mathbf{x}-\boldsymbol{\mu}_1) \right) \gtrless \frac{1}{2\pi\sigma_2^2} \exp\left( -\frac{1}{2\sigma_2^2} (\mathbf{x}-\boldsymbol{\mu}_2)^T (\mathbf{x}-\boldsymbol{\mu}_2) \right)

Since \sigma_1^2 = \sigma_2^2, the leading factors cancel; taking logarithms and multiplying by -2\sigma^2 (which reverses the inequality):

(\mathbf{x}-\boldsymbol{\mu}_1)^T (\mathbf{x}-\boldsymbol{\mu}_1) \lessgtr (\mathbf{x}-\boldsymbol{\mu}_2)^T (\mathbf{x}-\boldsymbol{\mu}_2)

(x_1 - 1)^2 + (x_2 - 1)^2 \lessgtr (x_1 - 1.5)^2 + (x_2 - 1.5)^2

-2x_1 + 1 - 2x_2 + 1 \lessgtr -3x_1 + 2.25 - 3x_2 + 2.25

x_1 + x_2 \lessgtr 2.5

Decide \omega_1 if x_1 + x_2 < 2.5 and \omega_2 otherwise; the decision boundary is the line x_1 + x_2 = 2.5.
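With equal priors and equal spherical covariances, the rule above reduces to assigning each point to the nearer class mean. A minimal Python sketch of this nearest-mean classifier (NumPy assumed; not part of the original exam):

```python
import numpy as np

# Class means from Q1. With P(w1) = P(w2) and equal variances, the
# minimum-error Bayes rule is a nearest-mean (minimum-distance) classifier.
mu1 = np.array([1.0, 1.0])
mu2 = np.array([1.5, 1.5])

def classify(x):
    """Return 1 or 2: the class whose mean is nearer in squared Euclidean distance."""
    x = np.asarray(x, dtype=float)
    d1 = np.sum((x - mu1) ** 2)
    d2 = np.sum((x - mu2) ** 2)
    return 1 if d1 < d2 else 2

# Equivalent to testing against the line x1 + x2 = 2.5:
print(classify([1.0, 1.0]))  # x1 + x2 = 2.0 < 2.5 -> 1
print(classify([2.0, 1.0]))  # x1 + x2 = 3.0 > 2.5 -> 2
```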

Q2. Consider a case in which class \omega_1 consists of the two feature vectors (0, 0)^T and (0, 1)^T, and class \omega_2 of (1, 0)^T
and (1, 1)^T. Use the perceptron algorithm in the form shown below, with \rho_t = 0.7 and \mathbf{w}(0) = (-0.4, 1, 1)^T,
to design the line separating the two classes. Draw the samples and the resulting classification line.
\mathbf{w}(t+1) = \mathbf{w}(t) - \rho_t \sum_{\mathbf{x} \in Y} \delta_x \mathbf{x}
Solution:
Work with augmented samples \mathbf{x} = (x_1, x_2, 1)^T: the \omega_1 samples are (0, 0, 1)^T and (0, 1, 1)^T and the \omega_2 samples are (1, 0, 1)^T and (1, 1, 1)^T, with \delta_x = -1 for \omega_1 and \delta_x = +1 for \omega_2; Y is the set of currently misclassified samples.

Under \mathbf{w}(0) = (-0.4, 1, 1)^T, the \omega_2 samples (1, 0, 1)^T and (1, 1, 1)^T are misclassified:

\mathbf{w}(1) = (-0.4, 1, 1)^T - 0.7(1, 0, 1)^T - 0.7(1, 1, 1)^T = (-1.8, 0.3, -0.4)^T

Under \mathbf{w}(1), the \omega_1 samples (0, 0, 1)^T and (0, 1, 1)^T are misclassified:

\mathbf{w}(2) = (-1.8, 0.3, -0.4)^T + 0.7(0, 0, 1)^T + 0.7(0, 1, 1)^T = (-1.8, 1, 1)^T

Under \mathbf{w}(2), only the \omega_2 sample (1, 1, 1)^T is misclassified:

\mathbf{w}(3) = (-1.8, 1, 1)^T - 0.7(1, 1, 1)^T = (-2.5, 0.3, 0.3)^T

\mathbf{w}(3) classifies all four samples correctly, so the algorithm stops. The separating line \mathbf{w}(3)^T \mathbf{x} = 0 is

-2.5 x_1 + 0.3 x_2 + 0.3 = 0
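The iterations above can be reproduced with a short batch-perceptron loop; a minimal sketch (NumPy assumed; not part of the original exam):

```python
import numpy as np

# Augmented samples (x1, x2, 1); delta_x = -1 for class w1 and +1 for class w2,
# matching the sign convention of the update rule in Q2.
X = np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 1.0],   # class w1
              [1.0, 0.0, 1.0], [1.0, 1.0, 1.0]])  # class w2
delta = np.array([-1.0, -1.0, 1.0, 1.0])

rho = 0.7
w = np.array([-0.4, 1.0, 1.0])  # w(0)

for t in range(100):
    # Y: samples misclassified by the current w, i.e. delta_x * w^T x >= 0.
    mis = delta * (X @ w) >= 0
    if not mis.any():
        break  # all samples classified correctly
    # Batch update: w <- w - rho * sum over misclassified x of delta_x * x
    w = w - rho * (delta[mis, None] * X[mis]).sum(axis=0)

print(w)  # converges to approximately (-2.5, 0.3, 0.3)
```

Each pass of the loop corresponds to one step w(1), w(2), w(3) of the hand computation above.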

Q3. You are considering building a classifier using two of the three features shown in the table below. Given the nine
samples shown in this table, select two features using the scatter criterion method.
Sample x1 x2 x3 Class
1 5.5 4 31 1
2 5 4 35 1
3 6 1 33 1
4 3 0 27 2
5 1 0 25 2
6 2 3 23 2
7 7.5 5 40 3
8 8 5 50 3
9 7 5 60 3

Solution:
For each feature, compute the class means, the squared deviation of each sample from its class mean, the within-class scatter S_w, and the between-class scatter S_b:

Sample  Class | Class mean (x1, x2, x3) | Squared deviation (x1, x2, x3)
1       1     | 5.5   3   33            | 0      1   4
2       1     | 5.5   3   33            | 0.25   1   4
3       1     | 5.5   3   33            | 0.25   4   0
4       2     | 2     1   25            | 1      1   4
5       2     | 2     1   25            | 1      1   0
6       2     | 2     1   25            | 0      4   4
7       3     | 7.5   5   50            | 0      0   100
8       3     | 7.5   5   50            | 0.25   0   0
9       3     | 7.5   5   50            | 0.25   0   100

Global mean: x1 = 5, x2 = 3, x3 = 36

Within-class scatter per feature (mean squared deviation within each class, summed over classes):
          x1      x2    x3
Class 1:  0.167   2     2.667
Class 2:  0.667   2     2.667
Class 3:  0.167   0     66.67
S_w:      1       4     72

Between-class scatter per feature (squared deviation of each class mean from the global mean):
          x1      x2    x3
Class 1:  0.25    0     9
Class 2:  9       4     121
Class 3:  6.25    4     196
S_b:      15.5    8     326

Criterion J = (S_w + S_b) / S_w:
          x1: 16.5    x2: 3    x3: 5.528

Select x1 and x3, the two features with the largest values of J.
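The tabulated computation can be reproduced programmatically. A sketch of the per-feature scatter criterion, following the table's normalization (mean squared deviation within each class; NumPy assumed; not part of the original exam):

```python
import numpy as np

# The nine samples from Q3; columns are the features x1, x2, x3.
X = np.array([[5.5, 4, 31], [5, 4, 35], [6, 1, 33],
              [3, 0, 27], [1, 0, 25], [2, 3, 23],
              [7.5, 5, 40], [8, 5, 50], [7, 5, 60]])
y = np.array([1, 1, 1, 2, 2, 2, 3, 3, 3])
classes = np.unique(y)

# Within-class scatter per feature: mean squared deviation from the class
# mean, summed over classes (matching the table's division by 3).
Sw = sum(((X[y == c] - X[y == c].mean(axis=0)) ** 2).mean(axis=0) for c in classes)

# Between-class scatter per feature: squared deviation of each class mean
# from the global mean, summed over classes.
Sb = sum((X[y == c].mean(axis=0) - X.mean(axis=0)) ** 2 for c in classes)

J = (Sw + Sb) / Sw  # larger J means better class separation
print(np.round(J, 3))  # x1 and x3 give the two largest values -> select x1 and x3
```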

<Good Luck>

