
CSE 5526 - Autumn 2013
Introduction to Neural Networks

Homework #2
Due Tuesday, Sept. 17

Grader: Kun Han
Office: 418 Caldwell, 688-3711
Office hours: 1:30-2:30 T & W
Email: hank@cse.ohio-state.edu (preferred)

Problem 1. (a) For the following training samples:

x1 = (0, 0)^T ∈ C1
x2 = (0, 1)^T ∈ C1
x3 = (1, 0)^T ∈ C2
x4 = (1, 1)^T ∈ C2

Plot them in input space. Apply the perceptron learning rule to the above samples one at a time to obtain weights that separate the training samples. Set the learning rate η to 0.5. Work in the input space augmented with the bias as another input element. Use w(0) = (0, 0, 0)^T. Write the expression for the resulting decision boundary.
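The procedure can be sketched in a few lines of code. The following is a minimal sketch, assuming targets of +1 for C1 and -1 for C2, a sign activation with sign(0) = +1, and the update w ← w + η(d − y)x (one common form of the rule; follow the convention used in lecture). Rerunning the same loop with the labeling from part (b) illustrates the behavior that part asks about.

import numpy as np

# Minimal sketch of the perceptron learning rule, assuming targets d = +1
# for C1 and d = -1 for C2, sign(0) = +1, and the update
# w <- w + eta*(d - y)*x (one common convention). Inputs are augmented
# with a leading 1 so the bias is simply w[0].

def train_perceptron(X, D, eta=0.5, max_epochs=100):
    w = np.zeros(X.shape[1])                  # w(0) = (0, 0, 0)^T
    for _ in range(max_epochs):
        converged = True
        for x, d in zip(X, D):
            y = 1.0 if w @ x >= 0 else -1.0   # sign activation
            if y != d:                        # update only on a mistake
                w = w + eta * (d - y) * x
                converged = False
        if converged:                         # a full pass with no mistakes
            break
    return w

# Samples from part (a), each augmented with a bias input of 1.
X = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]], dtype=float)
D = np.array([1, 1, -1, -1], dtype=float)     # C1 -> +1, C2 -> -1
print(train_perceptron(X, D))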
(b) XOR. For x2, x3 ∈ C1 and x1, x4 ∈ C2, describe your observation when you apply the perceptron learning rule following the same procedure as in (a).


Problem 2. The following figure shows the decision regions of four classes. Design a classifier for these linearly inseparable classes, using a network of M-P neurons with three output units. For class i (1 ≤ i ≤ 3), classification requires that yi = 1, while yj = -1 for j ≠ i; Class 4 is recognized when yi = -1 for 1 ≤ i ≤ 3. (HINT: try a two-layer feedforward network.)

[Figure: decision regions for Classes 1-4 in the (x1, x2) input plane. Tick marks shown: 0, 2, 5, 8, -4 on one axis and 2, 4, 5 on the other.]
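To make the hint concrete, here is a minimal structural sketch of such a network in code. The layer sizes and all weights and thresholds below are arbitrary placeholders, not the solution; the actual values must be read off the boundary lines in the figure.

import numpy as np

# Structural sketch of a two-layer feedforward network of M-P style units.
# Each unit computes sign(w^T x + b) with outputs in {+1, -1}. All numeric
# parameters below are arbitrary placeholders, NOT the intended solution.

def mp_layer(W, b, x):
    """One layer of M-P units: elementwise sign of an affine map."""
    return np.where(W @ x + b >= 0, 1.0, -1.0)

def classify(x, W1, b1, W2, b2):
    h = mp_layer(W1, b1, x)           # hidden units: one per boundary line
    y = mp_layer(W2, b2, h)           # output units y1, y2, y3
    if np.all(y == -1):
        return 4                      # Class 4: every output is -1
    return int(np.argmax(y)) + 1      # otherwise the unit with yi = 1

# Arbitrary placeholder parameters (2 inputs -> 3 hidden -> 3 outputs).
W1 = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b1 = np.array([-2.0, -4.0, -9.0])
W2 = -np.eye(3)
b2 = np.zeros(3)
print(classify(np.array([1.0, 1.0]), W1, b1, W2, b2))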


Problem 3. Given the following input points and corresponding desired outputs:

X ={-0.5, -0.2, -0.1, 0.3, 0.4, 0.5, 0.7}
D ={-1, 1, 2, 3.2, 3.5, 5, 6}

write down the cost function with respect to w (setting the bias to zero). Compute the gradient at the point w = 2 using both direct differentiation and the LMS approximation (averaging over all data samples in both cases), and see whether they agree.
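A short script is a quick way to check that the two computations agree. The sketch below assumes the cost E(w) = (1/(2N)) Σi (di − w·xi)^2; the 1/2 and 1/N factors are a common convention, so adjust them to match the definition used in the course.

import numpy as np

# Sanity check for Problem 3, assuming the cost
#   E(w) = (1/(2N)) * sum_i (d_i - w*x_i)^2
# (the 1/2 and 1/N factors are a convention; match the course's definition).
# Direct differentiation gives dE/dw = -(1/N) * sum_i (d_i - w*x_i)*x_i;
# the LMS estimate for sample i is -(d_i - w*x_i)*x_i, averaged below.

x = np.array([-0.5, -0.2, -0.1, 0.3, 0.4, 0.5, 0.7])
d = np.array([-1.0, 1.0, 2.0, 3.2, 3.5, 5.0, 6.0])
w = 2.0

err = d - w * x                  # per-sample errors d_i - w*x_i
grad_direct = -np.mean(err * x)  # closed-form derivative at w = 2
grad_lms = np.mean([-(di - w * xi) * xi for xi, di in zip(x, d)])

print(grad_direct, grad_lms)     # the two values should coincide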
