ML 03
Perceptron and Multilayer Perceptron
2. Delta Rules
• For continuous activation functions
• The aim of the delta rule is to minimize the error over all training patterns
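As a bridge to the formulas that follow (standard notation, not taken verbatim from this slide): for a training pattern with inputs $x_i$, target $t$ and actual output $y$, the delta rule minimizes the squared error $E = \tfrac{1}{2}(t - y)^2$ by moving each weight down the error gradient, $\Delta w_i = \alpha\,(t - y)\,f'(net)\,x_i$, where $\alpha$ is the learning rate and $f'$ is the derivative of the (continuous) activation function. With the sigmoid, $f'(net) = y(1 - y)$, which is exactly the $O(1 - O)$ factor that appears in the backpropagation formulas later.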
Perceptron Training Algorithm
1. Set initial weights w1, w2, …, wn and threshold θ to random numbers, normally in [-0.5, 0.5].
2. Calculate the output:
   $X = \sum_{i=1}^{n} w_i x_i$
   $Y = 1$ if $X \ge \theta$, otherwise $Y = 0$ (step function)
[Figure: single-layer perceptron with inputs $x_1, x_2$, weights $W_1 = 0.3$ and $W_2 = -0.1$, threshold $\theta = 0.2$, a step activation function, and the target outputs for the four input patterns (the logical AND truth table: inputs 1 1 0 0 and 1 0 1 0, targets 1 0 0 0).]
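A minimal Python sketch of this training loop for the logical AND example in the figure. Only steps 1-2 appear on this slide, so the error computation, the weight-update rule Δw_i = α·x_i·(target − Y), and the learning rate of 0.1 are the standard perceptron learning rule, assumed here for illustration:

```python
def step(x, theta):
    """Step activation: 1 if the weighted sum reaches the threshold, else 0."""
    return 1 if x >= theta else 0

# Logical AND training patterns: (x1, x2) -> target
patterns = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

theta = 0.2          # threshold from the figure
alpha = 0.1          # assumed learning rate
w = [0.3, -0.1]      # initial weights from the figure
                     # (step 1 would normally draw them at random from [-0.5, 0.5])

for epoch in range(100):
    errors = 0
    for (x1, x2), target in patterns:
        x = w[0] * x1 + w[1] * x2      # step 2: weighted sum X
        y = step(x, theta)             # step 2: apply the step function
        e = target - y                 # error for this pattern
        if e != 0:
            w[0] += alpha * x1 * e     # perceptron weight update
            w[1] += alpha * x2 * e
            errors += 1
    if errors == 0:                    # stop once every pattern is classified correctly
        break

print("learned weights:", w, "after", epoch + 1, "epochs")
```

Starting from the figure's weights, this loop settles on weights that reproduce the AND truth table after a handful of epochs.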
Perceptron Assignment:
[Figure: multilayer perceptron with an input layer ($x_i$ … $x_n$), a hidden layer of m neurons connected by weights $w_{ij}$, and an output layer connected by weights $w_{jk}$ producing outputs $y_k$ … $y_l$; error signals propagate backwards from the output layer.]
Activation Function
Sigmoid activation function: $f(x) = \dfrac{1}{1 + e^{-x}}$
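A small Python helper for this activation; the derivative written in terms of the output, f'(x) = f(x)(1 − f(x)), is the O(1 − O) factor used by the backpropagation formulas later:

```python
import math

def sigmoid(x):
    """Sigmoid activation: f(x) = 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(output):
    """Derivative expressed via the output: f'(x) = f(x) * (1 - f(x))."""
    return output * (1.0 - output)

print(sigmoid(0.0))    # 0.5
print(sigmoid(0.59))   # about 0.643
```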
Delta Rules
Often utilized by backpropagation: the network sees how far its answer was from the desired one and makes an appropriate adjustment to its connection weights.
[Figure: a network comparing its actual output with the desired output for a given input; the difference drives the weight adjustment.]
Delta Rules
Backpropagation performs a gradient descent within the solution's vector space (the weight space) towards a global minimum, while trying to avoid local minima.
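In gradient-descent terms (a standard formulation, not spelled out on the slide), each weight is moved a small step down the error surface: $\Delta w = -\alpha\,\dfrac{\partial E}{\partial w}$, where $E$ is the network error and $\alpha$ the learning rate. The backpropagation steps below compute exactly these gradients, layer by layer.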
Backpropagation Algorithm
1. Calculate the outputs of all neurons in the hidden layer:
   $x = \sum_{i=1}^{n} x_i w_i + \text{bias}$
   $O_j = f(x) = \dfrac{1}{1 + e^{-x}}$
2. Calculate the outputs of all neurons in the output layer in the same way:
   $O_k = f(x) = \dfrac{1}{1 + e^{-x}}$
Backpropagation Algorithm
3. Calculate the error gradient for the output layer:
   $\delta_k = O_k (1 - O_k)(t - O_k)$
4. Update the hidden-to-output weights:
   $\Delta w_{jk} = \alpha\, O_j\, \delta_k$
   $w_{jk}(t+1) = w_{jk}(t) + \Delta w_{jk}$
5. Calculate the error gradient for the hidden layer and update the input-to-hidden weights:
   $\delta_j = O_j (1 - O_j) \sum_k \delta_k\, w_{jk}$
   $\Delta w_{ij} = \alpha\, x_i\, \delta_j$
   $w_{ij}(t+1) = w_{ij}(t) + \Delta w_{ij}$
where i = input neuron, j = hidden neuron, k = output neuron, and α is the learning rate.
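A compact Python sketch of one training iteration built directly from these formulas, for a network with a single output neuron (the function name, the list-based weight layout, and the omission of bias terms are simplifications for illustration, not part of the slide):

```python
import math

def sigmoid(x):
    """Sigmoid activation f(x) = 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def train_step(x, w_ih, w_ho, target, alpha):
    """One backpropagation iteration for a network with one output neuron.

    x     -- list of input values, indexed by i
    w_ih  -- input-to-hidden weights, w_ih[i][j]
    w_ho  -- hidden-to-output weights, w_ho[j]
    """
    n_in, n_hid = len(x), len(w_ho)
    # Step 1: hidden outputs  O_j = f(sum_i x_i * w_ij)   (bias omitted here)
    o_h = [sigmoid(sum(x[i] * w_ih[i][j] for i in range(n_in)))
           for j in range(n_hid)]
    # Step 2: output neuron   O_k = f(sum_j O_j * w_jk)
    o_k = sigmoid(sum(o_h[j] * w_ho[j] for j in range(n_hid)))
    # Step 3: output error gradient  delta_k = O_k (1 - O_k)(t - O_k)
    d_k = o_k * (1.0 - o_k) * (target - o_k)
    # Step 5 (gradient part): hidden gradients, using the OLD hidden-to-output weights
    d_h = [o_h[j] * (1.0 - o_h[j]) * d_k * w_ho[j] for j in range(n_hid)]
    # Step 4: update hidden-to-output weights  w_jk += alpha * O_j * delta_k
    w_ho = [w_ho[j] + alpha * o_h[j] * d_k for j in range(n_hid)]
    # Step 5 (update part): input-to-hidden weights  w_ij += alpha * x_i * delta_j
    w_ih = [[w_ih[i][j] + alpha * x[i] * d_h[j] for j in range(n_hid)]
            for i in range(n_in)]
    return w_ih, w_ho, o_k
```

Calling this step with the numbers from the example on the next slides reproduces the updated weights shown there.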
Example:
Input (X)    Weight (Input to Hidden)    Weight (Hidden to Output)
X1 = 0.8     X1 to H1 = 0.3              H1 to O1 = 0.6
X2 = 0.5     X1 to H2 = 0.4              H2 to O1 = 0.9
             X2 to H1 = 0.7
             X2 to H2 = 0.9
Learning rate α = 0.6
Target output = 1.0
[Figure: the 2-2-1 network for this example: inputs X1 = 0.8 and X2 = 0.5, hidden neurons H1 and H2, output neuron O1 with target 1.0, and the weights listed in the table above.]
Example
Applying steps 1-5 for one training iteration (starting by calculating the outputs of all neurons in the hidden layer) gives the updated weights shown below.
[Figure: the same network after one iteration, with updated weights X1 to H1 = 0.3035, X1 to H2 = 0.4049, X2 to H1 = 0.7021, X2 to H2 = 0.9031, H1 to O1 = 0.6204, H2 to O1 = 0.9217; target = 1.0.]
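As a check, a self-contained Python script that plugs the example's numbers into the formulas above (no bias terms, matching the figure). It reproduces the updated weights to four decimal places; only the X2-to-H1 weight comes out as 0.7022 versus the slide's rounded 0.7021:

```python
import math

sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))

# Example data from the slide
x1, x2 = 0.8, 0.5
w_x1h1, w_x1h2, w_x2h1, w_x2h2 = 0.3, 0.4, 0.7, 0.9   # input-to-hidden weights
w_h1o1, w_h2o1 = 0.6, 0.9                              # hidden-to-output weights
alpha, target = 0.6, 1.0

# Steps 1-2: forward pass
o_h1 = sigmoid(x1 * w_x1h1 + x2 * w_x2h1)              # ~0.6434
o_h2 = sigmoid(x1 * w_x1h2 + x2 * w_x2h2)              # ~0.6835
o_o1 = sigmoid(o_h1 * w_h1o1 + o_h2 * w_h2o1)          # ~0.7313

# Step 3: output error gradient
d_o1 = o_o1 * (1 - o_o1) * (target - o_o1)             # ~0.0528

# Step 5 gradients (computed with the old hidden-to-output weights)
d_h1 = o_h1 * (1 - o_h1) * d_o1 * w_h1o1
d_h2 = o_h2 * (1 - o_h2) * d_o1 * w_h2o1

# Steps 4-5: weight updates
print(round(w_h1o1 + alpha * o_h1 * d_o1, 4))  # 0.6204
print(round(w_h2o1 + alpha * o_h2 * d_o1, 4))  # 0.9217
print(round(w_x1h1 + alpha * x1 * d_h1, 4))    # 0.3035
print(round(w_x2h1 + alpha * x2 * d_h1, 4))    # 0.7022
print(round(w_x1h2 + alpha * x1 * d_h2, 4))    # 0.4049
print(round(w_x2h2 + alpha * x2 * d_h2, 4))    # 0.9031
```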
Example: XOR Problem
[Figure: three-layer network for the XOR problem: inputs x1 (neuron 1) and x2 (neuron 2) in the input layer, hidden neurons 3 and 4, and output neuron 5 producing y5, connected by weights w13, w14, w23, w24, w35, w45 with thresholds θ3, θ4, θ5.]
The initial weights and threshold levels are set randomly as follows:
w13 = 0.5, w14 = 0.9, w23 = 0.4, w24 = 1.0, w35 = -1.2, w45 = 1.1
θ3 = 0.8, θ4 = -0.1 and θ5 = 0.3.
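A sketch of training this 2-2-1 network on XOR in Python, assuming the convention that the threshold is subtracted inside the activation (y = sigmoid(Σ w·x − θ)) and is adjusted like a weight on a constant input of −1; the learning rate of 0.1 and the stopping criterion are assumptions, not stated on the slide:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Initial weights and thresholds from the slide
w13, w14, w23, w24 = 0.5, 0.9, 0.4, 1.0
w35, w45 = -1.2, 1.1
t3, t4, t5 = 0.8, -0.1, 0.3
alpha = 0.1                      # assumed learning rate

xor_patterns = [((1, 1), 0), ((0, 1), 1), ((1, 0), 1), ((0, 0), 0)]

for epoch in range(10000):
    sse = 0.0                    # sum-squared error over the epoch
    for (x1, x2), target in xor_patterns:
        # Forward pass: the threshold acts as a subtractive bias
        y3 = sigmoid(x1 * w13 + x2 * w23 - t3)
        y4 = sigmoid(x1 * w14 + x2 * w24 - t4)
        y5 = sigmoid(y3 * w35 + y4 * w45 - t5)
        e = target - y5
        sse += e * e
        # Error gradients (output layer, then hidden layer with the old weights)
        d5 = y5 * (1 - y5) * e
        d3 = y3 * (1 - y3) * d5 * w35
        d4 = y4 * (1 - y4) * d5 * w45
        # Weight and threshold updates (threshold treated as a weight on input -1)
        w35 += alpha * y3 * d5
        w45 += alpha * y4 * d5
        t5 -= alpha * d5
        w13 += alpha * x1 * d3
        w23 += alpha * x2 * d3
        t3 -= alpha * d3
        w14 += alpha * x1 * d4
        w24 += alpha * x2 * d4
        t4 -= alpha * d4
    if sse < 0.001:              # assumed stopping criterion
        break

print("trained in", epoch + 1, "epochs, final sum-squared error:", sse)
```

Running such a loop produces a learning curve like the one on the next slide, with the sum-squared error falling over the training epochs.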
Example: XOR Problem
Sum-Squared Network Error for 224 Epochs
[Figure: learning curve; Sum-Squared Error on a logarithmic scale (from 10^1 down to 10^-4) versus Epoch (0 to 200+); the error decreases steadily over training.]
Example
[Figure: an MLP with adjustable weights applied to patient data; inputs: Male = 1, Age = 20, Temp = 37, WBC = 10, Pain Intensity = 1, Pain Duration = 1.]
Lab Exercise 2
Data visualization
Lab Exercise 2
Classification using MLP
Lab Exercise 3
Implementation of MLP using C/C++/Python
MLP Assignment 1:
1. Hopfield Network
2. Kohonen Network
3. Self-Organizing Map (SOM)
And others….