Computation in a unit:

in_j = Σ_k w_kj I_k          (input function: linear)
a_j = output(in_j)           (activation function: nonlinear)

Computation: input signals → input function (linear) → activation function (nonlinear) → output signal.
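As an illustrative sketch (Python; not from the slides — the sigmoid used as the activation is just one possible nonlinear choice), a single unit computes its weighted input sum and then applies the activation function:

import math

def unit_output(weights, inputs, g):
    # Linear input function: in_j = sum_k w_kj * I_k
    in_j = sum(w * i for w, i in zip(weights, inputs))
    # Nonlinear activation function: a_j = g(in_j)
    return g(in_j)

sigmoid = lambda a: 1.0 / (1.0 + math.exp(-a))       # one common choice of g
print(unit_output([0.5, -0.3], [1.0, 2.0], sigmoid))  # a unit with two inputs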
Part 1. Perceptrons: Simple NN
[Figure: a perceptron — inputs x1, ..., xn, each multiplied by a weight w1, ..., wn, feed a single activation unit that produces the output y.]

a = Σ_{i=1..n} w_i x_i

y = 1 if a ≥ θ
    0 if a < θ

The x_i's range over [0, 1].
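Written out in code, the perceptron is a single threshold unit (a minimal sketch; the example weights and threshold are assumptions, not values from the slides):

def perceptron(x, w, theta):
    # y = 1 if sum_i w_i * x_i >= theta, else 0
    a = sum(w_i * x_i for w_i, x_i in zip(w, x))
    return 1 if a >= theta else 0

print(perceptron([1, 0], [0.7, 0.4], 0.5))   # -> 1, since a = 0.7 >= 0.5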
Decision Surface of a Perceptron
[Figure: points in the (x1, x2) plane labeled 1 and 0; the decision line w1 x1 + w2 x2 = θ separates the inputs for which the perceptron outputs 1 from those for which it outputs 0.]
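To see the line concretely (a sketch with assumed example values w1 = 1, w2 = 1, θ = 1.5): solving w1 x1 + w2 x2 = θ for x2 gives the boundary, and points on either side of it receive different outputs.

w1, w2, theta = 1.0, 1.0, 1.5        # assumed example values

def classify(x1, x2):
    return 1 if w1 * x1 + w2 * x2 >= theta else 0

def boundary_x2(x1):
    # Point on the decision line w1*x1 + w2*x2 = theta for a given x1
    return (theta - w1 * x1) / w2

print(boundary_x2(0.5))     # 1.0, so (0.5, 1.0) lies on the line
print(classify(0.5, 1.2))   # 1 (above the line)
print(classify(0.5, 0.8))   # 0 (below the line)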
Linear Separability
[Figure: two plots in the (x1, x2) plane. Left (logical AND, outputs 0 0 0 1): separable with w1 = 1, w2 = 1, θ = 1.5. Right (logical XOR, outputs 0 1 1 0): w1 = ?, w2 = ?, θ = ? — no single line separates the two classes.]
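The contrast can be checked directly (a sketch; the brute-force grid search is an illustrative assumption, not something from the slides): the weights above realize AND, while no threshold unit realizes XOR.

from itertools import product

def threshold_unit(x, w, theta):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0

AND = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

# w1 = 1, w2 = 1, theta = 1.5 computes AND on all four inputs:
print(all(threshold_unit(x, (1, 1), 1.5) == t for x, t in AND.items()))   # True

# A brute-force search over a small weight/threshold grid finds no unit for XOR:
grid = [i / 2 for i in range(-8, 9)]   # -4.0 ... 4.0 in steps of 0.5
found = any(
    all(threshold_unit(x, (a, b), th) == t for x, t in XOR.items())
    for a, b, th in product(grid, grid, grid)
)
print(found)   # False: XOR is not linearly separable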
The threshold can be absorbed into the weights: set θ = w0 and add an extra input fixed at x0 = -1.

[Figure: perceptron with inputs x0 = -1, x1, ..., xn and weights w0, w1, ..., wn feeding the output y.]

a = Σ_{i=0..n} w_i x_i

y = 1 if a ≥ 0
    0 if a < 0

Thus, y = sgn(a) = 0 or 1.
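A sketch of the same unit using the bias trick (the example numbers are assumptions): prepend x0 = -1, store θ as w0, and compare the sum against 0.

def perceptron_bias(x, w):
    # w[0] plays the role of the old threshold theta; x0 = -1 is prepended
    xb = [-1] + list(x)
    a = sum(wi * xi for wi, xi in zip(w, xb))
    return 1 if a >= 0 else 0

# The AND unit from above, with theta = 1.5 moved into w0:
print(perceptron_bias([1, 1], [1.5, 1, 1]))   # -> 1
print(perceptron_bias([1, 0], [1.5, 1, 1]))   # -> 0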
Training the Perceptron
Training set S of examples {(x, t)}, where x is an input vector and t is the desired target vector.
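This excerpt stops before the update rule, so the following is only a sketch of the standard perceptron learning rule, w_i ← w_i + η (t − y) x_i; the learning rate, epoch count, and the AND training data are assumptions chosen for illustration.

def train_perceptron(examples, n_inputs, eta=0.1, epochs=20):
    # Standard perceptron rule with the bias trick (x0 = -1, w0 = theta):
    #   w_i <- w_i + eta * (t - y) * x_i
    w = [0.0] * (n_inputs + 1)
    for _ in range(epochs):
        for x, t in examples:
            xb = [-1] + list(x)
            y = 1 if sum(wi * xi for wi, xi in zip(w, xb)) >= 0 else 0
            w = [wi + eta * (t - y) * xi for wi, xi in zip(w, xb)]
    return w

# Logical AND is linearly separable, so the rule converges to a working unit:
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
print(train_perceptron(data, n_inputs=2))   # learned weights [w0, w1, w2]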
[Figure: a multi-layer network — an input vector feeds the input nodes, which connect to hidden nodes and then to output nodes.]

A multi-layer network can be used to learn nonlinear functions (see the XOR sketch after the truth table below).
How to set the weights? For XOR no single perceptron works (w1 = ?, w2 = ?, θ = ?).

[Figure: a small two-layer network for XOR — the inputs feed hidden units 3 and 4, which feed output unit 5; weights are labeled w_ij, e.g. w23 and w35.]
Logical XOR

x1  x2 | y
 0   0 | 0
 0   1 | 1
 1   0 | 1
 1   1 | 0
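A single perceptron cannot produce this table, but the multi-layer network sketched earlier can. A minimal Python sketch; these particular weight values are assumptions chosen for illustration (the slides show the network structure, not the weights), with the hidden units acting as OR and AND detectors.

def step(a):
    return 1 if a >= 0 else 0

def xor_net(x1, x2):
    # Hidden unit 3 acts as OR, hidden unit 4 as AND (assumed weights):
    h3 = step(1.0 * x1 + 1.0 * x2 - 0.5)    # OR:  x1 + x2 >= 0.5
    h4 = step(1.0 * x1 + 1.0 * x2 - 1.5)    # AND: x1 + x2 >= 1.5
    # Output unit 5: "OR and not AND" = XOR
    return step(1.0 * h3 - 2.0 * h4 - 0.5)

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, xor_net(x1, x2))   # reproduces the table: 0, 1, 1, 0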
Examples