Neural Networks: MATLAB
Contents
1. nn02_neuron_output - Calculate the output of a simple neuron
2. nn02_custom_nn - Create and view custom neural networks
3. nn03_perceptron - Classification of linearly separable data with a perceptron
4. nn03_perceptron_network - Classification of a 4-class problem with a 2-neuron perceptron
5. nn03_adaline - ADALINE time series prediction with adaptive linear filter
6. nn04_mlp_xor - Classification of an XOR problem with a multilayer perceptron
7. nn04_mlp_4classes - Classification of a 4-class problem with a multilayer perceptron
8. nn04_technical_diagnostic - Industrial diagnostic of compressor connection rod defects [data2.zip]
9. nn05_narnet - Prediction of chaotic time series with NAR neural network
10. nn06_rbfn_func - Radial basis function networks for function approximation
11. nn06_rbfn_xor - Radial basis function networks for classification of XOR problem
12. nn07_som - 1D and 2D Self-Organizing Map
13. nn08_tech_diag_pca - PCA for industrial diagnostic of compressor connection rod defects [data2.zip]
Neuron output
Neural Networks course (practical examples) 2012 Primoz Potocnik
PROBLEM DESCRIPTION: Calculate the output of a simple neuron
Contents
Neuron weights
w = [4 -2]
Neuron bias
b = -3
Activation function
func = 'tansig'
% func = 'purelin'
% func = 'hardlim'
% func = 'logsig'
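The step that actually evaluates the neuron is missing from this extraction; judging from the listed output below, it was along these lines (the input p is inferred from the activation potential: 4*2 - 2*3 - 3 = -1):

p = [2 3]';                                        % input vector (inferred)
% calculate neuron output
activation_potential = w*p + b                     % weighted sum plus bias
neuron_output = feval(func,activation_potential)   % tansig(-1) = -0.7616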
w =
     4    -2
b =
    -3
func =
tansig
p =
     2
     3
activation_potential =
    -1
neuron_output =
   -0.7616
Custom networks
Neural Networks course (practical examples) 2012 Primoz Potocnik
PROBLEM DESCRIPTION: Create and view custom neural networks
Contents
Configure network
inputs =
1
2
3
4
5
6
outputs =
1
2
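The construction of net was lost in extraction; before the configure call below there was presumably something like this custom two-layer definition (every architecture choice here is an assumption, not necessarily the author's):

net = network;                          % empty custom network object
net.numInputs = 1;                      % one input port
net.numLayers = 2;                      % two layers
net.biasConnect = [1; 1];               % biases on both layers
net.inputConnect = [1; 0];              % input feeds layer 1
net.layerConnect = [0 0; 1 0];          % layer 1 feeds layer 2
net.outputConnect = [0 1];              % layer 2 is the network output
net.layers{1}.transferFcn = 'tansig';   % assumed hidden transfer function
net.layers{2}.transferFcn = 'purelin';  % linear output layer
net.initFcn = 'initlay';                % layer-wise initialization
net.trainFcn = 'trainlm';               % Levenberg-Marquardt training
net.performFcn = 'mse';                 % mean squared error performance

The training step that turns initial_output into final_output (something like net = train(net,inputs,outputs)) was likewise lost.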
Configure network
net = configure(net,inputs,outputs);
view(net);
initial_output =
0
0
final_output =
1.0000
2.0000
Contents
Define data
Create a perceptron
Train a perceptron
Define data
close all, clear all, clc, format compact
% number of samples of each class
K = 30;
% define classes
q = .6; % offset of classes
A = [rand(1,K)-q; rand(1,K)+q];
B = [rand(1,K)+q; rand(1,K)+q];
C = [rand(1,K)+q; rand(1,K)-q];
D = [rand(1,K)-q; rand(1,K)-q];
% plot classes
plot(A(1,:),A(2,:),'bs')
hold on
grid on
plot(B(1,:),B(2,:),'r+')
plot(C(1,:),C(2,:),'go')
plot(D(1,:),D(2,:),'m*')
% text labels for classes
text(.5-q,.5+2*q,'Class A')
text(.5+q,.5+2*q,'Class B')
text(.5+q,.5-2*q,'Class C')
text(.5-q,.5-2*q,'Class D')
%
% coding (0/1) of 4 separate classes
a = [0 1]';
b = [1 1]';
c = [1 0]';
d = [0 0]';
%
% Why this coding doesn't work?
% a = [0 0]';
% b = [1 1]';
% d = [0 1]';
% c = [1 0]';
% (hint: the second bit groups the diagonal clusters b and d,
%  an XOR pattern no single neuron can separate)
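The assembly of the training set was lost from this extraction; it was presumably along these lines (a sketch; the repmat construction is assumed):

% define inputs and targets
P = [A B C D];                                      % input points
T = [repmat(a,1,length(A)) repmat(b,1,length(B)) ...
     repmat(c,1,length(C)) repmat(d,1,length(D))];  % 2-bit target codes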
Create a perceptron
net = perceptron;
Train a perceptron
ADAPT returns a new network object that performs as a better classifier, the network output, and the error. This loop adapts the network one pass at a time, plots the classification line, and continues until the error is zero or 1000 passes have been made.
E = 1;
net.adaptParam.passes = 1;
linehandle = plotpc(net.IW{1},net.b{1});
n = 0;
while (sse(E) & n<1000)
n = n+1;
[net,Y,E] = adapt(net,P,T);
linehandle = plotpc(net.IW{1},net.b{1},linehandle);
drawnow;
end
% show perceptron structure
view(net);
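The test of the trained network on a new point fell between pages; the listed output suggests code along these lines (a sketch):

p = [0.7; 1.2];   % a new input point
y = net(p)        % classify with the trained 2-neuron perceptron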
p =
0.7000
1.2000
y =
1
1
Contents
Plot results
There are two basic types of input vectors: those that occur concurrently
(at the same time, or in no particular time sequence), and those that
occur sequentially in time. For concurrent vectors, the order is not
important, and if there were a number of networks running in parallel,
you could present one input vector to each of the networks. For
sequential vectors, the order in which the vectors appear is important.
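The definition of the signal and of the ADALINE parameters was lost to pagination; a sketch under assumed values (t, y, inputDelays and learning_rate are all hypothetical here):

t = 0:0.01:10;            % time vector [sec] (assumed)
y = sin(4.1*pi*t);        % hypothetical target signal
inputDelays = 1:5;        % delayed inputs used by the filter (assumed)
learning_rate = 0.2;      % learning rate for adaptation (assumed)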
p = con2seq(y);
% define ADALINE
net = linearlayer(inputDelays,learning_rate);
% ADAPT simulates the network sequentially: the network is updated at each
% time step before the next step in the sequence is presented. Thus the
% network is updated N times. The output signal and the error signal are
% returned, along with the new network.
[net,Y,E] = adapt(net,p,p);
% view network structure
view(net)
% check final network parameters
disp('Weights and bias of the ADALINE after adaptation')
net.IW{1}
net.b{1}
Plot results
% transform result vectors
Y = seq2con(Y); Y = Y{1};
E = seq2con(E); E = E{1};
% start a new figure
figure;
% first graph
subplot(211)
plot(t,y,'b', t,Y,'r--');
legend('Original','Prediction')
grid on
xlabel('Time [sec]');
ylabel('Target Signal');
ylim([-1.2 1.2])
% second graph
subplot(212)
plot(t,E,'g');
grid on
legend('Prediction error')
xlabel('Time [sec]');
ylabel('Error');
ylim([-1.2 1.2])
Contents
Plot targets and network response to see how well the network learns the data
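The definition and training of the multilayer perceptron fell on the lost pages in between; presumably something along these lines (the hidden layer size and variable names are assumptions):

net = feedforwardnet([6]);   % hypothetical hidden layer with 6 neurons
net = train(net,P,T);        % P, T: inputs and targets defined earlier
Y = net(P);                  % network response plotted below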
Plot targets and network response to see how well the network learns the data
figure(2)
plot(T','linewidth',2)
hold on
plot(Y','r--')
grid on
legend('Targets','Network response','location','best')
ylim([-1.25 1.25])
view(2)
Contents
% show network
view(net)
Contents
Application
  Name        Size          Bytes      Class     Attributes

  force       2000x100      1600000    double
  notes       1x3           222        cell
  target      2000x1        16000      double
  Name        Size          Bytes      Class     Attributes

  force       2000x10       160000     double
  notes       1x3           222        cell
  step        1x1           8          double
  target      2000x1        16000      double
correct_classifications =
99.7500
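The code that produced this score fell on the lost pages; the PCA section at the end of this document computes the same score, so it was presumably along these lines (threshold and variable names are assumed):

threshold = 0.5;                       % assumed decision threshold
Y = net(force');                       % network response for all samples
Y = double(Y > threshold)';            % digitize response
correct_classifications = 100*length(find(Y==target))/length(target)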
Application
% get sample
random_index = randi(length(force))
sample = force(random_index,:);
% plot sample
figure
plot(force','c')
grid on, hold on
plot(sample,'b')
xlabel('Time')
ylabel('Force')
% predict quality
q = net(sample');
% digitize network response
q = double(q > threshold)'
random_index =
1881
q =
0
Contents
Prepare input and target time series data for network training
Train net
b = 0.1;
c = 0.2;
tau = 17;
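N and Nu are referenced by the code below but their definitions were lost in extraction; hypothetical values:

N  = 1500;    % total number of generated samples (assumed)
Nu = 1000;    % number of samples used for training (assumed)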
% initialization
y = [0.9697 0.9699 0.9794 1.0003 1.0319 1.0703 1.1076 1.1352 1.1485 ...
1.1482 1.1383 1.1234 1.1072 1.0928 1.0820 1.0756 1.0739 1.0759]';
% generate Mackey-Glass time series
for n=18:N+99
y(n+1) = y(n) - b*y(n) + c*y(n-tau)/(1+y(n-tau).^10);
end
% remove initial values
y(1:100) = [];
% plot training and validation data
plot(y,'m-')
grid on, hold on
plot(y(1:Nu),'b')
plot(y,'+k','markersize',2)
legend('validation data','training data','sampling markers','location','southwest')
xlabel('time (steps)')
ylabel('y')
ylim([-.5 1.5])
set(gcf,'position',[1 60 800 400])
% prepare training data
yt = con2seq(y(1:Nu)');
% prepare validation data
yv = con2seq(y(Nu+1:end)');
Prepare input and target time series data for network training
% [Xs,Xi,Ai,Ts,EWs,shift] = preparets(net,Xnf,Tnf,Tf,EW)
%
% This function simplifies the normally complex and error prone task of
% reformatting input and target timeseries. It automatically shifts input
% and target time series as many steps as are needed to fill the initial
% input and layer delay states. If the network has open loop feedback,
% then it copies feedback targets into the inputs as needed to define the
% open loop inputs.
%
% net : Neural network
% Xnf : Non-feedback inputs
% Tnf : Non-feedback targets
%   Tf  : Feedback targets
%   EW  : Error weights (default = {1})
%
%   Xs  : Shifted inputs
%   Xi  : Initial input delay states
%   Ai  : Initial layer delay states
%   Ts  : Shifted targets
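The creation of the NAR network itself was lost to pagination; a minimal sketch (the delays and hidden layer size are assumptions):

feedbackDelays = 1:2;                     % assumed feedback delays
hiddenSize = 10;                          % assumed hidden layer size
net = narnet(feedbackDelays,hiddenSize);  % open-loop NAR network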
[Xs,Xi,Ai,Ts] = preparets(net,{},{},yt);
Train net
% train net with prepared training data
net = train(net,Xs,Ts,Xi,Ai);
% view trained net
view(net)
Contents
Linear Regression
Exact RBFN
RBFN
GRNN
MLP
Data generator
Linear Regression
close all, clear all, clc, format compact
% generate data
[X,Xtrain,Ytrain,fig] = data_generator();
%---------------------------------
% no hidden layers
net = feedforwardnet([]);
% % one hidden layer with linear transfer functions
% net = feedforwardnet([10]);
% net.layers{1}.transferFcn = 'purelin';
% set early stopping parameters
net.divideParam.trainRatio = 1.0; % training set [%]
net.divideParam.valRatio   = 0.0; % validation set [%]
net.divideParam.testRatio  = 0.0; % test set [%]
% train a neural network
net.trainParam.epochs = 200;
net = train(net,Xtrain,Ytrain);
%---------------------------------
% view net
view (net)
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'color',[1 .4 0])
legend('original function','available data','Linear regression','location','northwest')
Exact RBFN
% generate data
[X,Xtrain,Ytrain,fig] = data_generator();
%---------------------------------
% choose a spread constant
spread = .4;
% create a neural network
net = newrbe(Xtrain,Ytrain,spread);
%---------------------------------
% view net
view (net)
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'r')
legend('original function','available data','Exact RBFN','location','northwest')
(newrbe warning output truncated; only the tolerance survived: tol = 1.110223e-13.)
RBFN
% generate data
[X,Xtrain,Ytrain,fig] = data_generator();
%---------------------------------
% choose a spread constant
spread = .2;
% choose max number of neurons
K = 40;
% performance goal (SSE)
goal = 0;
% number of neurons to add between displays
Ki = 5;
% create a neural network
net = newrb(Xtrain,Ytrain,goal,spread,K,Ki);
%---------------------------------
% view net
view (net)
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'r')
legend('original function','available data','RBFN','location','northwest')
NEWRB, neurons = 0, MSE = 333.938
NEWRB, neurons = 5, MSE = 47.271
NEWRB, neurons = 10, MSE = 12.3371
NEWRB, neurons = 15, MSE = 9.26908
NEWRB, neurons = 20, MSE = 4.16992
NEWRB, neurons = 25, MSE = 2.82444
NEWRB, neurons = 30, MSE = 2.43353
NEWRB, neurons = 35, MSE = 2.06149
NEWRB, neurons = 40, MSE = 1.94627
GRNN
% generate data
[X,Xtrain,Ytrain,fig] = data_generator();
%---------------------------------
% choose a spread constant
spread = .12;
% create a neural network
net = newgrnn(Xtrain,Ytrain,spread);
%---------------------------------
% view net
view (net)
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'r')
legend('original function','available data','RBFN','location','northwest')
% view net
view (net)
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'r')
% Show RBFN centers
c = net.iw{1};
plot(c,zeros(size(c)),'rs')
legend('original function','available data','RBFN','centers','location','northwest')
%--------- trainbr ---------------
% Retrain a RBFN using Bayesian regularization backpropagation
net.trainFcn='trainbr';
net.trainParam.epochs = 100;
% perform Levenberg-Marquardt training with Bayesian regularization
net = train(net,Xtrain,Ytrain);
%---------------------------------
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'m')
% Show RBFN centers
c = net.iw{1};
plot(c,ones(size(c)),'ms')
legend('original function','available data','RBFN','centers','RBFN + trainbr','new centers','location','northwest')
MLP
% generate data
[X,Xtrain,Ytrain,fig] = data_generator();
%---------------------------------
% create a neural network
net = feedforwardnet([12 6]);
% set early stopping parameters
net.divideParam.trainRatio = 1.0; % training set [%]
net.divideParam.valRatio   = 0.0; % validation set [%]
net.divideParam.testRatio  = 0.0; % test set [%]
% train a neural network
net.trainParam.epochs = 200;
net = train(net,Xtrain,Ytrain);
%---------------------------------
% view net
view (net)
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'color',[1 .4 0])
legend('original function','available data','MLP','location','northwest')
Data generator
type data_generator
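The listing printed by type data_generator fell on lost pages; a minimal sketch of what such a generator might look like (the target function, noise level, and sample count here are assumptions, not the author's original):

function [X,Xtrain,Ytrain,fig] = data_generator()
% generate a 1-D regression problem: a smooth target plus noisy samples
X = -10:0.1:10;                        % complete input range
Y = sin(X) + 0.5*X;                    % hypothetical underlying function
Xtrain = -10 + 20*rand(1,40);          % random training inputs (assumed)
Ytrain = sin(Xtrain) + 0.5*Xtrain + 0.2*randn(size(Xtrain)); % noisy targets
fig = figure;                          % figure handle returned to caller
plot(X,Y,'b-'); hold on; grid on
plot(Xtrain,Ytrain,'k.')
legend('original function','available data','location','northwest')
end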
Contents
1. Classification of XOR problem with an exact RBFN
2. Classification of XOR problem with a RBFN
3. Classification of XOR problem with a PNN
4. Classification of XOR problem with a GRNN
5. Bayesian regularization for RBFN
Contents
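The data definition for these XOR examples fell on lost pages; presumably four clusters in an XOR arrangement, along these lines (the cluster construction mirrors the perceptron example above but is assumed here; the PNN example further down would need integer class indices, e.g. T+1, for ind2vec):

K = 100;                          % points per cluster (4*K = 400 = neurons below)
q = .6;                           % cluster offset (assumed)
A = [rand(1,K)-q; rand(1,K)+q];   % top-left
B = [rand(1,K)+q; rand(1,K)+q];   % top-right
C = [rand(1,K)+q; rand(1,K)-q];   % bottom-right
D = [rand(1,K)-q; rand(1,K)-q];   % bottom-left
% XOR labeling: diagonal clusters share a class (A,C vs B,D)
P = [A B C D];
T = [ones(1,K) zeros(1,K) ones(1,K) zeros(1,K)];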
% choose a spread constant
spread = 1;
% create a neural network
net = newrbe(P,T,spread);
% view network
view(net)
(newrbe warning output truncated; only the tolerance survived: tol = 8.881784e-14.)
Spread          = 1.00
Num of neurons  = 400
Correct class   = 100.00 %
Contents
Create a RBFN
Create a RBFN
% NEWRB algorithm
% The following steps are repeated until the network's mean squared error
% falls below goal:
% 1. The network is simulated
% 2. The input vector with the greatest error is found
% 3. A radbas neuron is added with weights equal to that vector
% 4. The purelin layer weights are redesigned to minimize error
% choose a spread constant
spread = 2;
% choose max number of neurons
K = 20;
% performance goal (SSE)
goal = 0;
% number of neurons to add between displays
Ki = 4;
% create a neural network
net = newrb(P,T,goal,spread,K,Ki);
% view network
view(net)
NEWRB, neurons = 0, MSE = 1
NEWRB, neurons = 4, MSE = 0.302296
NEWRB, neurons = 8, MSE = 0.221059
NEWRB, neurons = 12, MSE = 0.193983
NEWRB, neurons = 16, MSE = 0.154859
NEWRB, neurons = 20, MSE = 0.122332
Spread          = 2.00
Num of neurons  = 20
Correct class   = 99.50 %
Contents
Create a PNN
Create a PNN
% choose a spread constant
spread = .5;
% create a neural network
net = newpnn(P,ind2vec(T),spread);
% view network
view(net)
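ind2vec converts a row of class indices into the sparse one-hot target columns newpnn expects; for example (assuming T holds indices 1 and 2):

% ind2vec([1 2 2 1]) yields the 2x4 sparse matrix
%   1 0 0 1
%   0 1 1 0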
Spread          = 0.50
Num of neurons  = 400
Correct class   = 100.00 %
Contents
Create a GRNN
Create a GRNN
% choose a spread constant
spread = .2;
% create a neural network
net = newgrnn(P,T,spread);
% view network
view(net)
Spread          = 0.20
Num of neurons  = 400
Correct class   = 100.00 %
Contents
Create a RBFN
Create a RBFN
% choose a spread constant
spread = .1;
% choose max number of neurons
K = 10;
% performance goal (SSE)
goal = 0;
% number of neurons to add between displays
Ki = 2;
% create a neural network
net = newrb(P,T,goal,spread,K,Ki);
% view network
view(net)
NEWRB, neurons = 0, MSE = 1
NEWRB, neurons = 2, MSE = 0.928277
NEWRB, neurons = 4, MSE = 0.855829
NEWRB, neurons = 6, MSE = 0.798564
NEWRB, neurons = 8, MSE = 0.742854
NEWRB, neurons = 10, MSE = 0.690962
actual_spread =
8.3255
8.3255
8.3255
8.3255
8.3255
8.3255
8.3255
8.3255
8.3255
8.3255
Spread          = 0.10
Num of neurons  = 10
Correct class   = 79.50 %
xlabel('Sample No.')
spread_after_training =
2.9924
3.0201
0.7809
0.5933
2.6968
2.8934
2.2121
2.9748
2.7584
3.5739
Num of neurons  = 10
Correct class   = 100.00 %
Contents
  Name        Size          Bytes      Class     Attributes

  force       2000x100      1600000    double
  notes       1x3           222        cell
  target      2000x1        16000      double
ps2 = 
          name: 'processpca'
         xrows: 100
       maxfrac: 0.1000
         yrows: 2
     transform: [2x100 double]
     no_change: 0

  Name        Size          Bytes      Class     Attributes

  force       2000x100      1600000    double
  force2      2000x2        32000      double
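The call that produced ps2 and force2 fell on lost pages; given xrows = 100, yrows = 2, and maxfrac = 0.1000 above, it was presumably along these lines (a sketch, not the author's exact code):

% rows must be variables, so transpose force (2000x100 -> 100x2000)
[F2,ps2] = processpca(force',0.1)   % drop components under 10% of total variance
force2 = F2';                       % 2000x2 matrix of the two retained components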
threshold = 0.5;
Y = double(Y > threshold)';
% find percentage of correct classifications
cc = 100*length(find(Y==target))/length(target);
fprintf('Correct classifications: %.1f [%%]\n', cc)