Soft Computing Lab Manual
Soft Computing
(ETCS – 456)
B.Tech. Programme
(IT+CSE)
Affiliated: GGSIP University
Contents
1. Introduction
2. H/W & S/W Requirements
3. List of Experiments
4. Examination
5. Details of Experiments
6. Viva Questions
7. References
1. Introduction
Soft Computing: Soft computing, according to Prof. Zadeh, is "an emerging approach to computing, which parallels the remarkable ability of the human mind to reason and learn in an environment of uncertainty and imprecision." It is a consortium of related disciplines that include fuzzy logic, artificial neural nets, genetic algorithms, belief calculus, and some aspects of machine learning such as inductive logic programming.
Fuzzy Logic:
Fuzzy logic deals with fuzzy sets and logical connectives for modeling the human-like reasoning problems of the real world. A fuzzy set, unlike conventional sets, includes all elements of the universal set of the domain, but with varying membership values in the interval [0, 1]. It may be noted that a conventional set contains its members with a value of membership equal to one and disregards other elements of the universal set, for they have zero membership. The most common operators applied to fuzzy sets are AND (intersection), OR (union) and negation (complementation), where AND and OR have binary arguments, while negation has a unary argument. The logic of fuzzy sets was proposed by Zadeh, who introduced the concept in systems theory and later extended it for approximate reasoning in expert systems.
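As a quick illustration of these connectives (the sets and membership values below are invented for the example, not taken from the manual's listings), the standard max-min operators apply directly to vectors of membership values in MATLAB:

a=[0.1 0.6 1.0];   % membership values of fuzzy set A
b=[0.4 0.5 0.2];   % membership values of fuzzy set B
AandB=min(a,b)     % fuzzy AND (intersection)
AorB=max(a,b)      % fuzzy OR (union)
notA=1-a           % fuzzy negation (complement)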
Artificial Neural Nets:
Artificial neural nets (ANN) are electrical analogues of the biological neural nets. Biological nerve cells, called neurons, receive signals from neighboring neurons or receptors through dendrites, process the received electrical pulses at the cell body and transmit signals through a large and thick nerve fiber, called an axon. The electrical model of a typical biological neuron consists of a linear activator followed by a non-linear inhibiting function. The linear activation function yields the sum of the weighted input excitations, while the non-linear inhibiting function attempts to arrest the signal levels of the sum. The resulting signal is then passed on to the neighboring neurons for further processing. ANN support both supervised and unsupervised types of machine learning. The supervised learning algorithms realized with ANN have been successfully applied in control, prediction and classification problems. The unsupervised learning algorithms built with ANN, on the other hand, have been applied in scheduling, knowledge acquisition, planning and analog-to-digital conversion of data.
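The neuron model described above, a weighted sum followed by a non-linear squashing function, can be stated in a few lines; the weights, bias and input values here are illustrative, not from the manual:

w=[0.5 -0.3 0.8];        % synaptic weights
x=[1; 0; 1];             % input excitations
b=0.1;                   % bias
net_in=w*x+b;            % linear activation: weighted sum of inputs
y=1/(1+exp(-net_in))     % non-linear (sigmoid) inhibiting function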
Genetic Algorithms:
A genetic algorithm (GA) rests on the fundamental belief of the "survival of the fittest" in the process of natural selection of species, and mimics this process by encoding candidate solutions as chromosome-like strings. The most common operators used in GA are crossover and mutation. A GA operates through a simple cycle: (a) generation of an initial population of candidate states, (b) genetic evolution of new chromosomes, and (c) selection of better candidates. In step (a) of the above cycle, a few initial problem states are first identified. Step (b) evolves new chromosomes through the process of crossover and mutation. In step (c) a fixed number of better candidate states are selected from the generated population. The above steps are repeated a finite number of times for obtaining a solution of the given problem.
1. A PROGRAM TO SOLVE THE XOR GATE FOR 2 INPUTS USING A MULTILAYER FEEDFORWARD NETWORK
p = [0 1 0 1;0 0 1 1];                       % input patterns (one per column)
t = [0 1 1 0];                               % XOR targets
net=newff(p,t,3,{'tansig','purelin'},'traingd');  % 3 hidden neurons, gradient descent
net.trainParam.show=50;                      % show progress every 50 epochs
net.trainParam.lr=0.05;                      % learning rate
net.trainParam.epochs=1000;                  % maximum number of epochs
net.trainParam.goal=1e-5;                    % MSE goal
[net,tr] = train(net,p,t)
a = sim(net,p)                               % simulate the trained net on the inputs
OUTPUT:
a =
2. A PROGRAM TO IMPLEMENT BACKPROPAGATION NETWORK
clear all;
disp('BACK PROPAGATION NETWORK');
% One training pattern, 2 inputs, 2 hidden neurons, 1 output, 3 epochs
v=[-0.7 -0.4; -0.2 0.3];   % input-to-hidden weights v(input,hidden)
x=[0 1];                   % input pattern
t=1;                       % target output
w=[0.5;0.1];               % hidden-to-output weights
wb=-0.3;                   % output bias
vb=[0.4 0.6];              % hidden biases
alpha=0.25;                % learning rate
e=1;
temp=0;
while(e<=3)
e
% Forward pass: hidden layer (binary sigmoid activation)
for i=1:2
for j=1:2
temp=temp+(v(j,i)*x(j));
end
zin(i)=temp+vb(i);
fz(i)=1/(1+exp(-zin(i)));
z(i)=fz(i);
fdz(i)=fz(i)*(1-fz(i));
temp=0;
end
% Forward pass: output layer
for k=1
for j=1:2
temp=temp+z(j)*w(j,k);
end
yin(k)=temp+wb(k);
fy(k)=1/(1+exp(-yin(k)));
y(k)=fy(k);
temp=0;
end
% Error term of the output unit
for k=1
fdy(k)=fy(k)*(1-fy(k));
delk(k)=(t(k)-y(k))*fdy(k);
end
% Weight corrections for hidden-to-output weights
for k=1
for j=1:2
dw(j,k)=alpha*delk(k)*z(j);
end
dwb(k)=alpha*delk(k);
end
% Backpropagate the error to the hidden layer
for j=1:2
for k=1
delin(j)=delk(k)*w(j,k);
end
delj(j)=delin(j)*fdz(j);
end
% Update hidden-to-output weights and bias
for k=1
for j=1:2
w(j,k)=w(j,k)+dw(j,k);
end
wb(k)=wb(k)+dwb(k);
end
w ,wb
% Weight corrections and updates for input-to-hidden weights
for i=1:2
for j=1:2
dv(i,j)=alpha*delj(j)*x(i);
end
dvb(i)=alpha*delj(i);
end
for i=1:2
for j=1:2
v(i,j)=v(i,j)+dv(i,j);
end
vb(i)=vb(i)+dvb(i);
end
v ,vb
e=e+1;
end
3. PROGRAM TO PERFORM VARIOUS PRIMITIVE OPERATIONS ON CRISP SETS
clear all;
n=input('Enter how many elements in set A');
disp('Enter elements for set A');
for i=1:n
A(i)=input('Enter element');
end
m=input('Enter how many elements in set B');
disp('Enter elements for set B');
for i=1:m
B(i)=input('Enter element');
end
e=input('Enter how many elements in universal set E');
disp('Enter elements for set E');
for i=1:e
E(i)=input('Enter element');
end
disp('A U B');
AUB=union(A,B)
disp('A I B');
AIB=intersect(A,B)
disp('A - B');
AdB=setdiff(A,B)
disp('A complement');
Ac=setdiff(E,A)
OUTPUT:
4. PROGRAM TO VERIFY VARIOUS LAWS ASSOCIATED WITH CRISP SETS
clear all;
n=input('Enter how many elements in set A');
disp('Enter elements for set A');
for i=1:n
A(i)=input('Enter element');
end
m=input('Enter how many elements in set B');
disp('Enter elements for set B');
for i=1:m
B(i)=input('Enter element');
end
p=input('Enter how many elements in set C');
disp('Enter elements for set C');
for i=1:p
C(i)=input('Enter element');
end
e=input('Enter how many elements in universal set E');
disp('Enter elements for set E');
for i=1:e
E(i)=input('Enter element');
end
disp('A U B');
AUB=union(A,B)
disp('B U A');
BUA=union(B,A)
disp('B U C');
BUC=union(B,C)
if(AUB==BUA)
disp('Commutative law for Union operation is satisfied');
else
disp('Commutative law for Union operation is not satisfied');
end
disp('A I B');
AIB=intersect(A,B)
disp('B I A');
BIA=intersect(B,A)
if(AIB==BIA)
disp('Commutative law for Intersection operation is satisfied');
else
disp('Commutative law for intersection operation is not satisfied');
end
disp('(A U B) U C');
AUBuC=union(AUB,C)
disp('A U (B U C)');
AuBUC=union(A,BUC)
if(AUBuC==AuBUC)
disp('Associative law for Union operation is satisfied');
else
disp('Associative law for Union operation is not satisfied');
end
disp('B I C');
BIC=intersect(B,C)
disp('A U C');
AUC=union(A,C)
disp('A U (B I C)');
AUBIC=union(A,BIC)
disp('(A U B) I (A U C)');
AUBIAUC=intersect(AUB,AUC)
if(AUBIC == AUBIAUC)
disp('Distributive law is satisfied');
else
disp('Distributive law is not satisfied');
end
disp('AUA');
AUA=union(A,A)
if(AUA == A)
disp('Idempotence law is satisfied');
else
disp('Idempotence law is not satisfied');
end
disp('(AUB) Complement');
AUBc=setdiff(E,AUB)
disp('A Complement');
Ac=setdiff(E,A)
disp('B Complement');
Bc=setdiff(E,B)
disp('A Complement I B Complement');
AcIBc=intersect(Ac,Bc)
if(AUBc == AcIBc)
disp('De Morgan''s law is satisfied');
else
disp('De Morgan''s law is not satisfied');
end
OUTPUT:
5. PROGRAM TO PERFORM VARIOUS PRIMITIVE OPERATIONS
ON FUZZY SETS WITH DYNAMIC COMPONENT
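A minimal sketch of such a program (an assumed reconstruction: it reuses the max-min operators of Experiment 6 and reads equal-length sets at run time):

clear all;
n=input('Enter how many terms in each fuzzy set');
disp('Enter membership values for A');
for i=1:n
a(i)=input('Enter value');
end
disp('Enter membership values for B');
for i=1:n
b(i)=input('Enter value');
end
disp('Union of A and B');
u=max(a,b)
disp('Intersection of A and B');
in=min(a,b)
disp('Complement of A');
ca=1-a
disp('Complement of B');
cb=1-b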
OUTPUT:
6. PROGRAM TO VERIFY VARIOUS LAWS ASSOCIATED WITH
FUZZY SET
clear all;
clc;
disp('Fuzzy set properties');
a=[0 1 0.5 0.4 0.6];
b=[0 0.5 0.7 0.8 0.4];
c=[0.3 0.9 0.2 0 1];
phi=[0 0 0 0 0];
disp('Union of a and b');
au=max(a,b)
disp('Intersection of a and b');
iab=min(a,b)
disp('union of b and a');
bu=max(b,a)
if(au==bu)
disp('commutative law is satisfied');
else
disp('commutative law is not satisfied');
end
disp('a U a');
idl=max(a,a)
a
if(idl==a)
disp('Idempotency law is satisfied');
else
disp('Idempotency law is not satisfied');
end
disp(' a U phi');
idtl=max(a,phi)
a
if(idtl==a)
disp('Identity law is satisfied');
else
disp('Identity law is not satisfied');
end
disp('complement of (a I b)');
for i=1:5
ciab(i)=1-iab(i);
end
ciab
disp('complement of a');
for i=1:5
ca(i)=1-a(i);
end
ca
disp('complement of b');
for i=1:5
cb(i)=1-b(i);
end
cb
disp('a complement U b complement');
dml=max(ca,cb)
if(dml == ciab)
disp('De Morgan''s law is satisfied');
else
disp('De Morgan''s law is not satisfied');
end
disp('Complement of complement of a');
for i=1:5
cca(i)=1-ca(i);
end
cca
a
if(a==cca)
disp('Involution law is satisfied');
else
disp('Involution law is not satisfied');
end
OUTPUT:
7. PROGRAM TO PERFORM CARTESIAN PRODUCT OVER TWO GIVEN FUZZY SETS
clear all;
n=input('Enter how many terms in fuzzy set A');
m=input('Enter how many terms in fuzzy set B');
disp('Enter membership values for A');
for i=1:n
a(i)=input('Enter value');
end
disp('Enter membership values for B');
for i=1:m
b(i)=input('Enter value');
end
for i=1:n
for j=1:m
R(i,j)=min(a(i),b(j));
end
end
R
OUTPUT:
8. A PROGRAM TO PERFORM MAX-MIN COMPOSITION OF TWO MATRICES OBTAINED FROM CARTESIAN PRODUCT
clear all;
n=input('Enter dimension of square matrices R & S');
disp('Enter values for first Matrix R:');
for i=1:n
for j=1:n
R(i,j)=input('Enter value');
end
end
disp('Enter values for second Matrix S:');
for i=1:n
for j=1:n
S(i,j)=input('Enter value');
end
end
% Max-min composition: RoS(i,j) = max over k of min(R(i,k),S(k,j))
for i=1:n
for j=1:n
for k=1:n
c(k)=min(R(i,k),S(k,j));
end
RoS(i,j)=max(c);
end
end
RoS
OUTPUT:
9. A PROGRAM TO CONSTRUCT AND TEST
AUTOASSOCIATIVE NETWORK FOR INPUT VECTOR USING
HEBB RULE.
clear all;
disp('AUTO ASSOCIATIVE NETWORK-----HEBB RULE');
w=[0 0 0 0; 0 0 0 0; 0 0 0 0; 0 0 0 0];
s=[1 1 1 -1];
t=[1 1 1 -1];
ip=[1 -1 -1 -1];
disp('INPUT VECTOR');
s
% Hebb rule: w(new) = w(old) + s(i)*t(j)
for i=1:4
for j=1:4
w(i,j)=w(i,j)+(s(i)*t(j));
end
end
disp('WEIGHTS TO STORE THE GIVEN VECTOR IS');
w
disp('TESTING THE NET WITH VECTOR');
ip
yin=ip*w;
for i=1:4
if yin(i)>0
y(i)=1;
else
y(i)=-1;
end
end
y
if (y==s)
disp('PATTERN IS RECOGNIZED');
else
disp('PATTERN IS NOT RECOGNIZED');
end
OUTPUT:
10. A PROGRAM TO TRAIN A FEEDFORWARD NETWORK WITH GRADIENT DESCENT BACKPROPAGATION
p=[-1 -1 2 2 ; 0 5 0 5];
t=[-1 -1 1 1];
net=newff(minmax(p),[3,1], {'tansig','purelin'},'traingd');
net.trainParam.show=50;
net.trainParam.lr=0.05;
net.trainParam.epochs=300;
net.trainParam.goal=1e-5;
[net,tr]=train(net,p,t);
a=sim(net,p)
OUTPUT:
11. TO CREATE A MULTILAYER PERCEPTRON NETWORK
clear all;
x=0:0.06:2;
y=sin(x);
p=x;
t=y;
net=newff([0 2],[5,1], {'tansig','purelin'},'traingd');
net.trainParam.show=50;
net.trainParam.lr=0.05;
net.trainParam.epochs=500;
net.trainParam.goal=1e-3;
net1=train(net,p,t);
a=sim(net1,p)
OUTPUT:
12. Write a program to implement composition of Fuzzy and
Crisp sets.
clear all;
clc;
a=[0.2 0.6];   % membership values (taken from the output below)
b=[0.3 0.5];
c=[0.6 0.7];
a, b, c
for i=1:2
r(i)=a(i)*b(i);
s(i)=b(i)*c(i);
end
r
s
irs=min(r,s)
disp('Crisp composition of r and s using max-min composition');
crs=max(irs)
for i=1:2
prs(i)=r(i)*s(i);
end
prs
disp('Crisp composition of r and s using max-product composition');
mprs=max(prs)
firs=min(r,s);
disp('Fuzzy composition of r and s using max-min composition');
frs=max(firs)
for i=1:2
fprs(i)=r(i)*s(i);
end
fprs
disp('Fuzzy composition of r and s using max-product composition');
fmprs=max(fprs)
OUTPUT:
a =
0.2000 0.6000
b =
0.3000 0.5000
c =
0.6000 0.7000
r =
0.0600 0.3000
s =
0.1800 0.3500
irs =
0.0600 0.3000
prs =
0.0108 0.1050
mprs =
0.1050
Fuzzy composition:
fmprs = 0.1050
13. A PROGRAM TO OPTIMIZE A QUADRATIC FUNCTION USING GENETIC ALGORITHM
% Fitness function: z = x^2 + 3x + 2 (save as quadratic.m)
function z=quadratic(x)
z=(x*x+3*x+2);
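Assuming the same Global Optimization Toolbox `ga` solver used in the later Rastrigin experiment, the fitness function above could be minimized over one variable with the following call; this is a sketch, not part of the original listing:

[x,fval] = ga(@quadratic,1)   % minimize quadratic() over one variable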
14. A PROGRAM FOR CHARACTER RECOGNITION IN THE PRESENCE OF NOISE
% (After the MATLAB appcr1 demo: net1 is assumed trained on clean
%  letters and net2 on noisy letters before this test loop runs.)
[alphabet,targets] = prprob;       % 35x26 letter bitmaps, 26x26 targets
net = newff(alphabet,targets,25);  % 25 hidden neurons
noise_range = 0:.05:.5;            % noise levels to test
max_test = 100;                    % trials per noise level
network1 = []; network2 = [];
for noiselevel = noise_range
errors1 = 0; errors2 = 0;
for i=1:max_test
x = alphabet + randn(35,26)*noiselevel;   % add Gaussian noise
% TEST NETWORK 1
y = sim(net1,x);
yy = compet(y);                           % winner-take-all output
errors1 = errors1 + sum(sum(abs(yy-targets)))/2;
% TEST NETWORK 2
yn = sim(net2,x);
yyn = compet(yn);
errors2 = errors2 + sum(sum(abs(yyn-targets)))/2;
end
network1 = [network1 errors1/26/max_test];
network2 = [network2 errors2/26/max_test];
end
clf
plot(noise_range,network1*100,'--',noise_range,network2*100);
title('Percentage of Recognition Errors');
xlabel('Noise Level');
ylabel('Network 1 _ _ Network 2 ___');
15. A PROGRAM TO STUDY THE EFFECT OF THE CROSSOVER FRACTION ON RASTRIGIN'S FUNCTION USING GENETIC ALGORITHM
options = gaoptimset('Generations',300);
record=[];
for n=0:.1:1
options = gaoptimset(options,'CrossoverFraction', n);
[x,fval]=ga(@rastriginsfcn, 2,[],[],[],[],[],[],[],options);
x
fval
record = [record; fval];
end
plot(0:.1:1, record);
xlabel('Crossover Fraction');
ylabel('fval');
OUTPUT:
Optimization terminated: average change in the fitness value less than options.TolFun.
x =
0.0129 0.0121
fval =
0.0619
Optimization terminated: average change in the fitness value less than options.TolFun.
x =
-0.0295 -0.0520
fval =
0.7045
Optimization terminated: average change in the fitness value less than options.TolFun.
x =
-0.0328 -0.0241
fval =
0.3282
6. VIVA-QUESTIONS
1. Discuss different models of neurons.
2. What is memory-based learning? Discuss it in brief.
3. What is bias? Compare weights and bias.
4. Write different applications of neural networks.
5. Discuss how neural networks help in solving AI problems.
6. Differentiate between feed forward and feedback networks.
7. Draw the McCulloch-Pitts (MP) neuron model. Write the main differences between the MP and perceptron models.
8. Which signal function is known as maximum-entropy signal function? Discuss it in brief.
9. A recurrent network has 3 source nodes, 2 hidden neurons and 4 output neurons. Construct an architectural graph that describes such a network.
10. What is a linear associator? Explain with the help of the Hebbian rule.
11. State and prove the Perceptron Convergence Theorem.
12. Discuss learning rate annealing techniques.
13. What is Adaptive Filtering problem?
14. Discuss Bayes Classifier System in pattern classification.
15. What is multi-layer perceptron?
16. Explain feature extraction procedure in a multilayer perceptron.
17. What is backpropagation? What are its limitations?
18. Why is backpropagation learning also called the generalized Delta Rule?
19. Why is convergence not guaranteed for the backpropagation learning algorithm?
20. Distinguish between multilayer perceptron and a general multilayer feedforward neural
network.
21. Explain fuzzy set, fuzzy system and fuzzy logic.
22. Develop a reasonable membership function for the following fuzzy sets based on height measured in centimeters: (i) "tall" (ii) "short" (iii) "not short".