
LAB MANUAL

Soft Computing
(ETCS – 456)

B.Tech. Programme

(IT+CSE)

Maharaja Surajmal Institute of Technology

Affiliated: GGSIP University

C-4, Janak Puri, New Delhi - 110058


Contents

1. Introduction

2. Hardware and Software requirements

3. List of Experiments

4. Marking Scheme for the Practical Lab Examination

5. Details of Experiments

6. Expected viva voce Questions

7. References
2. H/W & S/W Requirements

Software requirements: MATLAB R2011a

Operating System: Windows - XP

Hardware requirements: Intel P-IV 2.8 GHz, Intel 845 motherboard, 40 GB HDD, 512 MB RAM

1. Introduction
Soft Computing: Soft computing, according to Prof. Zadeh, is “an emerging approach to computing, which parallels the remarkable ability of the human mind to reason and learn in an environment of uncertainty and imprecision”. In general, it is a collection of computing tools and techniques, shared by closely related disciplines, that include fuzzy logic, artificial neural networks, genetic algorithms, belief calculus, and some aspects of machine learning such as inductive logic programming. These tools are used independently as well as jointly, depending on the application domain.

Fuzzy Logic:

Fuzzy logic deals with fuzzy sets and logical connectives for modeling the human-like reasoning problems of the real world. A fuzzy set, unlike a conventional set, includes all elements of the universal set of the domain, but with varying membership values in the interval [0, 1]. It may be noted that a conventional set contains its members with a membership value equal to one and disregards the other elements of the universal set, for they have zero membership. The most common operators applied to fuzzy sets are AND (minimum), OR (maximum) and negation (complementation), where AND and OR take binary arguments, while negation takes a unary argument. The logic of fuzzy sets was proposed by Zadeh, who introduced the concept in systems theory and later extended it for approximate reasoning in expert systems.
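As an illustration, the minimum, maximum and complement operators described above can be applied element-wise to membership vectors in MATLAB (a minimal sketch; the membership values below are arbitrary examples):

```matlab
% Two fuzzy sets over the same five-element universe,
% represented by their membership values in [0, 1]
A = [0.1 0.4 0.7 1.0 0.3]; % membership values of fuzzy set A (example)
B = [0.6 0.2 0.9 0.5 0.8]; % membership values of fuzzy set B (example)

AandB = min(A,B) % fuzzy AND: element-wise minimum
AorB  = max(A,B) % fuzzy OR: element-wise maximum
notA  = 1 - A    % fuzzy negation: complement of membership
```

The same min/max/complement pattern is used throughout Experiments 5 to 8 of this manual.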


Artificial Neural Nets:

Artificial neural nets (ANNs) are electrical analogues of biological neural nets. Biological nerve cells, called neurons, receive signals from neighboring neurons or receptors through dendrites, process the received electrical pulses at the cell body, and transmit signals through a large and thick nerve fiber called an axon. The electrical model of a typical biological neuron consists of a linear activator followed by a non-linear inhibiting function. The linear activation function yields the sum of the weighted input excitations, while the non-linear inhibiting function attempts to arrest the signal level of the sum. The resulting signal produced by an electrical neuron is thus bounded (amplitude limited).

An artificial neural net is a collection of such electrical neurons connected in different topologies. The most common application of an artificial neural net is in machine learning. In a learning problem, the weights and/or non-linearities in an artificial neural net undergo an adaptation cycle. The adaptation cycle updates these parameters of the network until a state of equilibrium is reached, following which the parameters no longer change. ANNs support both supervised and unsupervised types of machine learning. The supervised learning algorithms realized with ANNs have been successfully applied in control, automation, robotics and computer vision. The unsupervised learning algorithms built with ANNs, on the other hand, have been applied in scheduling, knowledge acquisition, planning and analog-to-digital conversion of data.
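The neuron model described above, a linear activator followed by a bounded non-linear function, can be sketched in a few lines of MATLAB; the inputs, weights and bias here are arbitrary example values:

```matlab
% A single artificial neuron: linear activator followed by a
% bounded non-linear (sigmoid) inhibiting function
x = [0.5; 1.0; -0.3]; % input excitations (example values)
w = [0.8; -0.4; 0.2]; % synaptic weights (example values)
b = 0.1;              % bias (example value)

net_in = w'*x + b;         % linear activation: weighted sum of inputs
y = 1/(1 + exp(-net_in))   % sigmoid bounds the output to (0, 1)
```

The same weighted-sum-plus-sigmoid structure appears inside the backpropagation program of Experiment 2.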

Genetic Algorithms:

A genetic algorithm (GA) is a stochastic algorithm that mimics the natural process of biological evolution. It follows the principle of Darwinism, which rests on the fundamental belief of the “survival of the fittest” in the process of natural selection of species. GAs find extensive applications in intelligent search, machine learning and optimization problems. The problem states in a GA are denoted by chromosomes, which are usually represented by binary strings. The most common operators used in GAs are crossover and mutation. The evolutionary cycle in a GA consists of the following three sequential steps:

a) Generation of population (problem states represented by chromosomes).
b) Genetic evolution through crossover followed by mutation.
c) Selection of better candidate states from the generated population.

In step (a) of the above cycle, a few initial problem states are first identified. Step (b) evolves new chromosomes through the processes of crossover and mutation. In step (c) a fixed number of better candidate states are selected from the generated population. The above steps are repeated a finite number of times to obtain a solution for the given problem.
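The three-step evolutionary cycle above can be sketched in MATLAB for a toy minimization of f(x) = x^2 over 5-bit chromosomes; the chromosome length, population size, crossover cut, mutation rate and number of generations are all illustrative choices, not part of the general algorithm:

```matlab
% Minimal GA sketch: minimize f(x) = x^2 for x encoded as 5 bits (0..31)
pop = randi([0 1], 6, 5);                 % (a) initial population of 6 chromosomes
fitness = @(c) (c * (2.^(4:-1:0))').^2;   % decode binary string, square it
for gen = 1:20
    % (b) crossover: pair chromosomes and swap tails after a random cut
    child = pop;
    for i = 1:2:5
        cut = randi(4);
        child(i,   cut+1:end) = pop(i+1, cut+1:end);
        child(i+1, cut+1:end) = pop(i,   cut+1:end);
    end
    % (b) mutation: flip each bit with a small probability
    flip = rand(size(child)) < 0.05;
    child(flip) = 1 - child(flip);
    % (c) selection: keep the 6 fittest of parents and children
    combined = [pop; child];
    f = zeros(size(combined,1), 1);
    for k = 1:size(combined,1), f(k) = fitness(combined(k,:)); end
    [~, idx] = sort(f);                   % ascending: best (smallest) first
    pop = combined(idx(1:6), :);
end
pop(1,:)   % best chromosome found (the minimum of f is at x = 0)
```

Experiments 13 and 15 perform the same cycle with MATLAB's built-in `ga` solver instead of hand-written operators.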


3. List of Experiments (Soft computing) (ETCS-456)

1. Write a program to perform perceptron network for solving Logical AND


Problem.
2. Write a program to perform Back propagation Network.
3. Write a program to perform various primitive operations on classic sets.
4. Write a program to implement various laws associated with classical set.
5. Write a program to implement various primitive operations on fuzzy sets.
6. Write a program to implement various laws associated with Fuzzy sets.
7. Write a program to perform Cartesian product over two given Fuzzy sets.
8. Write a program to perform MIN MAX composition of two matrices
obtained from Cartesian product over two given Fuzzy sets.
9. Write a program to Construct and test Auto Associative Network for input
vector using HEBB Rule.
10. Write a program to create Feed Forward Network and perform batch
training.
11. Write a program to create a Multilayer Perceptron Network.
12. Write a program to implement composition of Fuzzy and Crisp sets.
13. Use GATOOL to minimize the quadratic function f(x) = x^2 + 3x + 2 in the range -6 to 0.
14. To create and train a small feed forward neural network for character
recognition (with and without noise) where 26 characters are given as 5*7
bitmap matrix for each letter. Discuss the network and training.
15. To minimize a multimodal function of two variables of suitable range
through Genetic Algorithm.
4. Marking Scheme for the Practical Lab Exam
There will be two practical exams in each semester.
 Internal Practical Exam
 External Practical Exam
Internal Practical Exam:
It is taken by the concerned Faculty member of the batch.
Marking Scheme :
Total Marks: 40
Division of 40 marks is as follows:
1. Regularity: 30
 Weekly performance in the lab
 Attendance
 File
2. Viva Voce: 10
NOTE: For regularity, marks are awarded to the student out of 10 for
each experiment performed in the lab, and at the end the average marks are
given out of 30.
External Practical Exam:
It is taken by the concerned faculty member of the batch and by an external
examiner. In this exam the student performs the experiment allotted at
the time of the examination, writes the details asked by the examiner on
the sheet provided, and finally appears for a viva taken by the external
examiner.
Marking Scheme:
Total Marks: 60
Division of 60 marks is as follows:
a. Evaluation of the answer sheet 20
b. Viva Voce 15
c. Experiment performance 15
d. File submitted 10
NOTE:
 Internal marks + External marks = Total marks given to the
students
(40 marks) + (60 marks) = (100 marks)
 Experiments given to perform can be from any section of the lab.
5. Details of Experiments

1. PROGRAM TO IMPLEMENT PERCEPTRON NETWORK

PERCEPTRON FOR SOLVING AND GATE:

net=newp([0 1;0 1;0 1],1);


p={[0;0;0] [0;0;1] [0;1;0] [0;1;1] [1;0;0] [1;0;1] [1;1;0] [1;1;1]};
t={0 0 0 0 0 0 0 1};
net.adaptParam.passes=10;
net=adapt(net,p,t);
y = sim(net,p)

OUTPUT:
y=

[0] [0] [0] [0] [0] [0] [0] [1]

PERCEPTRON FOR SOLVING OR GATE:

p=[0 1 1 0 1 0 1 1 0 1 0 1 1 1 0;1 1 0 0 1 0 1 0 1 1 1 0 1 0 0];


t=[1 1 1 0 1 0 1 1 1 1 1 1 1 1 0];
net=newp(p,t);
net=init(net);
y=sim(net,p)
net.trainParam.epochs=20;
net=train(net,p,t);
y=sim(net,p)

OUTPUT:

y = 1 1 1 0 1 0 1 1 1 1 1 1 1 1 0
PERCEPTRON FOR SOLVING XOR GATE FOR 2 INPUTS:
p = [0 1 0 1;0 0 1 1];
t = [0 1 1 0];
net=newff(p,t,3,{},'traingd');
net.trainparam.show=50;
net.trainparam.lr=0.05;
net.trainparam.epochs=1000;
net.trainparam.goal=1e-5;
[net,tr] = train(net,p,t)
a = sim(net,p)
OUTPUT:

a =

0.0042 0.9983 0.9970 0.0032
2. A PROGRAM TO IMPLEMENT BACKPROPAGATION NETWORK

clear all;
disp('BACK PROPAGATION NETWORK');
v=[-0.7 -0.4; -0.2 0.3]; % input-to-hidden weights
x=[0 1];                 % input pattern
t=1;                     % target output
w=[0.5;0.1];             % hidden-to-output weights
wb=-0.3;                 % output bias
vb=[0.4 0.6];            % hidden biases
alpha=0.25;              % learning rate
temp=0;
for e=1:3                % three training epochs
e
% forward pass: hidden layer
for i=1:2
for j=1:2
temp=temp+(v(j,i)*x(j));
end
zin(i)=temp+vb(i);
fz(i)=1/(1+exp(-zin(i)));  % binary sigmoid activation
z(i)=fz(i);
fdz(i)=fz(i)*(1-fz(i));    % derivative of the sigmoid
temp=0;
end

% forward pass: output layer
for k=1
for j=1:2
temp=temp+z(j)*w(j,k);
end
yin(k)=temp+wb(k);
fy(k)=1/(1+exp(-yin(k)));
y(k)=fy(k);
temp=0;
end

% error term at the output layer
for k=1
fdy(k)=fy(k)*(1-fy(k));
delk(k)=(t(k)-y(k))*fdy(k);
end

% weight corrections for hidden-to-output weights
for k=1
for j=1:2
dw(j,k)=alpha*delk(k)*z(j);
end
dwb(k)=alpha*delk(k);
end

% backpropagate the error term to the hidden layer
for j=1:2
for k=1
delin(j)=delk(k) * w(j,k);
end
delj(j)=delin(j) * fdz(j);
end
w ,wb
% weight corrections for input-to-hidden weights
for i=1:2
for j=1:2
dv(i,j)=alpha * delj(j) * x(i);
end
dvb(i)=alpha * delj(i);
end

% apply the corrections
for j=1:2
for k=1
w(j,k)=w(j,k)+dw(j,k);
end
end
wb=wb+dwb(1);
for i=1:2
for j=1:2
v(i,j)=v(i,j)+dv(i,j);
end
vb(i)=vb(i) + dvb(i);
end
v ,vb
end

3. A PROGRAM TO IMPLEMENT VARIOUS PRIMITIVE

OPERATIONS ON CLASSICAL SETS

n=input('Enter how many elements in set A');


disp('Enter elements for set A');

for i=1:n
A(i)=input('Enter element');
end

m=input('Enter how many elements in set B');


disp('Enter elements for set B');

for i=1:m
B(i)=input('Enter element');
end

e=input('Enter how many elements in universal set E');


disp('Enter elements for set E');

for i=1:e
E(i)=input('Enter element');
end
disp('A U B');
AUB=union(A,B)
disp('A I B');
AIB=intersect(A,B)
disp('A - B');
AdB=setdiff(A,B)
disp('A complement');
Ac=setdiff(E,A)

OUTPUT:

4. A PROGRAM TO PERFORM VARIOUS LAWS ASSOCIATED

WITH CLASSICAL SET
n=input('Enter how many elements in set A');
disp('Enter elements for set A');

for i=1:n
A(i)=input('Enter element');
end

m=input('Enter how many elements in set B');


disp('Enter elements for set B');

for i=1:m
B(i)=input('Enter element');
end
p=input('Enter how many elements in set C');
disp('Enter elements for set C');
for i=1:p
C(i)=input('Enter element');
end
e=input('Enter how many elements in universal set E');
disp('Enter elements for set E');
for i=1:e
E(i)=input('Enter element');
end
disp('A U B');
AUB=union(A,B)
disp('B U A');
BUA=union(B,A)
disp('B U C');
BUC=union(B,C);
if(AUB==BUA)
disp('Commutative law for Union operation is satisfied');
else
disp('Commutative law for Union operation is not satisfied');
end
disp('A I B');
AIB=intersect(A,B)
disp('B I A');
BIA=intersect(B,A)
if(AIB==BIA)
disp('Commutative law for Intersection operation is satisfied');
else
disp('Commutative law for intersection operation is not satisfied');
end
disp('(A U B) U C');
AUBuC=union(AUB,C)
disp('A U (B U C)');
AuBUC=union(A,BUC)
if(AUBuC==AuBUC)
disp('Associative law for Union operation is satisfied');
else
disp('Associative law for Union operation is not satisfied');
end
disp('B I C');
BIC=intersect(B,C)
disp('A U C');
AUC=union(A,C)
AUBIC=union(A,BIC)
AUBIAUC=intersect(AUB,AUC)
if(isequal(AUBIC,AUBIAUC))
disp('Distributive law is satisfied');
else
disp('Distributive law is not satisfied');
end

disp('AUA');
AUA=union(A,A)
if(AUA == A)
disp('Idempotence law is satisfied');
else
disp('Idempotence law is not satisfied');
end

disp('Checking Demorgans Law');

disp('(AUB) Complement');
AUBc=setdiff(E,AUB)
disp('A Complement');
Ac=setdiff(E,A)

disp('B Complement');
Bc=setdiff(E,B)

disp('A Complement I B Complement');


AcIBc=intersect(Ac,Bc)

if(AUBc == AcIBc)
disp('Demorgans Law is satisfied');
else
disp('Demorgans Law is not satisfied');
end

OUTPUT:
5. PROGRAM TO PERFORM VARIOUS PRIMITIVE OPERATIONS
ON FUZZY SETS WITH DYNAMIC COMPONENT

u=input('Enter the first fuzzy set A');


v=input('Enter the second fuzzy set B');
disp('union of A and B');
w=max(u,v);
disp('Intersection of A and B');
p=min(u,v);
[m]=size(u);
disp('Complement of A');
q1=ones(m)-u;
[n]=size(v);
disp('Complement of B');
q2=ones(n)-v;

OUTPUT:
6. PROGRAM TO VERIFY VARIOUS LAWS ASSOCIATED WITH
FUZZY SET

clear all;
clc;
disp('Fuzzy set properties');
a=[0 1 0.5 0.4 0.6];
b=[0 0.5 0.7 0.8 0.4];
c=[0.3 0.9 0.2 0 1];
phi=[0 0 0 0 0];
disp('Union of a and b');
au=max(a,b)
disp('Intersection of a and b');
iab=min(a,b)
disp('union of b and a');
bu=max(b,a)
if(au==bu)
disp('commutative law is satisfied');
else
disp('commutative law is not satisfied');
end

disp('union of b and c');


cu=max(b,c)
disp('a U (b U c)');
acu=max(a,cu)
disp('(a U b) U c');
auc=max(au,c)
if(acu == auc)
disp('Associative law is satisfied');
else
disp('Associative law is not satisfied');
end

disp('Intersection of b and c');


ibc=min(b,c)
disp('a U (b I c)');
dls=max(a,ibc)
disp('union of a and c');
uac=max(a,c)
disp('(a U b) I (a U c)');
drs=min(au,uac)
if(dls==drs)
disp('Distributive law is satisfied');
else
disp('Distributive law is not satisfied');
end

disp('a U a');
idl=max(a,a)
a
if(idl==a)
disp('Idempotency law is satisfied');
else
disp('Idempotency law is not satisfied');
end
disp(' a U phi');
idtl=max(a,phi)
a
if(idtl==a)
disp('Identity law is satisfied');
else
disp('Identity law is not satisfied');
end
disp('complement of (a I b)');
for i=1:5
ciab(i)=1-iab(i);
end
ciab
disp('complement of a');
for i=1:5
ca(i)=1-a(i);
end
ca
disp('complement of b');
for i=1:5
cb(i)=1-b(i);
end
cb
disp('a complement U b complement');
dml=max(ca,cb)
if(dml == ciab)
disp('Demorgan law is satisfied');
else
disp('Demorgan law is not satisfied');
end
disp('Complement of complement of a');
for i=1:5
cca(i)=1-ca(i);
end
cca
a
if(a==cca)
disp('Involution law is satisfied');
else
disp('Involution law is not satisfied');
end

OUTPUT:
7. PROGRAM TO PERFORM CARTESIAN PRODUCT OVER TWO
GIVEN FUZZY SETS
clear all;
n=input('Enter how many term in fuzzy set A');
m=input('Enter how many term in fuzzy set B');
disp('Enter membership value for A');
for i=1:n
a(i)=input('Enter value');
end
a

disp('Enter membership value for B');


for i=1:m
b(i)=input('Enter value');
end
b
for i=1:n
for j=1:m
aXb(i,j)=min(a(i),b(j));
end
end
aXb

OUTPUT:
8. A PROGRAM TO PERFORM MAX-MIN COMPOSITION OF TWO
MATRICES OBTAINED FROM CARTESIAN PRODUCT
clear all;
n=input('Enter dimension of square matrices R & S');
disp('Enter values for first Matrix R:');
for i=1:n
for j=1:n
R(i,j)=input('Enter value');
end
end
disp('Enter values for second Matrix S:');
for i=1:n
for j=1:n
S(i,j)=input('Enter value');
end
end
for i=1:n
for j=1:n
for k=1:n
c(k)=min(R(i,k),S(k,j));
end
RoS(i,j)=max(c);
end
end
RoS
OUTPUT:
9. A PROGRAM TO CONSTRUCT AND TEST
AUTOASSOCIATIVE NETWORK FOR INPUT VECTOR USING
HEBB RULE.
clear all;
disp('AUTO ASSOCIATIVE NETWORK-----HEBB RULE');
w=[0 0 0 0; 0 0 0 0; 0 0 0 0; 0 0 0 0];
s=[1 1 1 -1];
t=[1 1 1 -1];
ip=[1 -1 -1 -1];
disp('INPUT VECTOR');
s
for i=1:4
for j=1:4
w(i,j)=w(i,j)+(s(i)*t(j));
end
end
disp('WEIGHTS TO STORE THE GIVEN VECTOR IS');
w
disp('TESTING THE NET WITH VECTOR');
ip
yin=ip*w;
for i=1:4
if yin(i)>0
y(i)=1;
else
y(i)=-1;
end
end
y
if (y==s)
disp('PATTERN IS RECOGNIZED');
else
disp('PATTERN IS NOT RECOGNIZED');
end
OUTPUT:

10. A PROGRAM TO CREATE A FEED FORWARD NETWORK


AND PERFORM BATCH TRAINING

p=[-1 -1 2 2 ; 0 5 0 5];
t=[-1 -1 1 1];
net=newff(minmax(p),[3,1], {'tansig','purelin'},'traingd');
net.trainParam.show=50;
net.trainParam.lr=0.05;
net.trainParam.epochs=300;
net.trainParam.goal=1e-5;
[net,tr]=train(net,p,t);
a=sim(net,p)

OUTPUT:
11. TO CREATE A MULTILAYER PERCEPTRON NETWORK

clear all;
x=0:0.06:2;
y=sin(x);
p=x;
t=y;
net=newff([0 2],[5,1], {'tansig','purelin'},'traingd');
net.trainParam.show=50;
net.trainParam.lr=0.05;
net.trainParam.epochs=500;
net.trainParam.goal=1e-3;
net1=train(net,p,t);
a=sim(net1,p)
OUTPUT:
12. Write a program to implement composition of Fuzzy and
Crisp sets.

MATLAB Source code :

%Write a program to implement composition of Fuzzy and Crisp sets


%The operation executed on two compatible binary relations to get a
%single binary relation is called composition

clear all;
clc;

disp('Composition of crisp relations');


a=[0.2 0.6]
b=[0.3 0.5]
c=[0.6 0.7]

for i=1:2
r(i)=a(i)*b(i);
s(i)=b(i)*c(i);
end
r
s
irs=min(r,s)
disp('Crisp - composition of r and s using max-min composition');
crs=max(irs);

for i=1:2
prs(i)=r(i)*s(i);
end
prs

disp('Crisp composition of r and s using max product composition');


mprs=max(prs);
mprs
disp('Fuzzy composition:');
disp('==================');

firs=min(r,s);
disp('Fuzzy composition of r and s using max-min composition');
frs=max(firs);

for i=1:2
fprs(i)=r(i)*s(i);
end
fprs
disp('Fuzzy composition of r and s using max-product composition');
fmprs=max(fprs)
Output :

Composition of crisp relations

a=

0.2000 0.6000
b= 0.3000 0.5000

c=

0.6000 0.7000

r=

0.0600 0.3000

s=

0.1800 0.3500

irs =

0.0600 0.3000

Crisp - composition of r and s using max-min composition

prs =

0.0108 0.1050

Crisp composition of r and s using max product composition

mprs =

0.1050

Fuzzy composition:

==================

Fuzzy composition of r and s using max-min composition

fprs = 0.0108 0.1050

Fuzzy composition of r and s using max-product composition

fmprs = 0.1050

13. Use GATOOL to minimize the quadratic function f(x) = x^2 + 3x + 2 in the range -6 to 0.
MATLAB Source code: (Used GATOOL)

function z=quadratic(x)
z=(x*x+3*x+2);

Output and GRAPHs:

The range -6 to 0 is specified in the tool.
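For reference, the same minimization can also be run from the MATLAB command line with the `ga` function instead of the GATOOL GUI; this sketch reuses the `quadratic.m` fitness file above, with `LB` and `UB` reproducing the -6 to 0 range:

```matlab
% Command-line equivalent of the GATOOL run:
% minimize f(x) = x^2 + 3x + 2 subject to -6 <= x <= 0
LB = -6; % lower bound of the search range
UB = 0;  % upper bound of the search range
[x, fval] = ga(@quadratic, 1, [], [], [], [], LB, UB)
% analytically, the minimum lies at x = -1.5 with f(x) = -0.25
```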


14. To create and train a small feed forward neural network for character
recognition (with and without noise) where 26 characters are given as 5*7
bitmap matrix for each letter. Discuss the network and training.
[alphabet,targets] = prprob;

net = newff(alphabet,targets,25);

% TRAINING THE NETWORK WITHOUT NOISE


net1 = net;
net1.divideFcn = '';
[net1,tr] = train(net1,alphabet,targets);

% TRAINING THE NETWORK WITH NOISE


numNoisy = 10;
alphabet2 = [alphabet repmat(alphabet,1,numNoisy)+randn(35,26*numNoisy)*0.2];
targets2 = [targets repmat(targets,1,numNoisy)];
net2 = train(net,alphabet2,targets2);

% SET TESTING PARAMETERS


noise_range = 0:.05:.5;
max_test = 100;
network1 = [];
network2 = [];

% PERFORM THE TEST


for noiselevel = noise_range
fprintf('Testing networks with noise level of %.2f.\n',noiselevel);
errors1 = 0;
errors2 = 0;

for i=1:max_test
x = alphabet + randn(35,26)*noiselevel;

% TEST NETWORK 1
y = sim(net1,x);
yy = compet(y);
errors1 = errors1 + sum(sum(abs(yy-targets)))/2;

% TEST NETWORK 2
yn = sim(net2,x);
yyn = compet(yn);
errors2 = errors2 + sum(sum(abs(yyn-targets)))/2;
end

% RECORD AVERAGE ERROR FRACTION AT THIS NOISE LEVEL
network1 = [network1 errors1/26/max_test];
network2 = [network2 errors2/26/max_test];
end

% PLOT THE RESULTS
clf
plot(noise_range,network1*100,'--',noise_range,network2*100);
title('Percentage of Recognition Errors');
xlabel('Noise Level');
ylabel('Network 1 _ _ Network 2 ___');

Output and graph :


15. To minimize a multimodal function of two variables of suitable range
through Genetic Algorithm.

MATLAB Source code :

% multimodal function is rastrigin function

% create .m file containing following code

options = gaoptimset('Generations',300);
record=[];
for n=0:.1:1
options = gaoptimset(options,'CrossoverFraction', n);

LB = [-10 -10]; % Lower bound
UB = [10 10];   % Upper bound

[x,fval]=ga(@rastriginsfcn,2,[],[],[],[],LB,UB,[],options);
x
fval
record = [record; fval];
end
plot(0:.1:1, record);
xlabel('Crossover Fraction');
ylabel('fval');

Output

Optimization terminated: average change in the fitness value less than options.TolFun.

x=

0.0129 0.0121

fval =

0.0619

Optimization terminated: average change in the fitness value less than options.TolFun.

x=

-0.0295 -0.0520

fval =

0.7045

Optimization terminated: average change in the fitness value less than options.TolFun.

x=

-0.0328 -0.0241

fval =

0.3282
6. EXPECTED VIVA VOCE QUESTIONS
1. Discuss different modules of neurons.
2. What is memory based learning? Discuss about it.
3. What is bias? Compare weights and bias.
4. Write different applications of neural networks.
5. Discuss how neural networks help in solving AI problems.
6. Differentiate between feed forward and feedback networks.
7. Draw the McCulloch-Pitts (MP) neuron model. Write the main differences between the MP and
perceptron models.
8. Which signal function is known as maximum-entropy signal function? Discuss it in brief.
9. A recurrent network has 3 source nodes, 2 hidden neurons and 4 output neurons. Construct
an architectural graph that describes such a network.
10. What is a linear associator? Explain with the help of the Hebbian rule.
11. State and prove Perceptron Convergence Theorem
12. Discuss learning rate annealing techniques.
13. What is Adaptive Filtering problem?
14. Discuss Bayes Classifier System in pattern classification.
15. What is multi-layer perceptron?
16. Explain feature extraction procedure in a multilayer perceptron.
17. What is backpropagation? What are its limitations?
18. Why is backpropagation learning also called the generalized Delta Rule?
19. Why is convergence not guaranteed for the backpropagation learning algorithm?
20. Distinguish between multilayer perceptron and a general multilayer feedforward neural
network.
21. Explain fuzzy set , fuzzy system and fuzzy logic.
22. Develop a reasonable membership function for the following fuzzy sets based on height
measured in centimeters
a. (i) “tall’ (ii) “short” (iii) “not short”

23. Write note on Fuzzy rule generation.


24. Give a comparative analysis of neural and fuzzy systems for the representation of structural
knowledge.
25. Explain Fuzzy Neural Net.
26. Following are some statements-
a. Sophia is tall
b. Sheena is short.
c. Chinese are not very tall.
d. Mostly Indians are of brown complexion.
e. Generally, 5’8” above are called tall.
27. Define fuzzy set which can be used to represent the list of above statement.
28. What is Crisp set?
29. Distinguish between fuzzy set and classical set.
30. Explain difference between Fuzzy Theory and Probability Theory.
31. Explain decision making in fuzzy system.
32. Explain fuzzy-rule based system.
33. What are membership functions?
34. What are the types of defuzzification methods?
35. What is meant by Rank ordering?
36. What are the methods of membership value assignment?
37. Explain closed loop control system using fuzzy logic.
38. Compare fuzzy logic control systems and neural network control systems.
39. Explain the problems in control system design.
40. Write the structures of fuzzy production rule system.
41. Write note on (i) Linear Genetic Programming
42. What are the advantages of evolutionary algorithms?
43. Explain and draw Flow Chart of basic genetic algorithm iteration.
44. Explain Roulette Wheel Selection.
45. What are the two genetic operators?
46. Explain Schema Theorem.
47. Write applications of Genetic Algorithm.
48. Explain encoding & decoding of Genetic Algorithm.
49. Explain Generation cycle.
50. Write the procedure of a Genetic Algorithm.
7. REFERENCES

1. J. A. Anderson, “An Introduction to Neural Networks”, PHI, 1999.


2. J. Hertz, A. Krogh, R.G. Palmer, “Introduction to the Theory of Neural
Computation”, Addison-Wesley, California, 1991.
3. G.J. Klir & B. Yuan, “Fuzzy Sets & Fuzzy Logic”, PHI, 1995.
4. S. Haykin, “Neural Networks: A Comprehensive Foundation”, Prentice-Hall
International, New Jersey, 1999.
5. J. A. Freeman, D.M. Skapura, “Neural Networks: Algorithms, Applications
and Programming Techniques”, Addison Wesley, Reading, Mass, (1992).
6. Melanie Mitchell, “An Introduction to Genetic Algorithm”, PHI, 1998.
7. MATLAB Tutorials.
