Soft Computing PPT Module1
Introduction to Soft Computing. Difference between Hard Computing & Soft Computing.
Applications of Soft Computing. Artificial Neurons Vs Biological Neurons. Basic models
of artificial neural networks – Connections, Learning, Activation Functions. McCulloch
and Pitts Neuron. Hebb network.
Text Books
1. S. N. Sivanandam and S. N. Deepa, "Principles of Soft Computing", 2nd Edition, John Wiley & Sons.
2. Kalyanmoy Deb, "Multi-objective Optimization using Evolutionary Algorithms", 1st Edition, John Wiley & Sons.
Reference Books
1. Timothy J. Ross, "Fuzzy Logic with Engineering Applications", John Wiley & Sons, 2016.
2. S. Rajasekaran and G. A. Vijayalakshmi Pai, "Neural Networks, Fuzzy Logic & Genetic Algorithms: Synthesis and Applications", Prentice-Hall India.
3. Simon Haykin, "Neural Networks: A Comprehensive Foundation", 2/e, Pearson Education.
4. Zimmermann H. J., "Fuzzy Set Theory and Its Applications", Allied Publishers Ltd.
INTRODUCTION TO SOFT COMPUTING
• The idea of soft computing was initiated in 1981 by Lotfi A. Zadeh. The role model for soft computing is the human mind.
• Soft computing is a term used in computer science for problems whose solutions are unpredictable and uncertain, with truth values that can lie anywhere between 0 and 1.
• It is designed to model solutions to real-world problems that are not modeled, or are too difficult to model, mathematically.
PROBLEM SOLVING TECHNIQUES
Two Techniques:
• Hard Computing: deals with precise models where accurate solutions are achieved quickly.
• Soft Computing: deals with approximate models and gives solutions to complex problems.
Fig 1.1: Problem Solving Technologies
SOFT COMPUTING
• Soft computing deals with approximate models and gives solutions to complex problems.
• Introduced by Professor Lotfi Zadeh.
• The ultimate goal is to be able to emulate the human mind as closely as possible.
• Soft computing involves a combination of:
  – Genetic algorithms
  – Neural networks
  – Fuzzy logic
APPLICATION OF SOFT COMPUTING
• History
• 1943 - McCulloch & Pitts are generally recognised as the designers of the first neural network.
• 1949 - First learning rule (Hebb).
• 1969 - Minsky & Papert expose the perceptron's limitations - decline of ANN research ("death of ANN").
• 1980s - Re-emergence of ANN - multi-layer networks.
Brain and Machine
• The Brain
– Pattern Recognition
– Association
– Complexity
– Noise Tolerance
• The Machine
– Calculation
– Precision
– Logic
Brain vs. Computer –
Comparison Between Biological Neuron and Artificial
Neuron
1. Speed:
• The cycle time of execution in the ANN is a few nanoseconds, whereas in a biological neuron it is a few milliseconds.
• Hence, the artificial neuron modeled using a computer is much faster.
2. Processing:
• Both the biological neuron and the artificial neuron can perform massive
parallel operations simultaneously.
• But, in general, the ANN process is faster than that of the brain.
3. Size and complexity:
• The total number of neurons in the brain is about 10^11 and the total number of interconnections is about 10^15.
• Hence, the complexity of the brain is comparatively higher; the computational work takes place not only in the brain cell body, but also in the axon, synapse, etc.
• The size and complexity of an ANN is based on the chosen application and the
network designer.
• The size and complexity of a biological neuron is more than that of an
artificial neuron.
4. Storage capacity (memory):
• A disadvantage of the brain is that sometimes its memory may fail to recollect the stored information.
5. Fault tolerance:
• The biological neuron possesses fault-tolerant capability, whereas the artificial neuron has no inherent fault tolerance.
• Even when some cells die, the human nervous system appears to keep performing with the same efficiency.
6. Control mechanism:
• In an artificial neuron modeled using a computer, there is a control unit present in the Central Processing Unit, which can transfer and control precise scalar values from unit to unit.
• There is no such control unit for monitoring in the brain.
• The strength of a neuron in the brain depends on the active chemicals present and on whether the neuron connections are strong or weak.
• An ANN possesses simpler interconnections and is free from chemical actions.
• Thus, the control mechanism of an artificial neuron is very simple compared with that of a biological neuron.
ANN possesses the following characteristics: it consists of a large number of highly interconnected processing elements (neurons), and the interconnections with their weighted linkages hold the informative knowledge.
[Figure: a simple artificial neuron — input neurons X1 and X2 connected through weights w1 and w2 to the output neuron Y (net input y_in, output y)]
Artificial neuron
• In the simple neuron net architecture above, x1 and x2 are the activations of the input neurons X1 and X2, i.e. the outputs of the input signals.
• The net input to the output neuron Y is calculated as:
    y_in = x1·w1 + x2·w2
• The output y of the output neuron Y is obtained by applying an activation function over the net input:
    y = f(y_in)
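The net-input and activation computation above can be sketched in Python (the helper names `net_input` and `identity` are illustrative, not from the text):

```python
# Sketch of a simple artificial neuron: net input is the weighted sum
# of the input activations; the output applies an activation function.

def net_input(inputs, weights):
    """y_in = x1*w1 + x2*w2 + ... (weighted sum of inputs)."""
    return sum(x * w for x, w in zip(inputs, weights))

def identity(y_in):
    """Identity activation: output equals the net input."""
    return y_in

# Illustrative values: x1 = 0.5, x2 = 0.8 with weights w1 = 0.4, w2 = 0.6
y_in = net_input([0.5, 0.8], [0.4, 0.6])
y = identity(y_in)
print(y)  # 0.68
```

Any activation function f can be substituted for `identity`; the binary step function introduced later is one common choice.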
Basic Models of ANN
The models of an ANN are specified by three basic entities:
1. The synaptic interconnections (types of layers and their connections)
2. The learning or training rules adopted for updating the weights
3. Activation functions
Synaptic interconnections
• An ANN consists of a set of highly interconnected processing elements (neurons) such that each processing element's output is connected through weights to the other processing elements or to itself.
• The arrangement of neurons to form layers and the connection pattern formed within and between layers is called the network architecture.
• There are five basic types of neuron connection architectures:
1. Single-layer feed-forward network
2. Multilayer feed-forward network
3. Single node with its own feedback
4. Single-layer recurrent network
5. Multilayer recurrent network
Learning
• Two broad kinds of learning in ANNs:
  – Parameter learning: updates the connecting weights of the network.
  – Structure learning: focuses on changes in the network structure.
• Three categories of learning:
1. Supervised learning
2. Unsupervised learning
3. Reinforcement learning
• In unsupervised learning the network learns by itself; for example, a child fish learns to swim by itself, it is not taught by its mother.
Activation functions
1. Identity function
• The output here remains the same as the input. The input layer uses the identity activation function:
    f(x) = x for all x
2. Binary step function
• This function can be defined as:
    f(x) = 1 if x ≥ Ɵ
    f(x) = 0 if x < Ɵ
• The threshold Ɵ is a set value based upon which the final output of the network may be calculated.
• A comparison is made between the calculated net input and the threshold to obtain the network output.
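A minimal sketch of the binary step function, with `theta` standing for the threshold Ɵ described above (the function name is illustrative):

```python
# Binary step activation: fires (1) when the net input reaches the
# threshold theta, otherwise outputs 0.

def binary_step(y_in, theta=0.0):
    """f(y_in) = 1 if y_in >= theta, else 0."""
    return 1 if y_in >= theta else 0

print(binary_step(0.7, theta=0.5))  # 1
print(binary_step(0.3, theta=0.5))  # 0
```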
McCulloch-Pitts Neuron
• The McCulloch-Pitts neuron, discovered in 1943, was the earliest neural network.
• It is usually called the M-P neuron.
• The M-P neurons are connected by directed weighted paths.
• The activation of an M-P neuron is binary, that is, at any time step the neuron may fire or may not fire.
• The weights associated with the communication links may be excitatory (weight is positive) or inhibitory (weight is negative).
• All the excitatory connection weights entering into a particular neuron have the same weight.
McCulloch-Pitts Neuron
• The threshold plays a major role in the M-P neuron: there is a fixed threshold for each neuron, and if the net input to the neuron is greater than the threshold, then the neuron fires.
• Any nonzero inhibitory input would prevent the neuron from firing.
• The M-P neurons are most widely used in the case of logic functions.
McCulloch-Pitts Neuron - Architecture
• The M-P neuron has both excitatory and inhibitory connections.
• A connection is excitatory with weight w (w > 0) or inhibitory with weight -p (p > 0).
• Inputs X1 to Xn possess excitatory weighted connections (each of weight w), and inputs Xn+1 to Xn+m possess inhibitory weighted connections (each of weight -p).
• Since the firing of the output neuron is based upon the threshold Ɵ, the activation function here is defined as:
    f(y_in) = 1 if y_in ≥ Ɵ
    f(y_in) = 0 if y_in < Ɵ
McCulloch-Pitts Neuron
• For inhibition to be absolute, the threshold with the activation function should satisfy the following condition:
    Ɵ > n·w - p
• The output will fire if it receives say "k" or more excitatory inputs but no inhibitory inputs, where
    k·w ≥ Ɵ > (k - 1)·w
• Here the weights of the neuron are set along with the threshold to make the neuron perform a simple logic function.
• The M-P neurons are used as building blocks with which we can model any function or phenomenon that can be represented as a logic function.
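These two threshold conditions can be checked numerically; the values of n, w, p and Ɵ below are illustrative toy choices, not taken from the text:

```python
# Checking the M-P neuron threshold conditions for a toy configuration:
# 2 excitatory inputs of weight w = 1, one inhibitory weight -p with p = 1.
n, w, p = 2, 1, 1
theta = 2                              # chosen threshold

# Absolute inhibition: theta > n*w - p  (here 2 > 2*1 - 1 = 1)
print(theta > n * w - p)               # True

# The neuron fires on k or more excitatory inputs (and no inhibitory
# input) when k*w >= theta > (k - 1)*w; here k = 2 gives 2 >= 2 > 1.
k = 2
print(k * w >= theta > (k - 1) * w)    # True
```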
Q) Implement AND function using McCulloch-Pitts
neuron (take binary data).
• Solution: Consider the truth table for AND function (Table 1).
• In McCulloch-Pitts neuron, only analysis is being performed.
• Hence, assume the weights be w1 = 1 and w2 = 1.
• The network architecture is shown in Figure.
• With these assumed weights (w1 = w2 = 1), the net input is calculated for the four inputs (x1, x2):
    (1, 1): y_in = 1·1 + 1·1 = 2
    (1, 0): y_in = 1·1 + 0·1 = 1
    (0, 1): y_in = 0·1 + 1·1 = 1
    (0, 0): y_in = 0·1 + 0·1 = 0
• For an AND function, the output is high only if both the inputs are high. For this condition, the net input is calculated as 2.
• Hence, based on this net input, the threshold is set: if the net input is greater than or equal to 2, the neuron fires, else it does not fire. So the threshold value is set equal to 2 (Ɵ = 2).
• This can also be obtained from Ɵ ≥ n·w - p. Here n = 2, w = 1 (excitatory weights) and p = 0 (no inhibitory weights). Substituting these values, Ɵ ≥ 2·1 - 0 = 2.
• Thus, the output of neuron Y can be written as:
    y = f(y_in) = 1 if y_in ≥ 2
    y = f(y_in) = 0 if y_in < 2
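The AND analysis (w1 = w2 = 1, Ɵ = 2) can be verified with a small sketch; `mp_neuron` is an illustrative helper name, not from the text:

```python
# McCulloch-Pitts neuron: fires (1) iff the net input reaches theta.
def mp_neuron(inputs, weights, theta):
    y_in = sum(x * w for x, w in zip(inputs, weights))
    return 1 if y_in >= theta else 0

# AND: w1 = w2 = 1, theta = 2 — only (1, 1) fires.
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, mp_neuron([x1, x2], [1, 1], theta=2))
```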
Q)Implement ANDNOT function using McCulloch-Pitts
neuron (use binary data representation).
Solution:
• In the case of ANDNOT function, the response is
true if the first input is true and the second input is
false.
• For all other input variations, the response is false.
• The given function gives an output only when
x1 = 1 and x2= 0.
• The weights have to be decided only after the
analysis. The net can be represented as shown in
Figure.
• ANDNOT: Y = A AND (NOT B). For example, with A = 1 and B = 0: 1 AND (NOT 0) = 1 AND 1 = 1.
• The truth table for the ANDNOT function is given in Table.
Case 1:
• Assume that both weights w1 and w2 are excitatory, i.e., w1 = w2 = 1.
• Then for the four inputs calculate the net input using y_in = x1·w1 + x2·w2.
• From the calculated net inputs, it is not possible to fire the neuron for input (1, 0) only: any threshold that fires for (1, 0) would also fire for (1, 1).
• Hence, these weights are not suitable.
Case 2:
• Assume one weight as excitatory and the other as inhibitory, i.e., w1 = 1, w2 = -1.
• From the calculated net inputs, it is now possible to fire the neuron for input (1, 0) only, by fixing a threshold of 1, i.e., Ɵ ≥ 1 for unit Y. Thus,
    w1 = 1; w2 = -1; Ɵ ≥ 1
• Note: The value of Ɵ is calculated using Ɵ ≥ n·w - p, i.e., Ɵ ≥ 2·1 - 1 = 1.
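The Case 2 solution (w1 = 1, w2 = -1, Ɵ = 1) can be verified with a short sketch (`mp_neuron` is an illustrative helper name):

```python
# McCulloch-Pitts neuron: fires (1) iff the net input reaches theta.
def mp_neuron(inputs, weights, theta):
    y_in = sum(x * w for x, w in zip(inputs, weights))
    return 1 if y_in >= theta else 0

# ANDNOT: w1 = 1 (excitatory), w2 = -1 (inhibitory), theta = 1.
# Only (1, 0) fires: y_in = 1*1 + 0*(-1) = 1 >= theta.
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, mp_neuron([x1, x2], [1, -1], theta=1))
```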
Q) Implement XOR function using McCulloch-Pitts neuron (consider binary data).
• A single M-P neuron cannot realize XOR: the function is not linearly separable, so the final weights obtained after presenting all the input patterns do not give the correct output for all patterns.
• XOR is instead realized with two layers: z1 = x1 AND NOT x2, z2 = x2 AND NOT x1, and y = z1 OR z2.
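A sketch of the standard two-layer M-P construction for XOR, reusing the ANDNOT weights derived earlier (helper names are illustrative):

```python
# McCulloch-Pitts neuron: fires (1) iff the net input reaches theta.
def mp_neuron(inputs, weights, theta):
    y_in = sum(x * w for x, w in zip(inputs, weights))
    return 1 if y_in >= theta else 0

def xor(x1, x2):
    z1 = mp_neuron([x1, x2], [1, -1], theta=1)   # x1 AND NOT x2
    z2 = mp_neuron([x2, x1], [1, -1], theta=1)   # x2 AND NOT x1
    return mp_neuron([z1, z2], [1, 1], theta=1)  # z1 OR z2

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, xor(x1, x2))  # matches XOR: 0, 1, 1, 0
```

The hidden layer is what makes this possible: each hidden unit carves out one of the two "exactly one input high" cases, and the output unit ORs them together.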