
BRIAN MUINDE

P101/0928G/17

ARTIFICIAL NEURAL NETWORK

CAT 1

ATTEMPT ALL QUESTIONS

SUBMIT ONLY THROUGH ODEL PLATFORM

DUE DATE: 21st Wed 2021 by 7.00 PM

TYPED ONLY

QUESTION ONE

a) Briefly explain the following concepts

i. learning (2 Marks)

Learning is the ability of the network to learn from its environment and to improve its performance through that learning.

ii. neural networks (2 Marks)

Neural networks are networks of neurons, for example as found in real (i.e. biological) brains.

iii. Artificial Neuron (2 Marks)

Artificial neurons are crude approximations of the neurons found in real brains. They may be physical devices or purely mathematical constructs.

b) Describe four motivations of machine learning technology (4 Marks)

Can computers take decisions (even smarter ones) just like humans?

Can computers help humans in doing their tasks of daily living?


Can we build a smart eco-system where users get feedback and systems can update their
actions?

Can we develop technology to learn from human behaviour?

c) Briefly explain five components of a learning agent (4 Marks)

The performance element component selects the action to take.

The critic element gives feedback.

The learning element uses the feedback to make the action better next time.

The problem generator suggests new experiences for the learning agent to learn from.

d) Describe the importance of the following components of a biological neuron (5 Marks)

i. Soma

The soma processes the incoming activations and converts them into output activations.

ii. Dendrite

Dendrites are fibres that emanate from the cell body and provide the receptive zone that receives activation from other neurons.

iii. Axon Hillock

is a specialized part of the cell body (or soma) of a neuron that connects to the axon. It can be
identified using light microscopy from its appearance and location in a neuron and from its
sparse distribution of Nissl substance.

iv. Myelin sheath

is an insulating layer, or sheath that forms around nerves, including those in the brain and
spinal cord. It is made up of protein and fatty substances.

v. Nodes of Ranvier

These are periodic gaps in the insulating sheath (myelin) on the axon of certain neurons that serve to facilitate the rapid conduction of nerve impulses.

e) Briefly explain how information flows in a neural cell (6 Marks)

Working of a Biological Neuron

As shown in the above diagram, a typical neuron consists of the following four parts with the
help of which we can explain its working −

Dendrites − They are tree-like branches, responsible for receiving the information from the other neurons the neuron is connected to. In a sense, they are like the ears of the neuron.

Soma − It is the cell body of the neuron and is responsible for processing the information received from the dendrites.

Axon − It is just like a cable through which neurons send the information.

Synapses − These are the connections between the axon and the dendrites of other neurons.

f) Describe the significance of synapse strength between any two neurons (2 Marks).

The strength of a synapse determines its ability to show synaptic plasticity, and this is the fundamental property of neurons that confers on the human brain its capacity for memory, learning and intelligence, which in turn forms the basis of all higher intellectual functions.
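The activity-dependent strengthening described above is often summarised by Hebb's rule ("neurons that fire together wire together"). As a hedged sketch, not part of the answer above, with an illustrative learning rate and made-up activity values:

```python
# Hebbian learning sketch: the weight grows when pre- and post-synaptic
# activity coincide. The rule, rate and values are illustrative assumptions.

def hebbian_update(w, pre, post, eta=0.1):
    """Return the weight after one Hebbian step with learning rate eta."""
    return w + eta * pre * post

w = 0.5
w = hebbian_update(w, pre=1.0, post=1.0)   # correlated activity: w -> 0.6
w = hebbian_update(w, pre=0.0, post=1.0)   # no presynaptic activity: unchanged
print(w)
```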

g) Briefly explain three applications that are performed by artificial neural networks. (3 Marks)

1. Computer games – intelligent agents, chess, backgammon

2. Robotics – autonomous adaptable robots

3. Pattern recognition – speech recognition, seismic activity, sonar signals


4. Data analysis – data compression, data mining

QUESTION TWO

a) Briefly describe the flow of information in a McCulloch-Pitts Neuron (4 Marks).

This vastly simplified model of real neurons is also known as a Threshold Logic Unit:

1. A set of synapses (i.e. connections) brings in activations from other neurons.

2. A processing unit sums the inputs and then applies a nonlinear activation function.

3. An output line transmits the result to other neurons.

How it works:

• Each input Ii is multiplied by a weight wji (synaptic strength)

• These weighted inputs are summed to give the activation level, Aj

• The activation level is then transformed by an activation function to produce the neuron’s
output, Yj

• Wji is known as the weight from unit i to unit j

– Wji > 0, synapse is excitatory

– Wji < 0, synapse is inhibitory

• Note that Ii may be

– External input
– The output of some other neuron
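The flow described above can be sketched as a minimal Threshold Logic Unit; the weights and thresholds below are illustrative assumptions, not values from the question:

```python
# McCulloch-Pitts style Threshold Logic Unit: weighted sum of the inputs,
# then a hard threshold. Weights/threshold values here are made up.

def tlu(inputs, weights, threshold):
    activation = sum(w * x for w, x in zip(weights, inputs))
    return 1 if activation >= threshold else 0

# Excitatory weights (w > 0) with threshold 2 implement logical AND:
print(tlu([1, 1], [1, 1], threshold=2))   # 1
print(tlu([1, 0], [1, 1], threshold=2))   # 0
# An inhibitory weight (w < 0) can veto firing:
print(tlu([1, 1], [2, -3], threshold=1))  # 0
```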

b) State any three features that are missing in the McCulloch-Pitts neuron model (3 Marks)

In the McCulloch-Pitts neuron model, the missing features are:

- Non-binary input and output,

- Non-linear summation,

- Smooth thresholding,

- Stochastic behaviour, and

- Temporal information processing.

c) Artificial neuron consists of four basic components. Describe each of these components. Use
a diagram to illustrate your answer (6 Marks)

Inputs and weights (synapses) − Each input is multiplied by a weight that represents the synaptic strength of the connection; positive weights are excitatory and negative weights inhibitory.

Summing junction − Adds the weighted inputs together to produce the neuron's activation level.

Activation (transfer) function − Transforms the activation level, usually non-linearly, into the neuron's output.

Output − Transmits the result to other neurons.

d) There are two types of threshold functions. State and explain each of them. Use a diagram to
illustrate your answer (6 Marks)

Hard threshold function − the output switches abruptly between two fixed values (e.g. 0 and 1) as soon as the activation crosses the threshold, as in a step function.

Soft threshold function − the output changes smoothly and gradually around the threshold, as in a sigmoid function.
e) Explain the significance of synapse strength in a biological neuron (1 Mark)

Synapse strength reflects the ability of synapses to strengthen or weaken over time, in response to increases or decreases in their activity.

QUESTION THREE

a) Briefly explain the meaning of the following terms

i. back propagation (2 Marks)

It is the method of fine-tuning the weights of a neural net based on the error rate obtained in
the previous epoch (i.e., iteration). Proper tuning of the weights allows you to reduce error
rates and to make the model reliable by increasing its generalization.

ii. Network training: (2 Marks)

involves feedforward of data signals to generate the output and then the backpropagation of
errors for gradient descent optimization.

iii. Epoch (2 Marks)

An epoch is when the ENTIRE dataset is passed forward and backward through the neural network exactly once. Since one epoch is usually too big to feed to the computer at once, we divide it into several smaller batches.
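The epoch/batch split just described can be sketched as follows; the dataset size and batch size are made-up illustrations:

```python
# One epoch = one full pass over the dataset, delivered in minibatches.
# Dataset and batch size are arbitrary illustrations.

dataset = list(range(10))   # 10 training examples
batch_size = 4

def minibatches(data, size):
    """Yield successive chunks of the dataset of at most `size` examples."""
    for i in range(0, len(data), size):
        yield data[i:i + size]

for epoch in range(2):                      # two epochs = two full passes
    for batch in minibatches(dataset, batch_size):
        pass  # forward pass + backpropagation would happen here
    print(f"epoch {epoch}: saw all {len(dataset)} examples")
```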

b) Briefly explain five properties of neural networks (5 Marks)

 They can make dumb errors
 They are strangely non-convex
 They work best when badly trained
 They can easily memorise and be compressed
 Yet they forget what they learned
 They are influenced by initialisation/first examples

c) Describe four challenges of implementing neural networks in machine learning (4 Marks)

 You need time to achieve any satisfying results, and planning is difficult.
 Data is not free at all. To train a machine learning model you need big sets of data. It
may seem that this is no longer a problem, since everyone can afford to store and
process petabytes of information, but suitable training data still has to be collected and
prepared.
 Talent deficit. Although many people are attracted to the machine learning industry,
there are still very few specialists who can develop this technology.
 The black box problem. The early stages of machine learning belonged to relatively
simple, shallow methods, but it is hard to explain how deep models reach their decisions.

d) Neural network has three main types of layers. State and explain each of these layers.

Use a diagram to illustrate your answer (4 Marks)

Input layer — initial data for the neural network.

Hidden layers — the intermediate layers between the input and output layers, where all the
computation is done.

Output layer — produces the result for the given inputs.
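A minimal forward pass through the three layer types might look like this; the layer sizes, weights and the choice of a sigmoid are assumptions for illustration only:

```python
# Forward pass: input layer -> hidden layer -> output layer.
# All weights, biases and sizes here are arbitrary illustrations.
import math

def sigmoid(a):
    return 1.0 / (1.0 + math.exp(-a))

def layer(inputs, weights, biases):
    """One fully connected layer: weighted sum plus bias, then sigmoid."""
    return [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

x = [0.5, 0.9]                                      # input layer: raw data
h = layer(x, [[0.1, 0.8], [0.4, 0.6]], [0.0, 0.0])  # hidden layer: computation
y = layer(h, [[0.3, 0.9]], [0.0])                   # output layer: the result
print(y)
```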

e) Explain the meaning of the term ‘activation’ (1 Mark).

An activation function in a neural network defines how the weighted sum of the input is transformed into an output from a node or nodes in a layer of the network. Sometimes the activation function is called a “transfer function.”

QUESTION FOUR

a) Briefly explain the meaning of the following terms

i. Bias (2 Marks).

Bias is like the intercept added in a linear equation. It is an additional parameter in the neural network which is used to adjust the output along with the weighted sum of the inputs to the neuron. Thus, bias is a constant which helps the model fit best for the given data.
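As a small illustration of the bias acting as an intercept (the weights and inputs below are arbitrary assumptions):

```python
# Bias as the intercept of a linear equation: it shifts the neuron's
# pre-activation independently of the inputs. Values are illustrative.

def neuron_pre_activation(inputs, weights, bias):
    return sum(w * x for w, x in zip(weights, inputs)) + bias

# With all-zero inputs the result equals the bias alone:
print(neuron_pre_activation([0.0, 0.0], [0.4, 0.7], bias=1.5))  # 1.5
```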

ii. Convergence (2 Marks)

Convergence describes a progression towards a network state where the network has learned to properly respond to a set of training patterns within some margin of error.

iii. SOM (2 Marks)

A self-organizing map or self-organizing feature map is a type of artificial neural network that is
trained using unsupervised learning to produce a low-dimensional, discretized representation
of the input space of the training samples, called a map, and is therefore a method to do
dimensionality reduction.
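One SOM training step can be sketched as follows; the 1-D map, neighbourhood radius and learning rate are illustrative assumptions, not part of the definition above:

```python
# One self-organizing map step: find the best-matching unit (BMU) for an
# input, then pull the BMU and its map neighbours toward that input.
# The 1-D map, rates and data below are made up for illustration.

def som_step(weights, x, eta=0.5, radius=1):
    # BMU = node whose weight vector is closest (squared distance) to x
    bmu = min(range(len(weights)),
              key=lambda i: sum((wi - xi) ** 2 for wi, xi in zip(weights[i], x)))
    for i, w in enumerate(weights):
        if abs(i - bmu) <= radius:            # neighbourhood on the 1-D map
            weights[i] = [wi + eta * (xi - wi) for wi, xi in zip(w, x)]
    return bmu

weights = [[0.0, 0.0], [0.5, 0.5], [1.0, 1.0]]
bmu = som_step(weights, [0.9, 0.9])
print(bmu, weights)
```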

b) Briefly explain any four situations recommended for ANN. (4 Marks).

Handwriting Recognition – The idea of Handwriting recognition has become very important.
This is because handheld devices like the Palm Pilot are becoming very popular. Hence, we can
use Neural networks to recognize handwritten characters.

Traveling Salesman Problem – Neural networks can also solve the traveling salesman problem.
But this is to a certain degree of approximation only.

Image Compression – Vast amounts of information are received and processed at once by neural
networks. This makes them useful in image compression. With the Internet explosion and more
sites using more images on their sites, using neural networks for image compression is worth a
look.

Stock Exchange Prediction – The day-to-day business of the stock market is very complicated.
Many factors weigh in whether a given stock will go up or down on any given day.

c) State any three parameters that are set in neural networks (3 Marks)

Learning rate: this hyperparameter refers to the step of backpropagation, when parameters are
updated according to an optimization function.
Momentum: it is a technique used during the backpropagation phase. As said regarding the
learning rate, parameters are updated so that they can converge towards the minimum of the
loss function.

Minibatch size: when you are facing billions of data points, it may be inefficient (as well as counterproductive) to feed your NN all of them at once.
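How the learning rate and momentum interact in an update can be sketched on a toy one-parameter loss; the quadratic loss and the hyperparameter values are assumptions for illustration:

```python
# SGD with momentum: the learning rate scales each gradient step, and
# momentum accumulates a velocity across steps. Toy loss: f(w) = (w - 3)^2.

def sgd_momentum(grad_fn, w, lr=0.1, momentum=0.9, steps=200):
    v = 0.0
    for _ in range(steps):
        v = momentum * v - lr * grad_fn(w)   # velocity update
        w = w + v                            # parameter update
    return w

# Gradient of (w - 3)^2 is 2*(w - 3); the minimum is at w = 3:
w = sgd_momentum(lambda w: 2 * (w - 3), w=0.0)
print(round(w, 2))  # converges to ~3.0
```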

d) State and explain three types of network training (3 Marks)

Unsupervised learning. It revolves around the model learning complex relationships within data that you haven’t been able to determine yet. This can be through tasks such as clustering data points, which helps give insight into the structure of the data.

Reinforcement learning. Reinforcement learning is an interesting mix of both supervised and unsupervised learning.

Supervised learning. The model learns from labelled examples, i.e. input-output pairs, and is corrected using the known target outputs.

e) State and explain two properties of a self-organizing network (4 Marks)

Self-Configuration of Self-Organizing Networks – features for the auto-setup of the new node/cell.

Self-Optimization – includes features for the network and radio optimization during network
operation.

Self-Healing – relates to features that are utilized during network operation upon cell failure.

QUESTION FIVE

a) Briefly explain the meaning of the term ‘Hopfield network’. Use a diagram to illustrate your answer. (3 Marks).

A Hopfield network is a special kind of neural network whose response is different from that of other neural networks. Its response is calculated by a converging iterative process. It has just one layer of neurons, matching the size of the input and output, which must be the same. When such a network recognizes, for example, digits, we present a list of correctly rendered digits to the network. Subsequently, the network can transform a noisy input into the corresponding perfect output.

b) Describe four properties of a Hopfield network (4 Marks)

This model consists of neurons with one inverting and one non-inverting output.

The output of each neuron should be the input of other neurons but not its own input.

Weight/connection strength is represented by wij.

Connections can be excitatory as well as inhibitory: excitatory if the output of the neuron is the same as the input, otherwise inhibitory.

Weights should be symmetrical, i.e. wij = wji
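The symmetry and no-self-connection properties above can be illustrated with a Hebbian-style weight construction; the stored pattern below is an arbitrary example, not from the question:

```python
# Hopfield weight construction sketch: weights are symmetric
# (w[i][j] == w[j][i]) with a zero diagonal (no neuron feeds itself).
# The stored bipolar pattern is an arbitrary illustration.

def hopfield_weights(patterns):
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:                 # no self-connection
                    w[i][j] += p[i] * p[j]
    return w

w = hopfield_weights([[1, -1, 1]])
print(w[0][1] == w[1][0], w[0][0])  # True 0.0 (symmetric, zero diagonal)
```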

c) Use back propagation algorithm to compute one training pass on the following neural
network. (6 Marks)

Target = 0.5, Learning rate = 1
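The network diagram for this question is not reproduced here, so the following is a hedged worked pass on an assumed single sigmoid neuron (one input, one weight) using the stated target = 0.5 and learning rate = 1, with squared error E = ½(y − t)²:

```python
# One training pass (forward + backward) on an ASSUMED network: a single
# sigmoid neuron with one input and one weight. Only target and learning
# rate come from the question; x and the initial w are illustrative.
import math

def sigmoid(a):
    return 1.0 / (1.0 + math.exp(-a))

x, w, target, lr = 1.0, 0.8, 0.5, 1.0

# Forward pass: compute the output
y = sigmoid(w * x)

# Backward pass: dE/dw = (y - t) * y * (1 - y) * x  (sigmoid derivative)
grad = (y - target) * y * (1.0 - y) * x

# Weight update with learning rate 1
w = w - lr * grad
print(round(y, 4), round(w, 4))
```

With these assumed values the output comes out near 0.69, so the weight is nudged downward toward the target of 0.5.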

d) State and explain any two types of radial basis functions (4 Marks)

Gaussian function – φ(r) = exp(−r²/2σ²); its response decreases monotonically with distance from the centre, giving a localized response.

Multiquadric function – φ(r) = √(r² + c²); its response increases monotonically with distance from the centre.

e) Explain the problem addressed by ‘stability-plasticity dilemma’ in the context of Adaptive resonance theory (1 Mark).

The real world presents situations where data is continuously changing. In such situations, every learning system faces the stability-plasticity dilemma: the system must be able to learn and adapt to a changing environment (i.e. it must be plastic), but the constant change can make the system unstable, because the system may learn new information only by forgetting everything it has learned so far. This phenomenon, a contradiction between plasticity and stability, is called the stability-plasticity dilemma. The back-propagation algorithm suffers from this stability problem. Adaptive Resonance Theory (ART) was developed to solve the stability-plasticity dilemma: ART has a self-regulating control structure that allows autonomous recognition and learning, and requires no supervisory control or algorithmic implementation.

f) Explain types of Adaptive resonance theory architectures (2 Marks)

ART1 – It is the simplest and the basic ART architecture. It is capable of clustering binary input
values.

ART2 – It is an extension of ART1 that is capable of clustering continuous-valued input data.

Fuzzy ART – It is the augmentation of fuzzy logic and ART.

ARTMAP – It is a supervised form of ART learning where one ART learns based on the previous
ART module. It is also known as predictive ART.

FARTMAP – This is a supervised ART architecture with Fuzzy logic included.
