MATLAB Codes (CNN, LSTM)

The document provides examples of implementing convolutional and recurrent neural networks for classification in MATLAB. The CNN example loads the Fashion MNIST dataset, then defines and trains a CNN model; the RNN example uses the Iris dataset to define and train an LSTM-based network for classification.

MATLAB CODE:

CNN
Below is an example of implementing a Convolutional Neural Network (CNN) for classification
using MATLAB. In this example, I'll use the Fashion MNIST dataset, which is a common
benchmark for image classification tasks.
Ensure you have the Deep Learning Toolbox installed in MATLAB to run this code. You can
modify the architecture, hyperparameters, and other settings based on your specific dataset and
classification task.
This example uses a simple CNN architecture with two convolutional layers followed by max
pooling, and a fully connected layer for classification. Adjust the architecture as needed for your
specific problem.
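As a quick sanity check before running the code, you can confirm that the toolbox is installed (a minimal sketch; the toolbox name string below is what ver reports in recent releases):

% List installed products and check for the Deep Learning Toolbox
v = ver;
hasDLT = any(strcmp({v.Name}, 'Deep Learning Toolbox'));
if ~hasDLT
    warning('Deep Learning Toolbox not found; trainNetwork will not be available.');
end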

About Dataset
Context
Fashion-MNIST is a dataset of Zalando's article images—consisting of a training set of 60,000 examples and a
test set of 10,000 examples. Each example is a 28x28 grayscale image, associated with a label from 10 classes.
Zalando intends Fashion-MNIST to serve as a direct drop-in replacement for the original MNIST dataset for
benchmarking machine learning algorithms. It shares the same image size and structure of training and testing
splits.

The original MNIST dataset consists of handwritten digit images and is widely used by the AI/ML/data science community as a benchmark for validating algorithms.

Labels
Each training and test example is assigned to one of the following labels:

0  T-shirt/top
1  Trouser
2  Pullover
3  Dress
4  Coat
5  Sandal
6  Shirt
7  Sneaker
8  Bag
9  Ankle boot
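The script that follows assumes a helper function fashion_mnist_data, which is not part of MATLAB or the Deep Learning Toolbox, returning the training and test sets as structs with .images and .labels fields. A minimal sketch of such a helper, assuming the four standard Fashion-MNIST IDX files (same binary format as the original MNIST) sit in the current folder, might look like this:

% Hypothetical helper assumed by the CNN script: reads the Fashion-MNIST
% IDX files and returns structs with .images (28x28x1xN) and .labels (Nx1).
function [trainData, testData] = fashion_mnist_data
    trainData.images = readIdxImages('train-images-idx3-ubyte');
    trainData.labels = readIdxLabels('train-labels-idx1-ubyte');
    testData.images  = readIdxImages('t10k-images-idx3-ubyte');
    testData.labels  = readIdxLabels('t10k-labels-idx1-ubyte');
end

function images = readIdxImages(filename)
    fid = fopen(filename, 'r', 'ieee-be');   % IDX files are big-endian
    fread(fid, 1, 'int32');                  % magic number (2051)
    numImages = fread(fid, 1, 'int32');
    numRows   = fread(fid, 1, 'int32');
    numCols   = fread(fid, 1, 'int32');
    raw = fread(fid, inf, 'uint8=>uint8');
    fclose(fid);
    % Reshape to H x W x 1 x N, as expected by imageInputLayer([28 28 1])
    images = reshape(raw, numCols, numRows, 1, numImages);
    images = permute(images, [2 1 3 4]);     % row-major file to column-major array
end

function labels = readIdxLabels(filename)
    fid = fopen(filename, 'r', 'ieee-be');
    fread(fid, 2, 'int32');                  % magic number (2049) and item count
    labels = fread(fid, inf, 'uint8=>uint8');
    fclose(fid);
end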
% Load Fashion MNIST dataset
% (fashion_mnist_data is a user-supplied helper, e.g. the sketch above,
% returning structs with .images and .labels fields)
[fashionTrain, fashionTest] = fashion_mnist_data;

% Extract features and labels
% XTrain/XTest are expected as 28x28x1xN arrays to match imageInputLayer([28 28 1])
XTrain = fashionTrain.images;
YTrain = categorical(fashionTrain.labels);

XTest = fashionTest.images;
YTest = categorical(fashionTest.labels);

% Define CNN architecture
layers = [
    imageInputLayer([28 28 1])                    % input layer for 28x28 grayscale images
    convolution2dLayer(3, 32, 'Padding', 'same')  % convolutional layer with 32 filters
    reluLayer                                     % ReLU activation
    maxPooling2dLayer(2, 'Stride', 2)             % max pooling layer
    convolution2dLayer(3, 64, 'Padding', 'same')  % another convolutional layer with 64 filters
    reluLayer                                     % ReLU activation
    maxPooling2dLayer(2, 'Stride', 2)             % another max pooling layer
    fullyConnectedLayer(10)                       % fully connected layer with 10 classes (Fashion MNIST has 10 classes)
    softmaxLayer                                  % softmax activation for class probabilities
    classificationLayer                           % cross-entropy classification output
];
% Define training options
options = trainingOptions('adam', ...
    'MaxEpochs', 10, ...
    'MiniBatchSize', 128, ...
    'Shuffle', 'every-epoch', ...
    'ValidationData', {XTest, YTest}, ...
    'Verbose', 1, ...
    'Plots', 'training-progress');

% Train the CNN
net = trainNetwork(XTrain, YTrain, layers, options);

% Make predictions on the test set
YPred = classify(net, XTest);

% Evaluate the accuracy
accuracy = sum(YTest == YPred) / numel(YTest);
disp(['Accuracy: ' num2str(accuracy * 100) '%']);
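Since the network is trained on the numeric labels 0-9, mapping a prediction back to the class names in the list above can be done as follows (a small sketch, assuming the labels follow the 0-9 ordering shown earlier):

% Map a single prediction back to a human-readable Fashion-MNIST class
classNames = ["T-shirt/top" "Trouser" "Pullover" "Dress" "Coat" ...
              "Sandal" "Shirt" "Sneaker" "Bag" "Ankle boot"];
img  = XTest(:, :, :, 1);                % first test image, 28x28x1
pred = classify(net, img);               % categorical label, e.g. '9'
fprintf('Predicted class: %s\n', classNames(double(pred)));  % category index 1..10
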
RNN
Data set
Fisher's iris data consists of measurements (meas) on the sepal length, sepal width, petal length,
and petal width for 150 iris specimens. There are 50 specimens from each of three species.

Iris setosa
Iris versicolor
Iris virginica
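
fisheriris ships with the Statistics and Machine Learning Toolbox; a quick look at the variables it loads:

% Inspect the built-in Fisher iris data
load fisheriris            % loads meas (150x4 double) and species (150x1 cell of char)
size(meas)                 % 150 samples x 4 measurements
unique(species)            % {'setosa'; 'versicolor'; 'virginica'}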

% Load Iris dataset (requires Statistics and Machine Learning Toolbox)
load fisheriris;

% Extract features and labels
% (classificationLayer works directly with categorical labels,
% so no explicit one-hot encoding is needed)
X = meas';                                % 4 features x 150 samples
labels = categorical(species);            % class labels as categorical

% Shuffle the samples (fisheriris is ordered by species, so an
% unshuffled 80/20 split would leave only one class in the test set)
rng(1);                                   % for reproducibility
idx = randperm(size(X, 2));
X = X(:, idx);
labels = labels(idx);

% Split the data into training and testing sets
splitRatio = 0.8;
numSamples = size(X, 2);
splitIdx = floor(splitRatio * numSamples);

% trainNetwork expects sequence inputs as a cell array of
% numFeatures-by-sequenceLength matrices (here each sample is a
% length-1 sequence) and categorical labels for classification
XTrain = num2cell(X(:, 1:splitIdx), 1)';
YTrain = labels(1:splitIdx);

XTest = num2cell(X(:, splitIdx+1:end), 1)';
YTest = labels(splitIdx+1:end);

% Define RNN architecture using an LSTM layer
numFeatures = size(X, 1);
numClasses = numel(categories(labels));

layers = [
    sequenceInputLayer(numFeatures)         % 4 input features per time step
    lstmLayer(50, 'OutputMode', 'last')     % LSTM with 50 hidden units, last output only
    fullyConnectedLayer(numClasses)         % map to the 3 iris classes
    softmaxLayer
    classificationLayer
];

% Define training options
options = trainingOptions('adam', ...
    'MaxEpochs', 20, ...
    'MiniBatchSize', 16, ...
    'Shuffle', 'every-epoch', ...
    'ValidationData', {XTest, YTest}, ...
    'Verbose', 1, ...
    'Plots', 'training-progress');

% Train the RNN
net = trainNetwork(XTrain, YTrain, layers, options);

% Make predictions on the test set
YPred = classify(net, XTest);

% Evaluate the accuracy
accuracy = sum(YTest == YPred) / numel(YTest);
disp(['Accuracy: ' num2str(accuracy * 100) '%']);
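
Beyond overall accuracy, a per-class breakdown is often more informative on a small three-class problem. A short sketch, assuming confusionchart is available (it ships with the Statistics and Machine Learning Toolbox in recent releases):

% Per-class results on the test set
confusionchart(YTest, YPred);            % confusion matrix chart (true vs. predicted)

% Or, without plotting, tabulate correct predictions per class
classes = categories(YTest);
for k = 1:numel(classes)
    inClass = (YTest == classes{k});
    fprintf('%s: %d / %d correct\n', classes{k}, ...
        sum(YPred(inClass) == classes{k}), sum(inClass));
end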
