Autoencoder.ipynb - Colaboratory

The document outlines the process of building and training an autoencoder using TensorFlow and Keras on the Iris dataset. It includes steps for data normalization, model configuration, compilation, training, and visualization of the encoded data in a 2D space. The final output is a scatter plot that illustrates the reduced dimensionality of the data.



1) Import 'numpy', which we'll use to manipulate our data, and 'tensorflow' for building and training the autoencoder. Note: we're also reusing 'matplotlib.pyplot' and 'sklearn.datasets', so if you're completing this exercise in the same project file you won't need to import them again.

import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model
import matplotlib.pyplot as plt
from sklearn import datasets

# Load the Iris dataset: 150 samples, 4 features each
iris = datasets.load_iris()
data = iris.data

2) Next, we normalize the features in our data by subtracting the mean and dividing by the standard deviation (z-score standardization). This is a crucial step for training our autoencoder: putting all features on a common scale helps gradient-based optimization converge.

# Z-score standardization: zero mean, unit variance per feature
data -= np.mean(data, axis=0)
data /= np.std(data, axis=0)
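As an aside, the same standardization can be done with scikit-learn's StandardScaler. This alternative isn't part of the original notebook, just an equivalent one-liner:

from sklearn.preprocessing import StandardScaler

# Equivalent to the manual z-score normalization above
data = StandardScaler().fit_transform(iris.data)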

3) After our data has been normalized, we're ready to configure the autoencoder. We define a simple autoencoder architecture using TensorFlow and Keras, with an encoding dimension of 2 so the network compresses the four input features down to a 2D representation.

encoding_dim = 2  # size of the bottleneck: a 2D representation
input_data = Input(shape=(4,))  # the Iris dataset has 4 features
encoded = Dense(encoding_dim, activation='relu')(input_data)  # encoder: 4 -> 2
decoded = Dense(4, activation='sigmoid')(encoded)  # decoder: reconstructs the 4 features
autoencoder = Model(input_data, decoded)
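Before training, it can help to confirm the architecture looks right; 'summary()' is a standard Keras call, though this check isn't in the original notebook:

autoencoder.summary()  # lists the Input layer and both Dense layers with their parameter counts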

4) Just like our models in previous exercises, we now need to compile and train the autoencoder. We compile it with the Adam optimizer and the mean squared error loss function, then train the model on our now-normalized data. Notice that the data serves as both input and target: an autoencoder learns to reconstruct its own input.

autoencoder.compile(optimizer='adam', loss='mean_squared_error')
# Input and target are the same array: the model learns to reconstruct its input
autoencoder.fit(data, data, epochs=100, batch_size=10, shuffle=True)

Epoch 68/100
15/15 [==============================] - 0s 3ms/step - loss: 0.7746
Epoch 69/100
15/15 [==============================] - 0s 3ms/step - loss: 0.7707
...
Epoch 79/100
15/15 [==============================] - 0s 2ms/step - loss: 0.7402
Epoch 80/100
15/15 [==============================] - 0s 2ms/step - loss: 0.7379
...
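The loss in the log above decreases steadily. If you'd rather see that trend as a curve than scroll through the log, 'fit()' returns a History object you can plot; a small optional sketch, not in the original notebook:

# Re-train quietly and capture the per-epoch loss from the History object
history = autoencoder.fit(data, data, epochs=100, batch_size=10, shuffle=True, verbose=0)

plt.plot(history.history['loss'])
plt.xlabel('Epoch')
plt.ylabel('MSE loss')
plt.title('Autoencoder training loss')
plt.show()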

5) After that, we can create and use our encoder model. We extract the encoder part of the autoencoder, a model mapping the original input to the 2D bottleneck, and use it to encode the data, reducing its dimensionality from 4 features to 2.

# The encoder shares its layers with the trained autoencoder, so it is already trained
encoder = Model(input_data, encoded)
encoded_data = encoder.predict(data)

5/5 [==============================] - 0s 2ms/step
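As a quick sanity check (optional, not in the original notebook), each of the 150 Iris samples should now map to a single 2D point:

print(encoded_data.shape)  # (150, 2): 150 samples, 2 encoded features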

6) Now we're ready to visualize the encoded data and compare it against the visualization from our earlier dimensionality-reduction exercise. Create a scatter plot of the 2D encoded data, coloring each point by its species label. The resulting plot shows how the data looks in a reduced-dimensionality space; a baseline for the comparison is sketched after the plotting code.

plt.figure(figsize=(8, 6))
plt.scatter(encoded_data[:, 0], encoded_data[:, 1], c=iris.target)  # color by species
plt.colorbar()
plt.xlabel('Encoded Feature 1')
plt.ylabel('Encoded Feature 2')
plt.title('2D Visualization of Encoded Iris Data')
plt.show()
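For the comparison mentioned above, a PCA projection of the same standardized data gives a useful baseline. This optional sketch uses scikit-learn's PCA and is not part of the original notebook:

from sklearn.decomposition import PCA

# Project the standardized data onto its first two principal components
pca_data = PCA(n_components=2).fit_transform(data)

plt.figure(figsize=(8, 6))
plt.scatter(pca_data[:, 0], pca_data[:, 1], c=iris.target)
plt.colorbar()
plt.xlabel('PC 1')
plt.ylabel('PC 2')
plt.title('2D PCA of Iris Data (for comparison)')
plt.show()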

