Artificial Neural Network Using Python

This document defines a neural network class with methods for initializing weights, applying the sigmoid activation function and its derivative, training the network by adjusting weights using backpropagation, and making predictions on new input data. It initializes a network, trains it on 4 examples over 15,000 iterations, and then uses the trained network to make a prediction for new user input data of [1, 0, 0], outputting 0.9999584.

Uploaded by

Poornima Ghodke
Copyright
© All Rights Reserved

1/22/2021 Untitled10


In [3]: import numpy as np

class NeuralNetwork():

    def __init__(self):
        # seeding for random number generation
        np.random.seed(1)

        # converting weights to a 3 by 1 matrix with values from -1 to 1 and mean of 0
        self.synaptic_weights = 2 * np.random.random((3, 1)) - 1

    def sigmoid(self, x):
        # applying the sigmoid function
        return 1 / (1 + np.exp(-x))

    def sigmoid_derivative(self, x):
        # computing the derivative of the sigmoid function
        # (x here is the sigmoid's output, so the derivative is x * (1 - x))
        return x * (1 - x)
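Because sigmoid_derivative takes the sigmoid's *output* rather than its raw input, the x * (1 - x) form can look wrong at first glance. A small standalone sanity check (not part of the original notebook) compares it against a numerical derivative of the sigmoid:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(s):
    # note: takes the sigmoid *output* s, not the raw input x
    return s * (1 - s)

x = 0.3
s = sigmoid(x)

# central-difference numerical derivative of sigmoid at x
h = 1e-6
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)

print(abs(sigmoid_derivative(s) - numeric) < 1e-8)  # True
```

The two values agree to high precision, confirming that feeding the sigmoid output back into x * (1 - x) recovers the slope at the original input.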

    def train(self, training_inputs, training_outputs, training_iterations):
        # training the model to make accurate predictions while adjusting weights continually
        for iteration in range(training_iterations):
            # siphon the training data via the neuron
            output = self.think(training_inputs)

            # computing error rate for back-propagation
            error = training_outputs - output

            # performing weight adjustments
            adjustments = np.dot(training_inputs.T, error * self.sigmoid_derivative(output))
            self.synaptic_weights += adjustments
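The adjustment line is a single gradient-style update: each weight moves by the error, scaled by the sigmoid's slope and weighted by its input. The following standalone sketch (reusing the same seed and training data as the notebook) performs one such update by hand and checks that the mean absolute error shrinks:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# same training data and seed as the notebook
X = np.array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]], dtype=float)
y = np.array([[0, 1, 1, 0]]).T
np.random.seed(1)
w = 2 * np.random.random((3, 1)) - 1

out = sigmoid(X @ w)
before = np.mean(np.abs(y - out))

# one weight update: X.T @ (error * sigmoid'(output))
error = y - out
w += X.T @ (error * out * (1 - out))

after = np.mean(np.abs(y - sigmoid(X @ w)))
print(after < before)  # True: a single step already reduces the mean error
```

Repeating this update 15,000 times, as the train method does, drives the error down far enough that the network's outputs closely match the targets.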

    def think(self, inputs):
        # passing the inputs via the neuron to get output
        # converting values to floats
        inputs = inputs.astype(float)
        output = self.sigmoid(np.dot(inputs, self.synaptic_weights))
        return output

if __name__ == "__main__":

    # initializing the neuron class
    neural_network = NeuralNetwork()

    print("Beginning Randomly Generated Weights: ")
    print(neural_network.synaptic_weights)

    # training data consisting of 4 examples--3 input values and 1 output
    training_inputs = np.array([[0,0,1],
                                [1,1,1],
                                [1,0,1],
                                [0,1,1]])

    training_outputs = np.array([[0,1,1,0]]).T

    # training taking place
    neural_network.train(training_inputs, training_outputs, 15000)

    print("Ending Weights After Training: ")
    print(neural_network.synaptic_weights)

    # user inputs arrive as strings; think() converts them to floats
    user_input_one = str(input("User Input One: "))
    user_input_two = str(input("User Input Two: "))
    user_input_three = str(input("User Input Three: "))

    print("Considering New Situation: ", user_input_one, user_input_two, user_input_three)
    print("New Output data: ")
    print(neural_network.think(np.array([user_input_one, user_input_two, user_input_three])))
    print("Wow, we did it!")
Beginning Randomly Generated Weights:
[[-0.16595599]
[ 0.44064899]
[-0.99977125]]
Ending Weights After Training:
[[10.08740896]
[-0.20695366]
[-4.83757835]]
User Input One: 1
User Input Two: 0
User Input Three: 0
Considering New Situation: 1 0 0
New Output data:
[0.9999584]
Wow, we did it!
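The final prediction can be verified by hand: for input [1, 0, 0], the dot product keeps only the first trained weight, so the output is simply sigmoid of about 10.087. A quick standalone check using the final weights printed above:

```python
import numpy as np

# final weights reported after training (copied from the run above)
w = np.array([[10.08740896], [-0.20695366], [-4.83757835]])

x = np.array([1.0, 0.0, 0.0])
# for input [1, 0, 0] only the first weight contributes to the dot product
z = x @ w
out = 1 / (1 + np.exp(-z))
print(np.round(out, 7))  # [0.9999584]
```

The large positive first weight reflects the pattern in the training data: the target output always equals the first input value, so the network confidently predicts 1 here.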
