7.3.6 Building Support Vector Machine Classifier from Scratch in Python
SVM Classifier
Equation of the Hyperplane:
y = wx - b
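For intuition, a point is assigned to one side of the hyperplane by the sign of wx - b. A minimal sketch, with weight and bias values chosen purely for illustration:

```python
import numpy as np

# Hypothetical 2D example: classify points by the sign of w.x - b
w = np.array([1.0, 1.0])   # weight vector (assumed values for illustration)
b = 1.0                    # bias (assumed value for illustration)

points = np.array([[2.0, 2.0],   # w.x - b = 3  --> positive side
                   [0.0, 0.0]])  # w.x - b = -1 --> negative side

scores = points @ w - b
labels = np.sign(scores)
print(labels)  # [ 1. -1.]
```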
Gradient Descent:
Gradient Descent is an optimization algorithm used to minimize the loss function in various machine learning algorithms. It works by iteratively updating the parameters of the learning model.
w = w - α*dw
b = b - α*db
Learning Rate:
Learning rate is a tuning parameter in an optimization algorithm that determines the step size at
each iteration while moving toward a minimum of a loss function.
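The effect of the learning rate can be seen on a toy problem. A minimal sketch minimizing f(w) = w² with gradient descent (starting point, step counts, and rates chosen for illustration):

```python
# Gradient descent on f(w) = w**2, whose gradient is dw = 2*w.
# The minimum is at w = 0; a larger learning rate takes bigger steps.
def gradient_descent(learning_rate, steps=20, w=5.0):
    for _ in range(steps):
        dw = 2 * w                    # gradient of w**2 at the current w
        w = w - learning_rate * dw    # parameter update rule
    return w

print(gradient_descent(0.1))   # close to the minimum at 0
print(gradient_descent(0.01))  # moving in the same direction, but much slower
```

With α = 0.1 each step multiplies w by 0.8, so 20 steps nearly reach the minimum; with α = 0.01 the same 20 steps leave w far from it.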
Importing the Dependencies
[ ]: # importing numpy library
import numpy as np
Support Vector Machine Classifier

[ ]: class SVM_classifier():

       # initiating the hyperparameters
       def __init__(self, learning_rate, no_of_iterations, lambda_parameter):
         self.learning_rate = learning_rate
         self.no_of_iterations = no_of_iterations
         self.lambda_parameter = lambda_parameter

       # fitting the dataset to the SVM classifier
       def fit(self, X, Y):
         # m --> number of Data points --> number of rows
         # n --> number of input features --> number of columns
         self.m, self.n = X.shape

         # initiating the weight vector and bias value
         self.w = np.zeros(self.n)
         self.b = 0
         self.X = X
         self.Y = Y

         # implementing the Gradient Descent algorithm for optimization
         for i in range(self.no_of_iterations):
           self.update_weights()

       # updating the weight and bias values
       def update_weights(self):
         # label encoding: map labels {0, 1} to {-1, +1}
         y_label = np.where(self.Y <= 0, -1, 1)

         for index, x_i in enumerate(self.X):
           # check whether the point satisfies the margin condition
           condition = y_label[index] * (np.dot(x_i, self.w) - self.b) >= 1
           if condition:
             # correctly classified with margin: only the regularization gradient
             dw = 2 * self.lambda_parameter * self.w
             db = 0
           else:
             # inside the margin or misclassified: include the hinge-loss gradient
             dw = 2 * self.lambda_parameter * self.w - np.dot(x_i, y_label[index])
             db = y_label[index]

           self.w = self.w - self.learning_rate * dw
           self.b = self.b - self.learning_rate * db

       # predicting the label for a given input value
       def predict(self, X):
         output = np.dot(X, self.w) - self.b
         predicted_labels = np.sign(output)
         # map {-1, +1} back to the original labels {0, 1}
         y_hat = np.where(predicted_labels <= -1, 0, 1)
         return y_hat
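A quick smoke test on a tiny linearly separable dataset can confirm the classifier learns a separating hyperplane. The sketch below restates the classifier compactly so it runs standalone; the dataset and hyperparameter values are illustrative, not tuned:

```python
import numpy as np

# Compact restatement of the SVM classifier so this snippet is self-contained
class SVM_classifier():
    def __init__(self, learning_rate, no_of_iterations, lambda_parameter):
        self.learning_rate = learning_rate
        self.no_of_iterations = no_of_iterations
        self.lambda_parameter = lambda_parameter

    def fit(self, X, Y):
        self.m, self.n = X.shape
        self.w = np.zeros(self.n)   # weights start at zero
        self.b = 0                  # bias starts at zero
        self.X, self.Y = X, Y
        for _ in range(self.no_of_iterations):
            self.update_weights()

    def update_weights(self):
        y_label = np.where(self.Y <= 0, -1, 1)  # encode labels as {-1, +1}
        for index, x_i in enumerate(self.X):
            if y_label[index] * (np.dot(x_i, self.w) - self.b) >= 1:
                dw = 2 * self.lambda_parameter * self.w
                db = 0
            else:
                dw = 2 * self.lambda_parameter * self.w - np.dot(x_i, y_label[index])
                db = y_label[index]
            self.w = self.w - self.learning_rate * dw
            self.b = self.b - self.learning_rate * db

    def predict(self, X):
        output = np.dot(X, self.w) - self.b
        return np.where(np.sign(output) <= -1, 0, 1)  # back to {0, 1}

# Toy linearly separable data: class 1 upper-right, class 0 lower-left
X = np.array([[2.0, 2.0], [3.0, 3.0], [2.5, 3.5],
              [-2.0, -2.0], [-3.0, -3.0], [-2.5, -3.5]])
Y = np.array([1, 1, 1, 0, 0, 0])

model = SVM_classifier(learning_rate=0.001, no_of_iterations=1000,
                       lambda_parameter=0.01)
model.fit(X, Y)
print(model.predict(X))  # expected to match Y on this easy dataset
```

On data this well separated the learned hyperplane recovers the training labels; real datasets would need train/test splitting and hyperparameter tuning.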