
TYCS AI Lab Manual Merged


T.Y.B.Sc.

COMPUTER SCIENCE
SEMESTER: V

Lab Manual

Subject Code: USCSP501


Subject Name: Artificial Intelligence

Course Writer:
Prof. Hasan Phudinawala

PRACTICAL 1
A. Aim: Implement Breadth First Search Algorithm
Dataset: RMP.py File
Requirement: RMP.py, Python IDLE
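Note: RMP.py (the Romania Map Problem data file) is not reproduced in this manual. A minimal sketch of the structure it is expected to provide is shown below; dict_gn maps each city to its neighbouring cities with step costs g(n), and dict_hn gives the straight-line-distance heuristic h(n) to Bucharest used by Practical 2. Only an assumed fragment of the map is shown, not the full file.

# RMP.py -- sketch of the expected structure (assumed fragment, not the full Romania map)
dict_gn = {
    'Arad': {'Sibiu': 140},
    'Sibiu': {'Arad': 140, 'Fagaras': 99, 'Rimnicu': 80},
    'Fagaras': {'Sibiu': 99, 'Bucharest': 211},
    'Rimnicu': {'Sibiu': 80, 'Pitesti': 97, 'Craiova': 146},
    'Pitesti': {'Rimnicu': 97, 'Craiova': 138, 'Bucharest': 101},
    'Craiova': {'Rimnicu': 146, 'Pitesti': 138},
    'Bucharest': {'Fagaras': 211, 'Pitesti': 101}
}

# Straight-line distance to Bucharest, used as h(n) by the A* and RBFS practicals
dict_hn = {
    'Arad': 366, 'Sibiu': 253, 'Fagaras': 176, 'Rimnicu': 193,
    'Pitesti': 100, 'Craiova': 160, 'Bucharest': 0
}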

Diagram:


A. Code: Implement the Breadth First Search algorithm to solve a given problem.
import queue as Q
from RMP import dict_gn

start = 'Arad'
goal = 'Bucharest'
result = ''

def BFS(city, cityq, visitedq):
    global result
    if city == start:
        result = result + "" + city
    for eachcity in dict_gn[city].keys():
        if eachcity == goal:
            result = result + " " + eachcity
            return
        if eachcity not in cityq.queue and eachcity not in visitedq.queue:
            cityq.put(eachcity)
            result = result + " " + eachcity
    visitedq.put(city)
    BFS(cityq.get(), cityq, visitedq)

def main():
    cityq = Q.Queue()
    visitedq = Q.Queue()
    BFS(start, cityq, visitedq)
    print("BFS Traversal From", start, "to", goal, "is:")
    print(result)

main()

Output:


B. Code: Implement the Iterative Depth First Search algorithm to solve the same problem.

import queue as Q
from RMP import dict_gn

start = "Arad"
goal = "Bucharest"
result = ""

def DLS(city, visitedstack, startlimit, endlimit):
    global result
    found = 0
    result = result + city + " "
    visitedstack.append(city)
    if city == goal:
        return 1
    if startlimit == endlimit:
        return 0
    for eachcity in dict_gn[city].keys():
        if eachcity not in visitedstack:
            found = DLS(eachcity, visitedstack, startlimit + 1, endlimit)
            if found:
                return found
    return 0

def IDDFS(city, visitedstack, endlimit):
    global result
    for i in range(0, endlimit):
        print("Searching at Limit:", i)
        found = DLS(city, visitedstack, 0, i)
        if found:
            print("Found")
            break
        else:
            print("Not Found!")
            print(result)
            print("______")
            result = ""
            visitedstack = []

def main():
    visitedstack = []
    IDDFS(start, visitedstack, 9)
    print("IDDFS Traversal from", start, "to", goal, "is:")
    print(result)

main()

Output:


PRACTICAL 2
AIM: A* Search and Recursive Best-First Search

Dataset: RMP.py File

Code: Implement the A* Search algorithm for solving a pathfinding problem.


import queue as Q
from RMP import dict_gn
from RMP import dict_hn

start = 'Arad'
goal = 'Bucharest'
result = ''

def get_fn(citystr):
    cities = citystr.split(",")
    hn = gn = 0
    for ctr in range(0, len(cities) - 1):
        gn = gn + dict_gn[cities[ctr]][cities[ctr + 1]]
    hn = dict_hn[cities[len(cities) - 1]]
    return hn + gn

def expand(cityq):
    global result
    tot, citystr, thiscity = cityq.get()
    if thiscity == goal:
        result = citystr + "::" + str(tot)
        return
    for cty in dict_gn[thiscity]:
        cityq.put((get_fn(citystr + "," + cty), citystr + "," + cty, cty))
    expand(cityq)

def main():
    cityq = Q.PriorityQueue()
    thiscity = start
    cityq.put((get_fn(start), start, thiscity))
    expand(cityq)
    print("The A* path with the total is: ")
    print(result)

main()


Output:

Code: Implement the Recursive Best-First Search algorithm for the same problem.
import queue as Q
from RMP import dict_gn
from RMP import dict_hn

start = 'Arad'
goal = 'Bucharest'
result = ''

def get_fn(citystr):
    cities = citystr.split(",")
    hn = gn = 0
    for ctr in range(0, len(cities) - 1):
        gn = gn + dict_gn[cities[ctr]][cities[ctr + 1]]
    hn = dict_hn[cities[len(cities) - 1]]
    return hn + gn

def printout(cityq):
    for i in range(0, cityq.qsize()):
        print(cityq.queue[i])

def expand(cityq):
    global result
    tot, citystr, thiscity = cityq.get()
    nexttot = 999
    if not cityq.empty():
        nexttot, nextcitystr, nextthiscity = cityq.queue[0]
    if thiscity == goal and tot < nexttot:
        result = citystr + "::" + str(tot)
        return
    print("Expanded city ---------", thiscity)
    print("Second best f(n) ---------", nexttot)
    tempq = Q.PriorityQueue()
    for cty in dict_gn[thiscity]:
        tempq.put((get_fn(citystr + ',' + cty), citystr + ',' + cty, cty))
    for ctr in range(1, 3):
        ctrtot, ctrcitystr, ctrthiscity = tempq.get()
        if ctrtot < nexttot:
            cityq.put((ctrtot, ctrcitystr, ctrthiscity))
        else:
            cityq.put((ctrtot, citystr, thiscity))
            break
    printout(cityq)
    expand(cityq)

def main():
    cityq = Q.PriorityQueue()
    thiscity = start
    cityq.put((999, "NA", "NA"))
    cityq.put((get_fn(start), start, thiscity))
    expand(cityq)
    print(result)

main()
Output:


PRACTICAL NO: 3
Aim: Implement the decision tree learning algorithm to build a decision tree for a given
dataset. Evaluate the accuracy and efficiency on the test dataset.
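The worked code for this practical is not reproduced in this copy of the manual. The following is a minimal sketch using scikit-learn's DecisionTreeClassifier; the built-in Iris dataset and the 70/30 train-test split are assumptions, so substitute the dataset supplied with the practical.

# Minimal sketch: decision tree on the Iris dataset (dataset choice is an assumption)
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Load the dataset and hold out 30% of it as test data
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

# Train the decision tree and evaluate its accuracy on the test set
model = DecisionTreeClassifier(criterion='entropy', random_state=1)
model.fit(X_train, y_train)
y_pred = model.predict(X_test)
print("Accuracy on test data:", accuracy_score(y_test, y_pred))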


PRACTICAL NO: 4
AIM: Feed-Forward Backpropagation Neural Network
● Implement the Feed-Forward Backpropagation algorithm to train a neural network
● Use a given dataset to train the neural network for a specific task

Requirement: Python IDLE

Code:
import numpy as np

class NeuralNetwork():
    def __init__(self):
        np.random.seed()
        self.synaptic_weights = 2 * np.random.random((3, 1)) - 1

    def sigmoid(self, x):
        return 1 / (1 + np.exp(-x))

    def sigmoid_derivative(self, x):
        return x * (1 - x)

    def train(self, training_inputs, training_outputs, training_iterations):
        for iteration in range(training_iterations):
            output = self.think(training_inputs)
            error = training_outputs - output
            adjustments = np.dot(training_inputs.T, error * self.sigmoid_derivative(output))
            self.synaptic_weights += adjustments

    def think(self, inputs):
        inputs = inputs.astype(float)
        output = self.sigmoid(np.dot(inputs, self.synaptic_weights))
        return output

if __name__ == "__main__":
    # Initializing the neuron class
    neural_network = NeuralNetwork()
    print("Beginning Randomly Generated Weights: ")
    print(neural_network.synaptic_weights)
    # Training data consisting of 4 examples -- 3 input values and 1 output
    training_inputs = np.array([[0, 0, 1],
                                [1, 1, 1],
                                [1, 0, 1],
                                [0, 1, 1]])
    training_outputs = np.array([[0, 1, 1, 0]]).T
    # Training taking place
    neural_network.train(training_inputs, training_outputs, 15000)
    print("Ending Weights After Training: ")
    print(neural_network.synaptic_weights)
    user_input_one = str(input("User Input One: "))
    user_input_two = str(input("User Input Two: "))
    user_input_three = str(input("User Input Three: "))
    print("Considering New Situation: ", user_input_one, user_input_two, user_input_three)
    print("New Output data: ")
    print(neural_network.think(np.array([user_input_one, user_input_two, user_input_three])))

Output:


PRACTICAL NO: 5
Aim: Implement the SVM algorithm for binary classification. Train an SVM model using
the given dataset. Evaluate the performance on test data and analyze the results.
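The worked code for this practical is not reproduced in this copy of the manual. The following is a minimal sketch using scikit-learn's SVC; the built-in breast cancer dataset and the linear kernel are assumptions, so substitute the dataset supplied with the practical.

# Minimal sketch: binary SVM classification (dataset and kernel choice are assumptions)
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, classification_report

# Load a binary classification dataset and hold out 30% as test data
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

# Train a linear-kernel SVM and evaluate it on the test data
model = SVC(kernel='linear')
model.fit(X_train, y_train)
y_pred = model.predict(X_test)
print("Accuracy:", accuracy_score(y_test, y_pred))
print(classification_report(y_test, y_pred))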


PRACTICAL NO: 6
AIM: Adaboost Ensemble Learning
● Implement the Adaboost algorithm to create an ensemble of weak classifiers.
● Train the ensemble model on a given dataset and evaluate its performance
● Compare the results with individual weak classifiers (a comparison sketch follows the code below)

Requirement: Python IDLE, pandas, scikit-learn

Code:
import pandas
from sklearn import model_selection
from sklearn.ensemble import AdaBoostClassifier

url = "https://raw.githubusercontent.com/jbrownlee/Datasets/master/pima-indians-diabetes.data.csv"
names = ['preg', 'plas', 'pres', 'skin', 'test', 'mass', 'pedi', 'age', 'class']
dataframe = pandas.read_csv(url, names=names)
array = dataframe.values
X = array[:, 0:8]
Y = array[:, 8]
seed = 7
num_trees = 30
# kfold makes trees with the split number.
# kfold = model_selection.KFold(n_splits=10, random_state=seed)
# n_estimators: the number of trees to build before making predictions.
# A higher number of trees gives better voting options and performance.
model = AdaBoostClassifier(n_estimators=num_trees, random_state=seed)
# cross_val_score calculates the accuracy of the model on X, Y.
# The cross validator cv is optional: cv=kfold
results = model_selection.cross_val_score(model, X, Y)
print(results.mean())
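Comparison with an individual weak classifier (third aim): the lines below are a minimal sketch that continues the script above, reusing X, Y and seed; the choice of a depth-1 decision stump (AdaBoost's default base learner) as the weak classifier is an assumption.

# Compare the ensemble against a single weak classifier (a depth-1 decision stump)
from sklearn.tree import DecisionTreeClassifier

weak = DecisionTreeClassifier(max_depth=1, random_state=seed)
weak_results = model_selection.cross_val_score(weak, X, Y)
print("Single weak classifier accuracy:", weak_results.mean())
print("AdaBoost ensemble accuracy:", results.mean())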

Output:


PRACTICAL NO: 7
AIM: Naive Bayes Classifier
● Implement the Naive Bayes algorithm for classification.
● Train a Naive Bayes model using a given dataset and calculate class probabilities.
● Evaluate the accuracy of the model on test data and analyze the results.

Requirement: disease dataset

Code:
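The worked code for this practical is not reproduced in this copy of the manual. The following is a minimal sketch using scikit-learn's GaussianNB; the file name disease.csv and the target column name 'disease' are assumptions, so adjust them to match the disease dataset supplied with the practical.

# Minimal sketch: Gaussian Naive Bayes (file name and column names are assumptions)
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

# Load the disease dataset and split off 30% as test data
data = pd.read_csv('disease.csv')
X = data.drop('disease', axis=1)
y = data['disease']
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

# Train the model, inspect class probabilities, then evaluate accuracy on the test data
model = GaussianNB()
model.fit(X_train, y_train)
print("Class probabilities for the first five test samples:")
print(model.predict_proba(X_test)[:5])
y_pred = model.predict(X_test)
print("Accuracy:", accuracy_score(y_test, y_pred))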


PRACTICAL NO: 8
Aim: Implement the K-NN algorithm for classification or regression.
Apply the K-NN algorithm on the given dataset and predict the class or value for the test data.
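The worked code for this practical is not reproduced in this copy of the manual. The following is a minimal classification sketch using scikit-learn's KNeighborsClassifier; the built-in Iris dataset and k = 5 are assumptions, so substitute the dataset supplied with the practical.

# Minimal sketch: K-NN classification (dataset and k value are assumptions)
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Load the dataset and hold out 30% as test data
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

# Fit a 5-nearest-neighbour classifier and predict the class for the test data
model = KNeighborsClassifier(n_neighbors=5)
model.fit(X_train, y_train)
y_pred = model.predict(X_test)
print("Predicted classes:", y_pred)
print("Accuracy:", accuracy_score(y_test, y_pred))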


PRACTICAL No: 9
Aim: Implement the Association Rule Mining algorithm (e.g. Apriori) to find frequent item sets in a given
dataset. Generate association rules from the frequent item sets and calculate their support.
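The worked code for this practical is not reproduced in this copy of the manual. The following is a minimal sketch using the mlxtend library; the hand-coded transaction list, the 40% minimum support and the 60% confidence threshold are assumptions, so substitute the dataset supplied with the practical.

# Minimal sketch: Apriori with mlxtend (transactions and thresholds are assumptions)
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori, association_rules

transactions = [['milk', 'bread', 'butter'],
                ['bread', 'butter'],
                ['milk', 'bread'],
                ['milk', 'butter'],
                ['bread', 'butter', 'jam']]

# One-hot encode the transactions into a boolean DataFrame
te = TransactionEncoder()
te_array = te.fit(transactions).transform(transactions)
df = pd.DataFrame(te_array, columns=te.columns_)

# Frequent item sets with at least 40% support, then the rules derived from them
frequent_itemsets = apriori(df, min_support=0.4, use_colnames=True)
rules = association_rules(frequent_itemsets, metric='confidence', min_threshold=0.6)
print(frequent_itemsets)
print(rules[['antecedents', 'consequents', 'support', 'confidence']])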

