Machine Learning
ECT463 MACHINE LEARNING
CATEGORY: PEC    L-T-P: 2-1-0    CREDIT: 3
Preamble: This course aims to impart the fundamentals of machine learning techniques.
Course Outcomes: After the completion of the course, the student will be able to
CO1 K2 Understand the basics of machine learning and its different types
CO2 K3 Differentiate regression and classification, and apply Bayes' decision theory in classification
CO3 K3 Apply linear algebra and statistical methods in discriminant-based algorithms
     PO1  PO2  PO3  PO4  PO5  PO6  PO7  PO8  PO9  PO10  PO11  PO12
CO1   3
CO2   3    3    3    3    3
CO3   3    3    3    3    3
CO4   3
CO5   3    3    3
Assessment Pattern
End Semester Examination Pattern: There will be two parts, Part A and Part B. Part A
contains 10 questions with 2 questions from each module, each carrying 3 marks.
Students should answer all questions. Part B contains 2 questions from each module, of
which the student should answer any one. Each question can have a maximum of 2
sub-divisions and carries 14 marks.
Course Outcome 1 (CO1): Understand the basics of machine learning and different types.
(K2)
Course Outcome 2 (CO2): Differentiate regression and classification, apply Bayes’ decision
theory in classification (K3)
Course Outcome 3 (CO3): Apply linear algebra and statistical methods in discriminant-based algorithms (K3)
2. Use support vector machines for separable and non-separable classes
SYLLABUS
Module I
Basics of machine learning, supervised and unsupervised learning, examples, features, feature
vector, training set, target vector, test set, feature extraction, over-fitting, curse of
dimensionality. Review of probability theory, Gaussian distribution, decision theory.
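The over-fitting effect listed above can be seen numerically. A minimal sketch (Python/NumPy; the sine data and polynomial degrees are illustrative, not part of the syllabus): a flexible high-degree polynomial drives training error down, yet that alone says nothing about error on unseen data.

```python
import numpy as np

# 15 noisy samples of a sine curve (illustrative training set).
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 15)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, x.size)

# A dense noise-free grid stands in for unseen test data.
x_test = np.linspace(0, 1, 200)
y_test = np.sin(2 * np.pi * x_test)

def fit_errors(degree):
    """Least-squares polynomial fit; returns (train MSE, test MSE)."""
    coeffs = np.polyfit(x, y, degree)
    train_mse = np.mean((np.polyval(coeffs, x) - y) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_mse, test_mse

train_lo, test_lo = fit_errors(1)   # rigid model: high training error
train_hi, test_hi = fit_errors(9)   # flexible model: low training error
```

Because the polynomial models are nested, training error never increases with the degree, which is exactly why a held-out test set is needed to detect over-fitting.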
Module II
Regression: linear regression, error functions in regression, multivariate regression,
regression applications, bias and variance. Classification: Bayes' decision theory,
discriminant functions and decision surfaces, Bayesian classification for normal distributions,
classification applications.
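For squared error, the linear regression of this module reduces to a least-squares problem. A minimal sketch (Python/NumPy; the synthetic data and true coefficients are illustrative):

```python
import numpy as np

# Synthetic data: y = 2*x1 - 1*x2 + 0.5 plus small Gaussian noise.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = X @ np.array([2.0, -1.0]) + 0.5 + rng.normal(0, 0.1, 200)

# Append a column of ones so the bias is learned jointly.
Xb = np.hstack([X, np.ones((X.shape[0], 1))])

# Minimize the squared-error function via least squares.
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
# w is approximately [2.0, -1.0, 0.5]
```

With 200 samples and little noise the estimate recovers the generating coefficients closely; the residual spread around them is one face of the bias-variance discussion above.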
Module III
Linear discriminant-based algorithms: perceptron, gradient descent method, perceptron
algorithm, support vector machines, separable classes, non-separable classes, multiclass case.
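The perceptron algorithm named above fits in a few lines: cycle through the data and, on every misclassified point, move the weights toward that point. A minimal sketch (Python/NumPy; the two Gaussian blobs are illustrative separable data):

```python
import numpy as np

# Two linearly separable Gaussian blobs (illustrative data).
rng = np.random.default_rng(2)
X = np.vstack([rng.normal([2, 2], 0.5, (20, 2)),
               rng.normal([-2, -2], 0.5, (20, 2))])
y = np.array([1] * 20 + [-1] * 20)

w = np.zeros(2)
b = 0.0
for epoch in range(100):
    mistakes = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:   # misclassified (or on the boundary)
            w += yi * xi             # move the boundary toward the point
            b += yi
            mistakes += 1
    if mistakes == 0:                # converged: every point classified
        break
```

On separable data the convergence theorem guarantees this loop stops after finitely many updates; on non-separable data it cycles forever, which motivates the SVM formulation that follows in this module.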
Module IV
Unsupervised learning: Clustering, examples, criterion functions for clustering, proximity
measures, algorithms for clustering. Ensemble methods: boosting, bagging. Basics of
decision trees, random forest, examples.
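Among the clustering algorithms referred to above, k-means is the standard starting point: alternately assign each point to its nearest center, then recompute each center as the mean of its cluster. A minimal sketch (Python/NumPy; the two well-separated blobs are illustrative):

```python
import numpy as np

# Two well-separated blobs (illustrative data).
rng = np.random.default_rng(3)
X = np.vstack([rng.normal([0, 0], 0.3, (30, 2)),
               rng.normal([5, 5], 0.3, (30, 2))])

k = 2
centers = X[rng.choice(len(X), size=k, replace=False)]
for _ in range(50):
    # Assignment step: nearest center for every point.
    dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    # Update step: each center becomes the mean of its cluster
    # (keep the old center if a cluster goes empty).
    new_centers = np.array([X[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    if np.allclose(new_centers, centers):
        break
    centers = new_centers
```

Each iteration lowers (never raises) the sum-of-squared-distance criterion function, so the loop reaches a fixed point; the result still depends on the random initialization.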
Module V
Dimensionality reduction: principal component analysis, Fisher's discriminant analysis.
Evaluation and model selection: ROC curves, evaluation measures, validation set, bias-
variance trade-off. Confusion matrix, recall, precision, accuracy.
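The evaluation measures of this module all follow from the four confusion-matrix counts. A minimal worked sketch (Python/NumPy; the label vectors are made up for illustration):

```python
import numpy as np

# Hypothetical true labels and classifier predictions.
y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 0])
y_pred = np.array([1, 0, 1, 0, 0, 1, 1, 0, 1, 0])

# Confusion-matrix counts.
tp = int(np.sum((y_pred == 1) & (y_true == 1)))  # true positives:  4
tn = int(np.sum((y_pred == 0) & (y_true == 0)))  # true negatives:  4
fp = int(np.sum((y_pred == 1) & (y_true == 0)))  # false positives: 1
fn = int(np.sum((y_pred == 0) & (y_true == 1)))  # false negatives: 1

accuracy = (tp + tn) / len(y_true)   # (4 + 4) / 10 = 0.8
precision = tp / (tp + fp)           # 4 / 5 = 0.8
recall = tp / (tp + fn)              # 4 / 5 = 0.8
```

Precision asks "of the predicted positives, how many were right?" while recall asks "of the actual positives, how many were found?"; sweeping the decision threshold and replotting these trade-offs is what produces an ROC curve.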
No.  Topic  No. of Lectures
1 Module I
1.1 Basics of machine learning, supervised and unsupervised learning, examples, 2
1.2 features, feature vector, training set, target vector, test set 2
1.3 feature extraction, over-fitting, curse of dimensionality. 1
1.4 Review of probability theory, Gaussian distribution, decision theory. 2
2 Module II
2.1 Regression: linear regression, error functions in regression 2
2.2 multivariate regression, regression applications, bias and variance. 2
2.3 Classification: Bayes' decision theory, 1
2.4 discriminant functions and decision surfaces, 1
2.5 Bayesian classification for normal distributions, classification applications. 2
3 Module III
3.1 Linear discriminant based algorithm: perceptron, 1
3.2 gradient descent method, perceptron algorithm, 2
3.3 support vector machines, 1
3.4 SVM for separable classes and non-separable classes, multiclass case. 2
4 Module IV
4.1 Unsupervised learning: Clustering, examples, criterion functions for clustering 2
5 Module V
5.1 Dimensionality reduction: principal component analysis, 2
5.2 Fisher's discriminant analysis. 1
5.3 Evaluation and model selection: ROC curves, evaluation measures, 2
5.4 validation set, bias-variance trade-off. 1
5.5 confusion matrix, recall, precision, accuracy. 1
• Regression examples
• Classification examples
• Perceptron
• SVM
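For the SVM item above, a soft-margin linear SVM can be sketched by sub-gradient descent on the regularized hinge loss (Python/NumPy; the data, learning rate, and regularization constant are illustrative choices, not a production implementation):

```python
import numpy as np

# Two separable blobs with labels in {-1, +1} (illustrative data).
rng = np.random.default_rng(4)
X = np.vstack([rng.normal([1.5, 1.5], 0.4, (25, 2)),
               rng.normal([-1.5, -1.5], 0.4, (25, 2))])
y = np.array([1] * 25 + [-1] * 25)

w = np.zeros(2)
b = 0.0
lam, lr = 0.01, 0.1          # regularization strength and step size
for epoch in range(200):
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) < 1:        # inside the margin: hinge active
            w += lr * (yi * xi - lam * w)
            b += lr * yi
        else:                            # outside the margin: only shrink w
            w -= lr * lam * w
```

Unlike the perceptron, points inside the margin still trigger updates even when correctly classified, and the shrinkage term pushes toward a wide margin; the same loop tolerates non-separable data because the hinge loss stays finite.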
PART A
Answer all questions. Each question carries 3 marks.
1. Explain machine learning with examples.
2. Explain over-fitting in machine learning.
3. Explain regression with examples.
4. State Bayes' decision theory.
5. Draw a simple perceptron model.
6. How is SVM used for multiclass problems?
7. Explain clustering with examples.
8. Explain decision trees with examples.
9. Explain ROC curves.
10. Explain bias-variance trade-off.
PART B
Answer any one question from each module. Each question carries 14 marks.
MODULE I
11. (a) Explain the terms features, training set, target vector, and test set. (8 marks)
(b) Distinguish supervised and unsupervised machine learning with examples. (6 marks)
OR
12. (a) Explain a multivariate Gaussian distribution along with its parameters. (6 marks)
(b) Explain the curse of dimensionality in machine learning. (8 marks)
MODULE II
13. (a) Differentiate regression and classification with examples. (8 marks)
(b) Explain bias and variance for regression. (6 marks)
OR
14. (a) Obtain the decision surface for an equiprobable two-class system, where the
probability density functions of the l-dimensional feature vectors in both classes are
normally distributed. (8 marks)
(b) Show that the Bayesian classifier is optimal with respect to minimizing the
classification error probability. (6 marks)
MODULE IV
17. (a) Explain different criterion functions for clustering. (8 marks)
(b) Give a description of different clustering algorithms. (6 marks)
OR
18. (a) Explain different ensemble methods in classification. (8 marks)
(b) Illustrate the random forest algorithm. (6 marks)
MODULE V
19. (a) Explain the significance of dimensionality reduction in machine learning. (6 marks)
(b) Describe Fisher's discriminant analysis. (8 marks)
OR
20. (a) Explain how performance evaluation and model selection are done in machine
learning. (8 marks)
(b) Explain confusion matrix, recall, precision, and accuracy. (6 marks)