Probabilistic Graphical Model Handout
Course No(s)
Credit Units 4
Version 1.0
Course Objectives
No Course Objective
CO1 Introduce students to the basic concepts and techniques of Probabilistic Graphical Models.
CO2 Students will be able to compute conditional distributions from simple discrete
probabilistic models, such as a fixed Naïve Bayes classifier, a finite mixture model, or a
Hidden Markov Model.
CO3 Students will be able to explain the model structure learning problem and parameter
learning problem.
CO4 Students will be able to develop skills in using recent probabilistic graphical models to
evaluate learning algorithms.
Text Book(s)
T1 Ankur Ankan and Abhinash Panda, Mastering Probabilistic Graphical Models Using
Python. Packt Publishing, 2015.
Content Structure
1. Introduction
1.1. Objective of the course
1.2. Structured Probabilistic Models
2. Mathematical Preliminaries
2.1. Probability Theory
2.2. Graph
5. Exact Inference
5.1. Variable Elimination
5.2. Belief Propagation
5.3. MAP using belief propagation
6. Approximate Inference
6.1. Propagation-based approximation algorithm
6.2. Loopy Belief propagation
6.3. Sampling-based approximate methods
6.4. Markov chain Monte Carlo methods
7. Parameter Learning
7.1. Parameter Estimation in Bayesian Networks
7.2. Maximum Likelihood Estimation
7.3. Parameter Estimation in Markov Networks
8. Structure Learning
8.1. Structure learning in Bayesian Networks
8.2. Constraint based structure learning
8.3. Score based structure learning
9. Models
9.1. Naïve Bayes Model
9.2. Hidden Markov Model
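The Naïve Bayes model listed above is where CO2's skill of computing conditional distributions from a fixed model is exercised. A minimal Python sketch of posterior computation in a two-feature Naïve Bayes classifier follows; all probability values are invented for illustration and are not taken from the textbook:

```python
# Posterior inference in a fixed two-feature Naive Bayes model (cf. CO2).
# Class C and features F1, F2 are binary; all numbers are illustrative.
p_c = {0: 0.5, 1: 0.5}
p_f1 = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}  # p_f1[c][f1] = P(F1=f1 | C=c)
p_f2 = {0: {0: 0.6, 1: 0.4}, 1: {0: 0.1, 1: 0.9}}  # p_f2[c][f2] = P(F2=f2 | C=c)

def posterior(f1, f2):
    """P(C | F1=f1, F2=f2) via Bayes' rule with the naive factorization."""
    unnorm = {c: p_c[c] * p_f1[c][f1] * p_f2[c][f2] for c in (0, 1)}
    z = sum(unnorm.values())
    return {c: v / z for c, v in unnorm.items()}

print(posterior(1, 1))  # observing F1=1, F2=1 strongly favours class 1
```

The same computation generalizes to any number of features, since the naive factorization keeps the joint as a product of per-feature conditionals.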
Learning Outcomes:
No Learning Outcomes
LO3 Able to identify appropriate tools for solving problems related to Probabilistic
Graphical Models and implement the solutions.
Course No
Lead Instructor
Session No. | Topic Title | Study / HW Resource Reference
1 | Introduction: Objective of the course, Structured Probabilistic Models, Representation, Inference, Learning, Application of Probabilistic Graphical Models | R1 – Ch1
2 | Mathematical Preliminaries: Probability theory, Probability Distributions, Random Variables and Joint Distributions, Independence and Conditional Independence, Expectation and Variance; Graphs, Nodes and Edges, Subgraphs, Paths and Trails, Cycles and Loops | T1 – Ch1; T2 – Ch1
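The conditional-independence material in this session can be previewed with a short Python sketch: it builds a joint distribution over three binary variables from the factorization P(A) P(B|A) P(C|A) and checks numerically that B and C are conditionally independent given A. All CPD numbers are invented for illustration:

```python
from itertools import product

# Joint over binary A, B, C factorized as P(A) P(B|A) P(C|A),
# so B and C are conditionally independent given A.
p_a = {0: 0.6, 1: 0.4}
p_b_a = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}  # p_b_a[a][b] = P(B=b | A=a)
p_c_a = {0: {0: 0.5, 1: 0.5}, 1: {0: 0.9, 1: 0.1}}  # p_c_a[a][c] = P(C=c | A=a)

joint = {(a, b, c): p_a[a] * p_b_a[a][b] * p_c_a[a][c]
         for a, b, c in product((0, 1), repeat=3)}

def cond_bc_given_a(b, c, a):
    """P(B=b, C=c | A=a), computed from the joint by marginalization."""
    p_a_marg = sum(joint[(a, bb, cc)] for bb, cc in product((0, 1), repeat=2))
    return joint[(a, b, c)] / p_a_marg

# Numerical check of conditional independence:
# P(B, C | A) == P(B | A) * P(C | A) for every assignment.
for a, b, c in product((0, 1), repeat=3):
    assert abs(cond_bc_given_a(b, c, a) - p_b_a[a][b] * p_c_a[a][c]) < 1e-12
```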
7 | Exact Inference: Variable elimination, Belief propagation, Constructing a clique tree, MAP using variable elimination, Factor maximization, MAP using belief propagation, Finding the most probable assignment, Predictions from the model using pgmpy | T1 – Ch3
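As a taste of the variable elimination topic in this session, here is a minimal Python sketch computing the marginal P(C) on a three-node chain A → B → C by summing out A and then B. The CPD numbers are illustrative only:

```python
# Variable elimination for P(C) on a binary chain A -> B -> C.
# Elimination order: A first, then B. CPD numbers are illustrative.
p_a = [0.6, 0.4]
p_b_a = [[0.7, 0.3], [0.2, 0.8]]  # p_b_a[a][b] = P(B=b | A=a)
p_c_b = [[0.9, 0.1], [0.4, 0.6]]  # p_c_b[b][c] = P(C=c | B=b)

# Eliminate A: tau(b) = sum_a P(a) P(b|a)
tau = [sum(p_a[a] * p_b_a[a][b] for a in (0, 1)) for b in (0, 1)]

# Eliminate B: P(c) = sum_b tau(b) P(c|b)
p_c = [sum(tau[b] * p_c_b[b][c] for b in (0, 1)) for c in (0, 1)]

# Sanity check against brute-force enumeration of the full joint.
brute = [sum(p_a[a] * p_b_a[a][b] * p_c_b[b][c]
             for a in (0, 1) for b in (0, 1)) for c in (0, 1)]
assert all(abs(x - y) < 1e-12 for x, y in zip(p_c, brute))
```

On a chain, elimination costs grow linearly with the number of nodes, whereas brute-force enumeration grows exponentially; this gap is the motivation for the clique-tree and belief-propagation material in the same session.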
8 | Review of Sessions 1 to 7 | Books, Web references and Slides
9 | Approximate Inference: Exact inference as an optimization, Propagation-based approximation algorithm, Loopy Belief propagation, Propagation with approximate messages, Sampling-based approximate methods, Markov chain Monte Carlo methods, Using a Markov chain | T1 – Ch4; T2 – Ch7
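The Markov chain Monte Carlo topic can be previewed with a minimal Metropolis sampler over a three-state toy target; the unnormalized weights are invented for illustration:

```python
import random

# A minimal Metropolis sampler over the toy state space {0, 1, 2} with an
# unnormalized target proportional to `weights` (numbers are illustrative).
weights = [1.0, 2.0, 3.0]

def metropolis(n_steps, seed=0):
    rng = random.Random(seed)
    x = 0
    counts = [0, 0, 0]
    for _ in range(n_steps):
        proposal = rng.randrange(3)  # symmetric uniform proposal
        # Accept with probability min(1, w(proposal) / w(current)).
        if rng.random() < min(1.0, weights[proposal] / weights[x]):
            x = proposal
        counts[x] += 1
    return [k / n_steps for k in counts]

# Long-run state frequencies approach weights / sum(weights) = [1/6, 1/3, 1/2].
freqs = metropolis(200_000)
```

Because the proposal is symmetric, the acceptance ratio needs only the unnormalized weights; this is exactly why MCMC is attractive for graphical models, where the normalizing constant is typically intractable.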
10 | Parameter Learning: General ideas in learning, Learning as an optimization, Maximum likelihood estimation, Parameter Estimation in Bayesian Networks, MLE for Bayesian networks | T1 – Ch5, Ch6; T2 – Ch5
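For fully observed data, maximum likelihood estimation in a Bayesian network reduces to counting. A minimal sketch for the CPD P(B | A) in a two-node network A → B, on a synthetic dataset invented for illustration:

```python
from collections import Counter

# MLE of the CPD P(B | A) in a two-node network A -> B from fully observed,
# synthetic data: the estimate is the ratio of counts N(a, b) / N(a).
data = [(0, 0)] * 40 + [(0, 1)] * 10 + [(1, 0)] * 5 + [(1, 1)] * 45
counts = Counter(data)  # counts[(a, b)] = N(a, b)

def mle_b_given_a(b, a):
    """Maximum likelihood estimate of P(B=b | A=a)."""
    n_a = counts[(a, 0)] + counts[(a, 1)]
    return counts[(a, b)] / n_a
```

The same count-ratio estimator applies per node and per parent configuration in larger networks, since the log-likelihood decomposes over the CPDs.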
12 | Structure Learning: Structure learning in Bayesian networks, Methods for learning the structure, Constraint-based structure learning, Structure score learning, Bayesian score for Bayesian networks | T1 – Ch5, Ch6; T2 – Ch4
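Score-based structure learning can be previewed in miniature: on synthetic data over two binary variables, the maximum log-likelihood score of the structure with an edge A → B is compared against the empty (independence) structure. All data below are synthetic:

```python
import math
from collections import Counter

# Score-based comparison of two structures over binary A, B on synthetic data:
# the empty graph (A, B independent) versus the graph with an edge A -> B.
data = [(0, 0)] * 40 + [(1, 1)] * 40 + [(0, 1)] * 10 + [(1, 0)] * 10
n = len(data)
c = Counter(data)
p_a1 = sum(a for a, _ in data) / n  # marginal MLE for P(A=1)

def loglik_empty():
    """Max log-likelihood when A and B are modelled as independent."""
    p_b1 = sum(b for _, b in data) / n
    return sum(k * math.log((p_a1 if a else 1 - p_a1) *
                            (p_b1 if b else 1 - p_b1))
               for (a, b), k in c.items())

def loglik_edge():
    """Max log-likelihood for the structure A -> B (CPDs fit by counting)."""
    return sum(k * math.log((p_a1 if a else 1 - p_a1) *
                            c[(a, b)] / (c[(a, 0)] + c[(a, 1)]))
               for (a, b), k in c.items())
```

On this correlated data the edge structure scores strictly higher; practical scores such as BIC or the Bayesian score penalize the extra parameters so that denser graphs do not always win.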
16 | Review of Sessions 9 to 15 | Books, Web references and Slides
Evaluation Scheme:
Legend: EC = Evaluation Component; AN = Afternoon Session; FN = Forenoon Session
Note:
Syllabus for Mid-Semester Test (Closed Book): Topics in Session Nos. 1 to 8
Syllabus for Comprehensive Exam (Open Book): All topics (Session Nos. 1 to 16)
Evaluation Guidelines:
1. EC-1 consists of two Quizzes. Students will attempt them through the course
pages on the Elearn portal. Announcements will be made on the portal in a
timely manner.
2. EC-2 consists of either one or two Assignments. Students will attempt them
through the course pages on the Elearn portal. Announcements will be made
on the portal in a timely manner.
3. For Closed Book tests: No books or reference material of any kind will be
permitted.
4. For Open Book exams: Use of books and any printed / written reference
material (filed or bound) is permitted. However, loose sheets of paper will
not be allowed. Use of calculators is permitted in all exams. Laptops/Mobiles
of any kind are not allowed. Exchange of any material is not allowed.
5. If a student is unable to appear for the Regular Test/Exam due to genuine
exigencies, the student should follow the procedure to apply for the Make-Up
Test/Exam which will be made available on the Elearn portal. The Make-Up
Test/Exam will be conducted only at selected exam centres on the dates to be
announced later.