
BIRLA INSTITUTE OF TECHNOLOGY & SCIENCE, PILANI

WORK INTEGRATED LEARNING PROGRAMMES


Digital
Part A: Content Design
Course Title Probabilistic Graphical Model

Course No(s)
Credit Units 4

Credit Model 1 - 0.5 - 1.5
(1 unit for classroom hours, 0.5 unit for tutorials, 1.5 units for student preparation; 1 unit = 32 hours)

Content Authors Ms. Seetha Parameswaran

Version 1.0

Date June 22nd, 2019

Course Objectives
No Course Objective

CO1 Introduce students to the basic concepts and techniques of Probabilistic Graphical Models.

CO2 Students will be able to compute conditional distributions from simple discrete
probabilistic models, like a fixed Naïve Bayes classifier, a finite mixture model or a Hidden
Markov Model.

CO3 Students will be able to explain the model structure learning problem and parameter
learning problem.

CO4 Students will be able to develop skills in using recent probabilistic graphical models to
evaluate learning algorithms.

Text Book(s)
T1 Mastering Probabilistic Graphical Models using Python by Ankur Ankan, Abhinash
Panda. Packt Publishing 2015.

T2 Building Probabilistic Graphical Models with Python by Kiran R Karkera. Packt
Publishing 2014.
Reference Book(s) & other resources
R1 Probabilistic Graphical Models: Principles and Techniques by Daphne Koller and
Nir Friedman. MIT Press. 2009

R2 Learning in Graphical Models by Michael I. Jordan. MIT Press. 1999

Content Structure
1. Introduction
1.1. Objective of the course
1.2. Structured Probabilistic Models

2. Mathematical Preliminaries
2.1. Probability Theory
2.2. Graph

3. Directed Graphical Models
3.1. Bayesian Networks
3.2. D-separation
3.3. I-map

4. Undirected Graphical Models
4.1. Markov Networks
4.2. Gibbs distributions
4.3. Factorization

5. Exact Inference
5.1. Variable Elimination
5.2. Belief Propagation
5.3. MAP using belief propagation

6. Approximate Inference
6.1. Propagation-based approximation algorithms
6.2. Loopy belief propagation
6.3. Sampling-based approximate methods
6.4. Markov chain Monte Carlo methods

7. Parameter Learning
7.1. Parameter Estimation in Bayesian Networks
7.2. Maximum Likelihood Estimation
7.3. Parameter Estimation in Markov Networks

8. Structure Learning
8.1. Structure learning in Bayesian Networks
8.2. Constraint-based structure learning
8.3. Score-based structure learning

9. Models
9.1. Naïve Bayes Model
9.2. Hidden Markov Model
Learning Outcomes:
No Learning Outcomes

LO1 Able to understand the basics of Probabilistic Graphical Models.

LO2 Able to solve problems related to Probabilistic Graphical Models using appropriate
learning techniques.

LO3 Able to identify appropriate tools to implement the solutions to problems related
to Probabilistic Graphical Models and implement solutions.

Part B: Learning Plan


Academic Term

Course Title Probabilistic Graphical Model

Course No
Lead Instructor

Session 1 – Introduction
Objective of the course, Structured Probabilistic Models, Representation, Inference, Learning, Application of Probabilistic Graphical Models.
Study/HW resource: R1 – Ch1

Session 2 – Mathematical Preliminaries
Probability theory, Probability Distributions, Random Variables and Joint Distributions, Independence and Conditional Independence, Expectation and Variance. Graphs, Nodes and Edges, Subgraphs, Paths and Trails, Cycles and Loops.
Study/HW resource: T1 – Ch1, T2 – Ch1

Session 3 – Directed Graphical Models
Independence and independent parameters, Bayesian models, Representation, Factorization of a distribution over a network, Bayesian model representation.
Study/HW resource: T1 – Ch1

Session 4 – Directed Graphical Models (contd.)
D-separation, I-map, I-map to factorization, CPD representations, Implementing Bayesian networks using pgmpy.
Study/HW resource: T1 – Ch1
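A minimal sketch of the kind of pgmpy exercise this session builds toward, assuming a recent pgmpy release (the class is named BayesianNetwork in current versions, BayesianModel in older ones); the Rain/WetGrass structure and all probability values are illustrative, not taken from the course:

# Build a two-node Bayesian network Rain -> WetGrass with hand-specified CPDs.
from pgmpy.models import BayesianNetwork  # BayesianModel in older pgmpy
from pgmpy.factors.discrete import TabularCPD

model = BayesianNetwork([("Rain", "WetGrass")])

# P(Rain): states 0 (no rain) and 1 (rain)
cpd_rain = TabularCPD(variable="Rain", variable_card=2, values=[[0.8], [0.2]])

# P(WetGrass | Rain): one column per parent state
cpd_wet = TabularCPD(
    variable="WetGrass", variable_card=2,
    values=[[0.9, 0.1],   # P(WetGrass=0 | Rain=0), P(WetGrass=0 | Rain=1)
            [0.1, 0.9]],  # P(WetGrass=1 | Rain=0), P(WetGrass=1 | Rain=1)
    evidence=["Rain"], evidence_card=[2],
)

model.add_cpds(cpd_rain, cpd_wet)
assert model.check_model()  # CPDs sum to 1 and match the graph structure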
Session 5 – Undirected Graphical Models
Markov network, Parameterizing a Markov network – factor, Factor operations, Gibbs distributions and Markov networks, Factor graph.
Study/HW resource: T1 – Ch2

Session 6 – Undirected Graphical Models (contd.)
Independencies in Markov networks, Constructing graphs from distributions, Bayesian and Markov networks.
Study/HW resource: T1 – Ch2
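A minimal sketch of parameterizing a Markov network with factors, as covered in Sessions 5 and 6; the A-B-C chain and the factor values are illustrative assumptions, and the class is MarkovNetwork in current pgmpy (MarkovModel in older releases):

from pgmpy.models import MarkovNetwork  # MarkovModel in older pgmpy
from pgmpy.factors.discrete import DiscreteFactor

mn = MarkovNetwork([("A", "B"), ("B", "C")])

# One factor per edge; values are unnormalized affinities phi(x, y),
# listed in row-major order over the joint states of the scope.
phi_ab = DiscreteFactor(["A", "B"], cardinality=[2, 2], values=[5, 1, 1, 5])
phi_bc = DiscreteFactor(["B", "C"], cardinality=[2, 2], values=[3, 2, 2, 3])
mn.add_factors(phi_ab, phi_bc)
assert mn.check_model()

# The Gibbs distribution is the normalized product of the factors;
# the normalizer Z is the partition function.
print(mn.get_partition_function())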

Session 7 – Exact Inference
Variable elimination, Belief propagation, Constructing a clique tree, MAP using variable elimination, Factor maximization, MAP using belief propagation, Finding the most probable assignment, Predictions from the model using pgmpy.
Study/HW resource: T1 – Ch3
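Continuing the Rain/WetGrass sketch above, exact inference and MAP queries in pgmpy look roughly like this; pgmpy's BeliefPropagation class offers the clique-tree alternative with the same query interface:

from pgmpy.inference import VariableElimination

infer = VariableElimination(model)

# Conditional query: P(Rain | WetGrass = 1)
print(infer.query(variables=["Rain"], evidence={"WetGrass": 1}))

# MAP query: the most probable assignment of Rain given the evidence
print(infer.map_query(variables=["Rain"], evidence={"WetGrass": 1}))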

Session 8 – Review of Sessions 1 to 7
Study/HW resource: Books, Web references and Slides

Session 9 – Approximate Inference
Exact inference as an optimization, Propagation-based approximation algorithms, Loopy belief propagation, Propagation with approximate messages, Sampling-based approximate methods, Markov chain Monte Carlo methods, Using a Markov chain.
Study/HW resource: T1 – Ch4, T2 – Ch7
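A minimal sketch of the sampling-based side of this session: draw forward samples from the Bayesian network built above and estimate a marginal by counting, instead of computing it exactly. The sample size is an arbitrary illustrative choice:

from pgmpy.sampling import BayesianModelSampling

sampler = BayesianModelSampling(model)
samples = sampler.forward_sample(size=10000)  # returns a pandas DataFrame

# Monte Carlo estimate of P(WetGrass = 1)
print((samples["WetGrass"] == 1).mean())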

Session 10 – Parameter Learning
General ideas in learning, Learning as an optimization, Maximum likelihood estimation, Parameter estimation in Bayesian networks, MLE for Bayesian networks.
Study/HW resource: T1 – Ch5, Ch6; T2 – Ch5

Session 11 – Parameter Learning (contd.)
Parameter estimation in Markov networks, MLE for Markov models.
Study/HW resource: T1 – Ch5, Ch6; T2 – Ch5
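A minimal sketch of MLE parameter learning in pgmpy, as in these two sessions: fix the structure, then estimate every CPD from data by counting. The six-row dataset is a made-up illustration:

import pandas as pd
from pgmpy.models import BayesianNetwork
from pgmpy.estimators import MaximumLikelihoodEstimator

data = pd.DataFrame({"Rain":     [0, 0, 1, 1, 0, 1],
                     "WetGrass": [0, 0, 1, 1, 1, 1]})

model = BayesianNetwork([("Rain", "WetGrass")])
model.fit(data, estimator=MaximumLikelihoodEstimator)  # counts -> CPD tables
for cpd in model.get_cpds():
    print(cpd)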

Session 12 – Structure Learning
Structure learning in Bayesian networks, Methods for learning structure, Constraint-based structure learning, Structure score learning, Bayesian score for Bayesian networks.
Study/HW resource: T1 – Ch5, Ch6; T2 – Ch4

Session 13 – Structure Learning (contd.)
Structure learning in Markov models, Constraint-based structure learning, Structure score learning.
Study/HW resource: T1 – Ch5, Ch6; T2 – Ch4
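A minimal sketch of score-based structure learning, reusing the DataFrame from the MLE sketch above: hill-climbing search over candidate DAGs guided by the BIC score (pgmpy's estimator and score class names have shifted across versions, so treat these as indicative):

from pgmpy.estimators import HillClimbSearch, BicScore

hc = HillClimbSearch(data)
best_dag = hc.estimate(scoring_method=BicScore(data))
print(best_dag.edges())  # the highest-scoring structure found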
Session 14 – Models: Naïve Bayes
Naïve Bayes model, Implementation.
Study/HW resource: T1 – Ch7
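One way to implement Session 14's model is to build Naïve Bayes directly as a Bayesian network in which the class variable is the sole parent of every feature, which encodes exactly its conditional-independence assumption; the variable names and training rows below are illustrative:

import pandas as pd
from pgmpy.models import BayesianNetwork
from pgmpy.inference import VariableElimination

# Class C is the only parent of each feature X1, X2
nb = BayesianNetwork([("C", "X1"), ("C", "X2")])
train = pd.DataFrame({"C":  [0, 0, 1, 1, 1, 0],
                      "X1": [0, 1, 1, 1, 0, 0],
                      "X2": [1, 0, 1, 1, 1, 0]})
nb.fit(train)  # maximum likelihood estimation is the default

# Classify by the MAP value of the posterior P(C | X1, X2)
print(VariableElimination(nb).map_query(["C"], evidence={"X1": 1, "X2": 1}))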

Session 15 – Models: Hidden Markov Model
Hidden Markov model, Implementation.
Study/HW resource: T1 – Ch7
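Since the HMM session centers on implementation, here is a plain-NumPy sketch of the forward algorithm, which computes the likelihood of an observation sequence by propagating alpha messages; the two-state transition and emission matrices are illustrative assumptions:

import numpy as np

pi = np.array([0.6, 0.4])          # initial state distribution
A = np.array([[0.7, 0.3],          # A[i, j] = P(state_t = j | state_{t-1} = i)
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],          # B[i, k] = P(obs_t = k | state_t = i)
              [0.2, 0.8]])

obs = [0, 1, 1, 0]                 # an observed symbol sequence

alpha = pi * B[:, obs[0]]          # alpha_1(i) = pi_i * B[i, o_1]
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]  # alpha_t = (alpha_{t-1} A) * B[:, o_t]

print(alpha.sum())                 # P(o_1, ..., o_T)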

Session 16 – Review of Sessions 9 to 15
Study/HW resource: Books, Web references and Slides

Detailed Plan for Lab Work

Lab No.  Lab Objective                           Lab Sheet Access URL  Session Reference
1        Bayesian model representation                                 Session 4
2        Markov model representation                                   Session 6
3        MAP on Bayesian model                                         Session 7
4        MLE on Bayesian model                                         Session 10
5        MLE on Markov model                                           Session 11
6        Learning structure in Bayesian model                          Session 12

Evaluation Scheme:
Legend: EC = Evaluation Component; AN = After Noon Session; FN = Fore Noon Session

EC No.  Name                Type         Duration  Weight  Day, Date, Session, Time
EC-1    Quizzes             Online                 10%
EC-2    Assignments         Take Home              20%
EC-3    Mid-Semester Test   Closed Book  1.5 Hrs   30%
EC-4    Comprehensive Exam  Open Book    2.5 Hrs   40%

Note:
Syllabus for Mid-Semester Test (Closed Book): Topics in Session Nos. 1 to 8
Syllabus for Comprehensive Exam (Open Book): All topics (Session Nos. 1 to 16)

Important links and information:

Elearn portal: https://elearn.bits-pilani.ac.in or Canvas

Students are expected to visit the Elearn portal on a regular basis and stay up to date
with the latest announcements and deadlines.

Contact sessions: Students should attend the online lectures as per the schedule provided
on the Elearn portal.

Evaluation Guidelines:
1. EC-1 consists of two Quizzes. Students will attempt them through the course
pages on the Elearn portal. Announcements will be made on the portal, in a
timely manner.
2. EC-2 consists of either one or two Assignments. Students will attempt them
through the course pages on the Elearn portal. Announcements will be made
on the portal, in a timely manner.
3. For Closed Book tests: No books or reference material of any kind will be
permitted.
4. For Open Book exams: Use of books and any printed / written reference
material (filed or bound) is permitted. However, loose sheets of paper will
not be allowed. Use of calculators is permitted in all exams. Laptops/Mobiles
of any kind are not allowed. Exchange of any material is not allowed.
5. If a student is unable to appear for the Regular Test/Exam due to genuine
exigencies, the student should follow the procedure to apply for the Make-Up
Test/Exam which will be made available on the Elearn portal. The Make-Up
Test/Exam will be conducted only at selected exam centres on the dates to be
announced later.

It shall be the responsibility of the individual student to be regular in maintaining
the self-study schedule as given in the course hand-out, attend the online lectures,
and take all the prescribed evaluation components such as Assignment/Quiz, Mid-
Semester Test and Comprehensive Exam according to the evaluation scheme
provided in the hand-out.
