Bayesian Data Analysis

Bayesian inference uses Bayes' theorem to update probabilities based on new evidence. It involves setting up prior and posterior distributions and evaluating model fit. Bayes' theorem describes the conditional probability of events given prior knowledge. The total probability rule and normalization are also important to Bayesian inference. Naive Bayes classifiers make independence assumptions between features to classify items. Bayesian networks visually represent conditional dependencies between random variables and are used for probability inference.

Uploaded by

amuthamukil

Bayesian data analysis deals with a set of practical methods for making inferences from the available data.


Three Steps
 Setting up the prior distribution
 Setting up the posterior distribution
 Evaluating the fit of the model
Bayesian inference
 It is a method of statistical inference in which Bayes’ theorem is used to update the probability of a hypothesis as new evidence becomes available.
 It depends on three important results:
 Bayes’ theorem
 Law of total probability
 Normalization
Bayes’ theorem
 In statistics and probability theory, Bayes’ theorem (also known as Bayes’ rule) is a mathematical formula used to determine the conditional probability of events. Essentially, Bayes’ theorem describes the probability of an event based on prior knowledge of conditions that might be relevant to the event.
Formula
 Bayes’ theorem is expressed by the following formula:

 P(A|B) = P(B|A) P(A) / P(B)

 Where:
 P(A|B) – the probability of event A occurring, given that event B has occurred
 P(B|A) – the probability of event B occurring, given that event A has occurred
 P(A) – the probability of event A
 P(B) – the probability of event B
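The formula above can be sketched in a few lines of Python. The scenario and all numbers here are assumed purely for illustration (a hypothetical diagnostic test), not taken from the slides:

```python
def bayes(p_b_given_a, p_a, p_b):
    """Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Hypothetical numbers: A = "has condition", B = "test positive"
p_a = 0.01              # prior P(A)
p_b_given_a = 0.95      # P(B|A), test sensitivity
p_b_given_not_a = 0.05  # P(B|A'), false-positive rate

# P(B) via the total probability rule (covered in the next slide)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

posterior = bayes(p_b_given_a, p_a, p_b)
print(round(posterior, 3))  # the posterior P(A|B) is ≈ 0.161
```

Note how a positive result raises the probability from the 1% prior to only about 16%, because false positives dominate when the prior is small.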
LAW OF TOTAL PROBABILITY
 The total probability rule (also called the law of total probability) breaks a probability calculation into distinct parts. It is used to find the probability of an event A when you do not know enough about A’s probabilities to calculate it directly. The total probability rule is:

 P(A) = P(A∩B) + P(A∩Bᶜ) = P(A|B)P(B) + P(A|Bᶜ)P(Bᶜ)
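A minimal sketch of the rule, using a hypothetical two-machine factory example (all numbers assumed for illustration): machine B produces 60% of parts with a 2% defect rate, and the remaining parts (Bᶜ) have a 5% defect rate.

```python
def total_probability(p_a_given_b, p_b, p_a_given_not_b):
    """Law of total probability over the partition {B, B complement}:
    P(A) = P(A|B)P(B) + P(A|Bc)P(Bc)."""
    return p_a_given_b * p_b + p_a_given_not_b * (1 - p_b)

# A = "part is defective"
p_defect = total_probability(0.02, 0.6, 0.05)
print(round(p_defect, 3))  # 0.02*0.6 + 0.05*0.4 ≈ 0.032
```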
NORMALIZATION
 The posterior probabilities of all competing hypotheses must sum to one:

 ∑i P(Hi | E) = 1

 H stands for any hypothesis whose probability may be affected by the evidence.
 The evidence E corresponds to new, unseen data.
 P(H) refers to the prior probability.
 P(H|E) refers to the posterior probability.
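Normalization in practice: the products P(E|Hi)P(Hi) are divided by their sum (the evidence P(E)) so the posteriors add to one. The two hypotheses and their numbers below are assumed purely for illustration:

```python
# Priors and likelihoods for two hypothetical competing hypotheses
priors = {"H1": 0.5, "H2": 0.5}
likelihoods = {"H1": 0.8, "H2": 0.2}  # P(E | Hi)

# Unnormalized posteriors: P(E|Hi) * P(Hi)
unnormalized = {h: likelihoods[h] * priors[h] for h in priors}

# Normalizing constant P(E) = sum over hypotheses
evidence = sum(unnormalized.values())

# Normalized posteriors sum to exactly one
posterior = {h: p / evidence for h, p in unnormalized.items()}
print(posterior)  # {'H1': 0.8, 'H2': 0.2}
```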
NAÏVE BAYES CLASSIFIER
Principle of the Naive Bayes classifier:
 A Naive Bayes classifier is a probabilistic machine learning model used for classification tasks. The crux of the classifier is based on Bayes’ theorem.
BAYES THEOREM
 P(A|B) = P(B|A) P(A) / P(B)

 Using Bayes’ theorem, we can find the probability of A happening, given that B has occurred. Here, B is the evidence and A is the hypothesis. The assumption made here is that the predictors/features are independent: the presence of one particular feature does not affect another. Hence it is called “naive”.
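A minimal from-scratch sketch of the idea, applied to the spam-filtering use case mentioned below. The tiny training corpus is invented for illustration; per-word likelihoods are multiplied independently (the “naive” step), with Laplace smoothing so unseen words do not zero out a class:

```python
import math
from collections import Counter

# Toy training data (assumed for illustration)
spam = ["win money now", "free money offer"]
ham = ["meeting at noon", "project status update"]
vocab = {w for d in spam + ham for w in d.split()}

def train(docs):
    """Word counts and total word count for one class."""
    counts = Counter(w for d in docs for w in d.split())
    return counts, sum(counts.values())

def log_likelihood(text, counts, total):
    """Sum of log P(word | class) with Laplace (add-one) smoothing;
    summing logs = multiplying probabilities, the naive independence step."""
    return sum(math.log((counts[w] + 1) / (total + len(vocab)))
               for w in text.split())

spam_counts, spam_total = train(spam)
ham_counts, ham_total = train(ham)

msg = "free money"
# log P(class) + sum of log P(word | class); equal priors of 0.5 each
score_spam = math.log(0.5) + log_likelihood(msg, spam_counts, spam_total)
score_ham = math.log(0.5) + log_likelihood(msg, ham_counts, ham_total)
print("spam" if score_spam > score_ham else "ham")  # prints "spam"
```

Working in log space avoids numerical underflow when many small per-word probabilities are multiplied.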
ADVANTAGES
 It is simple and fast to train.
 Prediction is straightforward.
APPLICATIONS
 Spam mail filtering, sentiment analysis, news article categorization.
BAYESIAN NETWORKS
It has the following features:
 Specifies which conditional independence
assumptions are valid.
 Provides sets of conditional probabilities to specify the
joint probability distributions wherever dependencies
exist.
BAYESIAN BELIEF NETWORKS (BBN)
 A BBN, or simply Bayesian network, is a statistical model used to describe the conditional dependencies between different random variables.
EXAMPLE
 [Diagram: LUCKY and STUDY are parent nodes of GOOD GRADES, which in turn points to HIGHER STUDIES and JOB.]
Purpose of the BBN model
 Determine the posterior probability distribution for a set of query variables given a set of observed events. Two important results are used to perform inference in a BBN:
 The first is the chain rule, which is derived from repeated application of the product rule.
 The other is a lemma that can be used to introduce new variables when there is no apparent dependency among the variables.
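The chain rule can be sketched on the Lucky/Study example network: the joint distribution factors as P(L)P(S)P(G|L,S)P(H|G)P(J|G), and a posterior such as P(Study | Good Grades) follows by enumerating and normalizing. Every probability table below is assumed purely for illustration:

```python
from itertools import product

# Hypothetical conditional probability tables (numbers assumed)
P_lucky = 0.2
P_study = 0.6
P_grades = {(True, True): 0.95, (True, False): 0.5,   # P(G | L, S)
            (False, True): 0.8, (False, False): 0.1}
P_higher = {True: 0.7, False: 0.2}                    # P(H | G)
P_job = {True: 0.8, False: 0.3}                       # P(J | G)

def joint(l, s, g, h, j):
    """Chain rule along the network: P(L,S,G,H,J) =
    P(L) P(S) P(G|L,S) P(H|G) P(J|G)."""
    p = (P_lucky if l else 1 - P_lucky) * (P_study if s else 1 - P_study)
    p *= P_grades[(l, s)] if g else 1 - P_grades[(l, s)]
    p *= P_higher[g] if h else 1 - P_higher[g]
    p *= P_job[g] if j else 1 - P_job[g]
    return p

def marginal(**evidence):
    """Sum the joint over all assignments consistent with the evidence."""
    return sum(joint(l, s, g, h, j)
               for l, s, g, h, j in product([True, False], repeat=5)
               if all(dict(zip("lsghj", (l, s, g, h, j)))[k] == v
                      for k, v in evidence.items()))

# Posterior P(study | good grades) by enumeration
print(round(marginal(s=True, g=True) / marginal(g=True), 3))  # ≈ 0.874
```

Enumeration is exponential in the number of variables, which is why practical BBN inference uses the network's factorization (e.g. variable elimination) rather than the full joint.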
