Gibbs Algorithm
• The Bayes optimal classifier gives the best possible
classification performance, but it can be quite costly
to apply when there are many hypotheses, because it
computes the posterior probability of every hypothesis
in H and then combines the predictions of all the
hypotheses to classify each new instance.
• The Gibbs algorithm is a less costly (and less
optimal) alternative to the Bayes optimal classifier.
• It is defined as follows:
1. Choose a hypothesis h from H at random,
according to the posterior probability
distribution over H.
2. Use h to predict the classification of the next
instance x.
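The two steps above can be sketched in Python. The hypotheses, their posterior weights, and the helper name `gibbs_predict` are hypothetical, chosen only to illustrate the procedure:

```python
import random

def gibbs_predict(hypotheses, posterior, x, rng=random):
    """Gibbs algorithm: sample one hypothesis h from H according to
    the posterior P(h | D), then use h alone to classify x."""
    # Step 1: choose h at random, weighted by its posterior probability.
    h = rng.choices(hypotheses, weights=posterior, k=1)[0]
    # Step 2: use the sampled hypothesis to predict the label of x.
    return h(x)

# Hypothetical hypothesis space: three threshold classifiers.
H = [lambda x: int(x > 1.0),
     lambda x: int(x > 2.0),
     lambda x: int(x > 3.0)]
P = [0.5, 0.3, 0.2]  # assumed posterior P(h | D); must sum to 1

print(gibbs_predict(H, P, 2.5))  # prediction varies with the sampled h
```

Unlike the Bayes optimal classifier, only one hypothesis is evaluated per instance, which is what makes the method cheap.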
• Given a new instance to classify, the Gibbs
algorithm simply applies a hypothesis drawn at
random according to the current posterior
probability distribution.
• Surprisingly, under certain conditions the
expected misclassification error of the Gibbs
algorithm is at most twice the expected error of
the Bayes optimal classifier.
• More precisely, the expectation is taken over
target concepts drawn at random according to
the prior probability distribution assumed by
the learner.
• Limitation: in the worst case, the expected
error of the Gibbs algorithm is twice the
expected error of the Bayes optimal classifier:
E[error_Gibbs] ≤ 2 E[error_BayesOptimal]
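The bound can be checked numerically in a toy setting. Assume (hypothetically) a single binary prediction where p is the posterior probability that the correct label is 1, and the true target is itself drawn from the same posterior the learner holds. Then the Bayes optimal classifier predicts the majority label, while the Gibbs prediction and the truth are drawn independently:

```python
# Toy check of E[error_Gibbs] <= 2 * E[error_BayesOptimal]
# for one binary prediction under a self-consistent posterior.
def expected_errors(p):
    """p = posterior probability that the correct label is 1."""
    bayes = min(p, 1 - p)      # Bayes optimal predicts the majority label
    gibbs = 2 * p * (1 - p)    # sampled prediction disagrees with sampled truth
    return gibbs, bayes

for p in [0.5, 0.6, 0.8, 0.95]:
    g, b = expected_errors(p)
    print(f"p={p}: Gibbs={g:.3f}, Bayes={b:.3f}, ratio={g/b:.2f}")
```

The ratio approaches 2 as the posterior becomes confident (p near 1), which is exactly the worst case the bound allows; at p = 0.5 the two errors coincide.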