
Kullback-Leibler Divergence

Reference work entry in the International Encyclopedia of Statistical Science

Kullback-Leibler divergence (Kullback and Leibler 1951) is an information-based measure of disparity among probability distributions. Given distributions P and Q defined over X, with P absolutely continuous with respect to Q, the Kullback-Leibler divergence of Q from P is the P-expectation of \(-\log_{2}(Q/P)\). So, \(D_{KL}(P,Q) = -\int_{X}\log_{2}(Q(x)/P(x))\,dP\). This quantity can be seen as the difference between the cross-entropy of Q on P, \(H(P,Q) = -\int_{X}\log_{2}(Q(x))\,dP\), and the self-entropy (Shannon 1948) of P, \(H(P) = H(P,P) = -\int_{X}\log_{2}(P(x))\,dP\). Since H(P, Q) is the P-expectation of the number of bits needed to identify points in X when they are encoded using Q, \(D_{KL}(P,Q) = H(P,Q) - H(P)\) is the expected difference, from the perspective of P, between the cost of encoding with Q and the cost of encoding with P.
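
To make the definitions concrete, here is a minimal numerical sketch, not taken from the entry itself, that computes these quantities for two illustrative discrete distributions P and Q over a three-point space; the particular probability values and the base-2 logarithm are assumptions chosen only for illustration.

    import math

    # Illustrative distributions over a three-point space X (assumed values)
    P = [0.5, 0.25, 0.25]
    Q = [0.25, 0.25, 0.5]

    def cross_entropy(p, q):
        # H(P, Q) = -sum_x P(x) * log2(Q(x)); requires Q(x) > 0 wherever P(x) > 0
        return -sum(px * math.log2(qx) for px, qx in zip(p, q) if px > 0)

    def kl_divergence(p, q):
        # D_KL(P, Q) = sum_x P(x) * log2(P(x) / Q(x))
        return sum(px * math.log2(px / qx) for px, qx in zip(p, q) if px > 0)

    H_P = cross_entropy(P, P)    # self-entropy H(P) = H(P, P) = 1.5 bits
    H_PQ = cross_entropy(P, Q)   # cross-entropy H(P, Q) = 1.75 bits

    print(kl_divergence(P, Q))   # 0.25 bits
    print(H_PQ - H_P)            # 0.25 bits, matching D_KL(P, Q) = H(P, Q) - H(P)

In this example the decomposition holds exactly: the divergence of Q from P is 0.25 bits, the gap between the 1.75 bits expected under the Q-based code and the 1.5 bits expected under the P-based code.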

\(D_{KL}\) has a number of features that make it plausible as a measure of probabilistic divergence. Here are some of its...


References and Further Reading

  • Akaike H (1973) Information theory and an extension of the maximum likelihood principle. In: Petrov BN, Csaki F (eds) Proceedings of the international symposium on information theory. Budapest, Akademiai Kiado

  • De Groot M (1962) Uncertainty, information, and sequential experiments. Ann Math Stat 33:404–419

  • Diaconis P, Zabell S (1982) Updating subjective probability. J Am Stat Assoc 77:822–830

  • Jeffrey R (1965) The logic of decision. McGraw-Hill, New York

  • Kullback S, Leibler RA (1951) On information and sufficiency. Ann Math Stat 22:79–86

  • Lindley DV (1956) On a measure of the information provided by an experiment. Ann Math Stat 27:986–1005

  • Shannon CE (1948) A mathematical theory of communication. AT&T Tech J 27:379–423, 623–656

  • Sober E (2002) Instrumentalism, parsimony, and the Akaike framework. Philos Sci 69:S112–S123


Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

Cite this entry

Joyce, J.M. (2011). Kullback-Leibler Divergence. In: Lovric, M. (eds) International Encyclopedia of Statistical Science. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04898-2_327
