An Entity of Type: software, from Named Graph: http://dbpedia.org, within Data Space: dbpedia.org

An information diagram is a type of Venn diagram used in information theory to illustrate relationships among Shannon's basic measures of information: entropy, joint entropy, conditional entropy and mutual information. Information diagrams are a useful pedagogical tool for teaching and learning about these basic measures of information. Information diagrams have also been applied to specific problems, such as displaying the information-theoretic similarity between sets of ontological terms.
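The two-variable relationships the abstract describes can be checked numerically. The following is a minimal sketch (not part of the DBpedia record): it computes the entropies of an arbitrary illustrative joint distribution `p` over two binary variables and verifies the Venn-style identities H(X|Y) = H(X,Y) - H(Y) and I(X;Y) = H(X) + H(Y) - H(X,Y).

```python
import math

# Illustrative joint pmf p(x, y) over two binary variables; any valid pmf works.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(dist):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(q * math.log2(q) for q in dist.values() if q > 0)

# Marginal distributions of X and Y.
px, py = {}, {}
for (x, y), q in p.items():
    px[x] = px.get(x, 0) + q
    py[y] = py.get(y, 0) + q

Hx, Hy, Hxy = H(px), H(py), H(p)
H_x_given_y = Hxy - Hy        # H(X|Y): the "red" region of the two-circle diagram
I_xy = Hx + Hy - Hxy          # I(X;Y): the "violet" overlap

# The left circle H(X) decomposes into H(X|Y) plus the overlap I(X;Y).
assert abs(Hx - (H_x_given_y + I_xy)) < 1e-12
```

With this particular pmf both marginals are uniform, so H(X) = H(Y) = 1 bit and the mutual information is strictly positive.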

Property Value
dbo:abstract
  • An information diagram is a type of Venn diagram used in information theory to illustrate relationships among Shannon's basic measures of information: entropy, joint entropy, conditional entropy and mutual information. Information diagrams are a useful pedagogical tool for teaching and learning about these basic measures of information. Information diagrams have also been applied to specific problems, such as displaying the information-theoretic similarity between sets of ontological terms. * Venn diagram showing additive and subtractive relationships among various information measures associated with correlated variables X and Y. The area contained by both circles is the joint entropy H(X,Y). The circle on the left (red and violet) is the individual entropy H(X), with the red being the conditional entropy H(X|Y). The circle on the right (blue and violet) is H(Y), with the blue being H(Y|X). The violet is the mutual information I(X;Y). * Venn diagram of information theoretic measures for three variables x, y, and z. Each circle represents an individual entropy: H(x) is the lower left circle, H(y) the lower right, and H(z) is the upper circle. The intersection of any two circles represents the mutual information for the two associated variables (e.g. I(x;y) is yellow and gray). The union of any two circles is the joint entropy for the two associated variables (e.g. H(x,y) is everything but green). The joint entropy H(x,y,z) of all three variables is the union of all three circles. It is partitioned into 7 pieces: red, blue, and green being the conditional entropies H(x|y,z), H(y|x,z), H(z|x,y) respectively; yellow, magenta and cyan being the conditional mutual informations I(x;y|z), I(x;z|y) and I(y;z|x) respectively; and gray being the interaction information I(x;y;z). The interaction information is the only one of these that may be negative. (en)
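The claim that only the interaction information (the gray central region) may be negative can be demonstrated with the classic XOR construction. This sketch is illustrative, not part of the DBpedia record; it uses the inclusion-exclusion sign convention in which I(x;y;z) = I(x;y) - I(x;y|z), under which XOR yields -1 bit (some authors define interaction information with the opposite sign).

```python
import math

def H(dist):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(q * math.log2(q) for q in dist.values() if q > 0)

def marginal(dist, idxs):
    """Marginal pmf over the coordinates listed in idxs."""
    out = {}
    for outcome, q in dist.items():
        key = tuple(outcome[i] for i in idxs)
        out[key] = out.get(key, 0) + q
    return out

# XOR example: x and y are independent fair bits, z = x XOR y.
p = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}

Hx, Hy, Hz = (H(marginal(p, (i,))) for i in (0, 1, 2))
Hxy, Hxz, Hyz = H(marginal(p, (0, 1))), H(marginal(p, (0, 2))), H(marginal(p, (1, 2)))
Hxyz = H(p)

I_xy = Hx + Hy - Hxy                    # pairwise mutual information: 0 bits
I_xy_given_z = Hxz + Hyz - Hz - Hxyz    # conditional mutual information: 1 bit

# Interaction information via inclusion-exclusion over the three circles.
I3 = Hx + Hy + Hz - Hxy - Hxz - Hyz + Hxyz
assert abs(I3 - (I_xy - I_xy_given_z)) < 1e-12
```

Here knowing z creates dependence between x and y that is absent marginally, so the gray region comes out to -1 bit, while every other region of the diagram is non-negative.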
dbo:thumbnail
dbo:wikiPageID
  • 18372173 (xsd:integer)
dbo:wikiPageLength
  • 3339 (xsd:nonNegativeInteger)
dbo:wikiPageRevisionID
  • 1062726719 (xsd:integer)
dbo:wikiPageWikiLink
dbp:wikiPageUsesTemplate
dcterms:subject
gold:hypernym
rdf:type
rdfs:comment
  • An information diagram is a type of Venn diagram used in information theory to illustrate relationships among Shannon's basic measures of information: entropy, joint entropy, conditional entropy and mutual information. Information diagrams are a useful pedagogical tool for teaching and learning about these basic measures of information. Information diagrams have also been applied to specific problems, such as displaying the information-theoretic similarity between sets of ontological terms. (en)
rdfs:label
  • Information diagram (en)
owl:sameAs
prov:wasDerivedFrom
foaf:depiction
foaf:isPrimaryTopicOf
is dbo:wikiPageRedirects of
is dbo:wikiPageWikiLink of
is foaf:primaryTopic of
This content was extracted from Wikipedia and is licensed under the Creative Commons Attribution-ShareAlike 3.0 Unported License