Entropy and Information Theory
August 1990
Publisher:
  • Springer-Verlag, Berlin, Heidelberg
ISBN: 978-0-387-97371-5
Published: 01 August 1990
Pages: 332
Abstract

No abstract available.

Cited By

  1. Adil Khan M, Ullah H, Saeed T, Sayed Z, Alshaikey S, Mahmoud E and Busiello D (2024). Determination of Novel Estimations for the Slater Difference and Applications, Complexity, 2024, Online publication date: 1-Jan-2024.
  2. Theocharous A, Gregoriou G, Sapountzis P and Kontoyiannis I (2024). Temporally Causal Discovery Tests for Discrete Time Series and Neural Spike Trains, IEEE Transactions on Signal Processing, 72, (1333-1347), Online publication date: 1-Jan-2024.
  3. Thungtong A, Scher M and Loparo K Neurodevelopment in newborns as quantified by synchronization in the Electroencephalogram 2016 IEEE Conference on Computational Intelligence in Bioinformatics and Computational Biology (CIBCB), (1-6)
  4. Lyubushin A (2019). Field of coherence of GPS-measured earth tremors, GPS Solutions, 23:4, (1-13), Online publication date: 1-Oct-2019.
  5. Ton That D, Wagner J, Rasin A and Malik T (2019). PLI+, Distributed and Parallel Databases, 37:1, (177-208), Online publication date: 1-Mar-2019.
  6. Kara A and Yüksel S Robustness to Incorrect System Models in Stochastic Control and Application to Data-Driven Learning 2018 IEEE Conference on Decision and Control (CDC), (2753-2758)
  7. Devran Kara A and Yüksel S Robustness to Incorrect Priors in Infinite Horizon Stochastic Control 2018 IEEE Conference on Decision and Control (CDC), (5765-5770)
  8. Lim S, Feng C, Pastore A, Nazer B and Gastpar M (2018). A Joint Typicality Approach to Compute–Forward, IEEE Transactions on Information Theory, 64:12, (7657-7685), Online publication date: 1-Dec-2018.
  9. Asadi D and Atkins E (2018). Multi-Objective Weight Optimization for Trajectory Planning of an Airplane with Structural Damage, Journal of Intelligent and Robotic Systems, 91:3-4, (667-690), Online publication date: 1-Sep-2018.
  10. Koliander G, Schuhmacher D and Hlawatsch F (2018). Rate-Distortion Theory of Finite Point Processes, IEEE Transactions on Information Theory, 64:8, (5832-5861), Online publication date: 1-Aug-2018.
  11. Elmoslimany A and Duman T (2018). On the Discreteness of Capacity-Achieving Distributions for Fading and Signal-Dependent Noise Channels With Amplitude-Limited Inputs, IEEE Transactions on Information Theory, 64:2, (1163-1177), Online publication date: 1-Feb-2018.
  12. Mahmoodzadeh Z, Balali S and Mosleh A Entropy Based Method for Identification of Leading Risk Indicators 2018 Annual Reliability and Maintainability Symposium (RAMS), (1-6)
  13. Gehrig S, Schneider N, Stalder R and Franke U (2017). Stereo vision during adverse weather Using priors to increase robustness in real-time stereo vision, Image and Vision Computing, 68:C, (28-39), Online publication date: 1-Dec-2017.
  14. Mao W and Hassibi B (2017). Capacity Analysis of Discrete Energy Harvesting Channels, IEEE Transactions on Information Theory, 63:9, (5850-5885), Online publication date: 1-Sep-2017.
  15. Mutlu B, Veas E and Trattner C Tags, Titles or Q&As? Proceedings of the 28th ACM Conference on Hypertext and Social Media, (265-274)
  16. Tutuncuoglu K, Ozel O, Yener A and Ulukus S (2017). The Binary Energy Harvesting Channel With a Unit-Sized Battery, IEEE Transactions on Information Theory, 63:7, (4240-4256), Online publication date: 1-Jul-2017.
  17. Ghattas B, Michel P and Boyer L (2017). Clustering nominal data using unsupervised binary decision trees, Pattern Recognition, 67:C, (177-185), Online publication date: 1-Jul-2017.
  18. Dörpinghaus M, Roldán É, Neri I, Meyr H and Jülicher F An information theoretic analysis of sequential decision-making 2017 IEEE International Symposium on Information Theory (ISIT), (3050-3054)
  19. Coleman T Dynamical systems, ergodicity, and posterior matching 2017 IEEE International Symposium on Information Theory (ISIT), (2678-2682)
  20. Mao W, Diggavi S and Kannan S Models and information-theoretic bounds for nanopore sequencing 2017 IEEE International Symposium on Information Theory (ISIT), (2458-2462)
  21. Ramos S, Gehrig S, Pinggera P, Franke U and Rother C Detecting unexpected obstacles for self-driving cars: Fusing deep learning and geometric modeling 2017 IEEE Intelligent Vehicles Symposium (IV), (1025-1032)
  22. Ganor A, Kol G and Raz R (2016). Exponential Separation of Information and Communication for Boolean Functions, Journal of the ACM, 63:5, (1-31), Online publication date: 20-Dec-2016.
  23. Acharya U, Chowriappa P, Fujita H, Bhat S, Dua S, Koh J, Eugene L, Kongmebhol P and Ng K (2016). Thyroid lesion classification in 242 patient population using Gabor transform features from high resolution ultrasound images, Knowledge-Based Systems, 107:C, (235-245), Online publication date: 1-Sep-2016.
  24. Phan-Minh Nguyen and Armand M (2015). On Capacity Formulation With Stationary Inputs and Application to a Bit-Patterned Media Recording Channel Model, IEEE Transactions on Information Theory, 61:11, (5906-5930), Online publication date: 1-Nov-2015.
  25. Holena M, Bajer L and Scavnicky M (2015). Using Copulas in Data Mining Based on the Observational Calculus, IEEE Transactions on Knowledge and Data Engineering, 27:10, (2851-2864), Online publication date: 1-Oct-2015.
  26. Leonard P and Jackson D Efficient Evolution of High Entropy RNGs Using Single Node Genetic Programming Proceedings of the 2015 Annual Conference on Genetic and Evolutionary Computation, (1071-1078)
  27. Silva J and Derpich M (2015). On the Characterization of $\ell_p$-Compressible Ergodic Sequences, IEEE Transactions on Signal Processing, 63:11, (2915-2928), Online publication date: 1-Jun-2015.
  28. Duchi J, Jordan M and Wainwright M (2014). Privacy Aware Learning, Journal of the ACM, 61:6, (1-57), Online publication date: 17-Dec-2014.
  29. Zand A, Vigna G, Yan X and Kruegel C Extracting probable command and control signatures for detecting botnets Proceedings of the 29th Annual ACM Symposium on Applied Computing, (1657-1662)
  30. Duchi J, Jordan M and Wainwright M Privacy aware learning Proceedings of the 26th International Conference on Neural Information Processing Systems - Volume 1, (1430-1438)
  31. Sbert M, Feixas M, Viola I, Rigau J and Chover M Information theory in computer graphics and visualization SIGGRAPH Asia 2011 Courses, (1-58)
  32. Oh J, Kim T, Park S, Han W and Yu H Dynamic concept ontology construction for PubMed queries Proceedings of the ACM fourth international workshop on Data and text mining in biomedical informatics, (65-66)
  33. Silva J and Narayanan S (2010). Nonproduct data-dependent partitions for mutual information estimation, IEEE Transactions on Signal Processing, 58:7, (3497-3511), Online publication date: 1-Jul-2010.
  34. Effros M, Goldsmith A and Liang Y (2010). Generalizing capacity, IEEE Transactions on Information Theory, 56:7, (3069-3087), Online publication date: 1-Jul-2010.
  35. Timo R, Blackmore K and Hanlen L (2010). Word-valued sources, IEEE Transactions on Information Theory, 56:7, (3139-3148), Online publication date: 1-Jul-2010.
  36. Yang E and He D (2010). Interactive encoding and decoding for one way learning, IEEE Transactions on Information Theory, 56:4, (1808-1824), Online publication date: 1-Apr-2010.
  37. Da Cunha A, Do M and Vetterli M (2010). On the information rates of the plenoptic function, IEEE Transactions on Information Theory, 56:3, (1306-1321), Online publication date: 1-Mar-2010.
  38. Parker A, Dimitrov A and Gedeon T (2010). Symmetry breaking in soft clustering decoding of neural codes, IEEE Transactions on Information Theory, 56:2, (901-927), Online publication date: 1-Feb-2010.
  39. Tsai C, Hsieh C and Lew K Detection of wind turbine blades damage by spectrum-recognition using Gaussian wavelet-entropy Proceedings of the 3rd international conference on Anti-Counterfeiting, security, and identification in communication, (108-113)
  40. Silva J and Narayanan S (2009). Discriminative wavelet packet filter bank selection for pattern recognition, IEEE Transactions on Signal Processing, 57:5, (1796-1810), Online publication date: 1-May-2009.
  41. Schönhuth A and Jaeger H (2009). Characterization of ergodic hidden Markov sources, IEEE Transactions on Information Theory, 55:5, (2107-2118), Online publication date: 1-May-2009.
  42. Schönhuth A (2009). On analytic properties of entropy rate, IEEE Transactions on Information Theory, 55:5, (2119-2127), Online publication date: 1-May-2009.
  43. Raginsky M (2009). Joint universal lossy coding and identification of stationary mixing sources with general alphabets, IEEE Transactions on Information Theory, 55:5, (1945-1960), Online publication date: 1-May-2009.
  44. Sethuraman V, Wang L, Hajek B and Lapidoth A (2009). Low-SNR capacity of noncoherent fading channels, IEEE Transactions on Information Theory, 55:4, (1555-1574), Online publication date: 1-Apr-2009.
  45. Chen X and Schmid N (2009). Empirical capacity of a recognition channel for single-and multipose object recognition under the constraint of PCA encoding, IEEE Transactions on Image Processing, 18:3, (636-651), Online publication date: 1-Mar-2009.
  46. Warnquist H, Nyberg M and Säby P Troubleshooting when Action Costs are Dependent with Application to a Truck Engine Proceedings of the 2008 conference on Tenth Scandinavian Conference on Artificial Intelligence: SCAI 2008, (68-75)
  47. Assent I, Krieger R, Welter P, Herbers J and Seidl T SubClass Proceedings of the 12th Pacific-Asia conference on Advances in knowledge discovery and data mining, (40-52)
  48. Gedeon T, Parker A, Campion C and Aldworth Z (2008). Annealing and the normalized N-cut, Pattern Recognition, 41:2, (592-606), Online publication date: 1-Feb-2008.
  49. Dukkipati A, Bhatnagar S and Narasimha Murty M (2007). Gelfand-Yaglom-Perez theorem for generalized relative entropy functionals, Information Sciences: an International Journal, 177:24, (5707-5714), Online publication date: 20-Dec-2007.
  50. Jain A, Chang E and Wang Y Bayesian reasoning for sensor group-queries and diagnosis Proceedings of the 12th international conference on Database systems for advanced applications, (522-538)
  51. Tian Y, Yang Q, Huang T, Ling C and Gao W (2006). Learning Contextual Dependency Network Models for Link-Based Classification, IEEE Transactions on Knowledge and Data Engineering, 18:11, (1482-1496), Online publication date: 1-Nov-2006.
  52. Thomassey S and Fiordaliso A (2006). A hybrid sales forecasting system based on clustering and decision trees, Decision Support Systems, 42:1, (408-421), Online publication date: 1-Oct-2006.
  53. Gupta M, Gray R and Olshen R (2006). Nonparametric Supervised Learning by Linear Interpolation with Maximum Entropy, IEEE Transactions on Pattern Analysis and Machine Intelligence, 28:5, (766-781), Online publication date: 1-May-2006.
  54. Aur D, Connolly C and Jog M (2006). Computing Information in Neuronal Spikes, Neural Processing Letters, 23:2, (183-199), Online publication date: 1-Apr-2006.
  55. Charoensak C and Sattar F (2005). Design of low-cost FPGA hardware for real-time ICA-based blind source separation algorithm, EURASIP Journal on Advances in Signal Processing, 2005, (3076-3086), Online publication date: 1-Jan-2005.
  56. Le Saux B and Amato G Image recognition for digital libraries Proceedings of the 6th ACM SIGMM international workshop on Multimedia information retrieval, (91-98)
  57. Cohen I, Sebe N, Garg A, Chen L and Huang T (2003). Facial expression recognition from video sequences, Computer Vision and Image Understanding, 91:1-2, (160-187), Online publication date: 1-Jul-2003.
  58. Yegneswaran V, Barford P and Ullrich J Internet intrusions Proceedings of the 2003 ACM SIGMETRICS international conference on Measurement and modeling of computer systems, (138-147)
  59. Yegneswaran V, Barford P and Ullrich J (2003). Internet intrusions, ACM SIGMETRICS Performance Evaluation Review, 31:1, (138-147), Online publication date: 10-Jun-2003.
  60. Gokcay E and Principe J (2002). Information Theoretic Clustering, IEEE Transactions on Pattern Analysis and Machine Intelligence, 24:2, (158-171), Online publication date: 1-Feb-2002.
  61. Parker A, Gedeon T and Dimitrov A Annealing and the Rate Distortion problem Proceedings of the 16th International Conference on Neural Information Processing Systems, (993-976)
  62. Barford P, Bestavros A, Byers J and Crovella M On the marginal utility of network topology measurements Proceedings of the 1st ACM SIGCOMM Workshop on Internet measurement, (5-17)
  63. Berger F, Van Bommel P and Van Der Weide T (1999). Ranking Strategies for Navigation Based Query Formulation, Journal of Intelligent Information Systems, 12:1, (5-25), Online publication date: 1-Apr-1999.
  64. Raz R A parallel repetition theorem Proceedings of the twenty-seventh annual ACM symposium on Theory of computing, (447-456)
  65. ElMoslimany A and Duman T On the capacity of fading channels with amplitude-limited inputs 2016 IEEE International Symposium on Information Theory (ISIT), (1879-1883)
Contributors
  • Robert M. Gray, Stanford University

Reviews

Vladik Ya. Kreinovich

Before Shannon, the word “information” was often used to express vague humanistic notions. Shannon was the first to give this notion a precise and intuitively clear definition: crudely speaking, information is the average number of binary (yes/no) questions that we have to ask in order to make our knowledge complete. For example, if we know only that an experiment can have $n$ possible outputs, then we must ask at least $\log_2 n$ binary questions in order to determine the actual output. So we need at least $\log_2 n$ bits of computer memory to store our knowledge about the actual output (we can store a binary representation of its number). Therefore we say that when someone tells us the actual result, we gain $\log_2 n$ bits of information.

The situation is somewhat different when we also know a priori the probabilities $p_1, \ldots, p_n$ of the possible outputs. If we have only one experiment, all $n$ outputs are possible, so we still need $\log_2 n$ binary questions to determine which of the $n$ results occurred. If we have $N$ equivalent situations (with the same probabilities), however, not all $n^N$ combinations of $N$ outputs are possible: the combinations are limited by the demand that the answer should be $i$ in approximately $p_i N$ cases. Therefore the number $Q(N)$ of binary questions that we need to ask in order to determine the outputs of all the experiments is generally smaller than $\log_2 n^N = N \log_2 n$, so the average number of questions $Q(N)/N$ is smaller than $\log_2 n$. Shannon showed that as $N$ increases, this average number of questions tends to the value $-\sum_i p_i \log_2 p_i$ (used in statistical physics under the name of entropy).

This fundamental result gives an explicit formula for the amount of information when we have a sequence of independent identical events. In real life these events can be correlated, and their probabilities can change with time. How to compute the information then is a question for information theory; the answers are often related to mathematical entropy theory (initially developed for statistical physics). A related question is: if we have correlated random processes $X$ and $Y$, and we know the results of $Y$, how many binary questions do we have to ask (on average) in order to determine the results of $X$? (Because of the correlation, we can often ask fewer questions than when we do not know the results of $Y$.)
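To make these quantities concrete, here is a minimal Python sketch (an editorial illustration, not code from the book; the joint distribution and all names are made up). It computes the entropy $-\sum_i p_i \log_2 p_i$ and the conditional entropy $H(X|Y) = H(X,Y) - H(Y)$, the average number of questions still needed about $X$ once $Y$ is known.

    import math

    def entropy(p):
        # Shannon entropy H = -sum_i p_i * log2(p_i), in bits.
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    # A fair 4-outcome experiment needs log2(4) = 2 questions per trial,
    # but a skewed one needs fewer on average.
    print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
    print(entropy([0.7, 0.1, 0.1, 0.1]))      # ~1.357

    # Correlated X and Y: an illustrative joint distribution p(x, y).
    joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
    p_x = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
    p_y = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]

    # H(X|Y) = H(X,Y) - H(Y): questions still needed about X once Y is known.
    h_x_given_y = entropy(list(joint.values())) - entropy(p_y)
    print(entropy(p_x), h_x_given_y)  # 1.0 vs ~0.722: knowing Y saves questions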
Another class of questions is related to information transfer. If we have a channel whose output is distorted by noise, and the probabilities of different distortions are known, then the output contains less information than the input. Since the distortion is of some specific type, we can often decrease the information loss by properly encoding the input signal and correspondingly decoding the output signal, sometimes at the expense of slowing down the information transfer, as when we simply repeat every message (see the repetition-code sketch after this review). Natural questions are: what minimal loss of information can we achieve for a given channel by using a proper encoding, and what encoding should we use? These questions also turn out to be closely connected to entropy theory.

Gray's book gives a survey of the known results and provides complete proofs for most of them. The book is not easy reading: it is an advanced mathematical text aimed at those who feel comfortable with mathematical papers. For these readers, the book is a gift: it contains everything that was previously spread over papers, preprints, and conference proceedings. The prerequisites are standard results on probability, random processes, and ergodic theory that you can also find under one cover in a previous book by the author [1].

From the applications viewpoint, one of the most promising tendencies in current information theory is the analysis of so-called robust codes (see Section 12.5). Traditional information theory is based on the assumption that we know the precise probabilities of all possible distortions, while normally we have only estimates of them. So when we apply traditional methods to these estimates and get an optimal coding, the resulting behavior may be far from optimal in the real situation, where the probabilities are somewhat different. Methods that work well for all possible probabilities within a given precision range are called robust.

From the computer science viewpoint, certain areas are missing. First, the results presented in this book show only what can be achieved in principle by coding; coding algorithms are rarely given. Algorithms are promised, however, in the author's forthcoming book [2]. Second, Gray makes no mention of the Kolmogorov-Solomonoff algorithmic approach to probability. Conventional probability theory predicts the probabilities of different output sequences, but it does not explain what it means for a given sequence of experimental results to be “random,” so it does not explain when a sequence of observations is consistent with an assumption about probabilities. Algorithmic theory describes the notion of a random sequence (or a random process) in algorithmic terms, and this approach has been successfully applied to information theory. The third omission (related to robust methods) is the attempts to generalize information theory to the case when some knowledge is not statistical but subjective (as in fuzzy theory). These additions would demand an increase in the book's size, but I would welcome at least some references.

This book is not easy to read, with condensed proofs and no exercises. For a mathematically minded computer scientist, however, the challenge is rewarding.
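The repetition trade-off mentioned in the review can be made concrete with a small Python sketch (an editorial illustration under assumed parameters, not a construction from the book). A 3-fold repetition code over a binary symmetric channel with an assumed crossover probability $p = 0.1$ cuts the bit-error rate roughly from $p$ to $3p^2(1-p) + p^3$, at the cost of tripling the transmission time.

    import random

    def bsc(bits, p_flip, rng):
        # Binary symmetric channel: flip each bit independently with prob p_flip.
        return [b ^ (rng.random() < p_flip) for b in bits]

    def encode(bits):
        # 3-fold repetition: send each bit three times (rate drops to 1/3).
        return [b for b in bits for _ in range(3)]

    def decode(received):
        # Majority vote over each block of three received bits.
        return [int(sum(received[i:i + 3]) >= 2)
                for i in range(0, len(received), 3)]

    rng = random.Random(0)
    message = [rng.randint(0, 1) for _ in range(10000)]
    p = 0.1  # assumed crossover probability

    uncoded = bsc(message, p, rng)
    coded = decode(bsc(encode(message), p, rng))

    # Uncoded bit-error rate is ~p = 0.1; with repetition it falls to
    # ~3*p**2*(1 - p) + p**3 = 0.028, at one third the rate.
    print(sum(m != r for m, r in zip(message, uncoded)) / len(message))
    print(sum(m != r for m, r in zip(message, coded)) / len(message))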
