Abstract
A method for estimating Shannon differential entropy is proposed, based on a second-order expansion of the probability mass around an inspection point with respect to the distance from that point. Polynomial regression with a Poisson error structure is used to estimate the value of the density function at each point. The density estimates at all given data points are then averaged to obtain the entropy estimator. Numerical experiments on various probability distributions show that the proposed estimator performs well.
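To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of this style of estimator for one-dimensional data. It treats the number of sample points falling within radius eps of an inspection point as approximately Poisson with mean n(f(x)·2·eps + b·eps³), fits the two coefficients by maximizing the Poisson likelihood, and averages the resulting negative log-densities. The function name entropy_estimate, the choice of k_max nearest-neighbor radii, and the Nelder-Mead optimizer are all illustrative assumptions, not details from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def entropy_estimate(x, k_max=20):
    """Sketch of an entropy estimator: -mean of log density estimates (1-D)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    c1 = 2.0  # volume of the unit 1-ball, i.e. the interval [-1, 1]
    logs = []
    for xi in x:
        # Distances to the k_max nearest neighbors (skip the zero self-distance).
        dist = np.sort(np.abs(x - xi))[1:k_max + 1]
        counts = np.arange(1, k_max + 1, dtype=float)  # points within radius dist[k-1]
        col1 = n * c1 * dist        # first-order term of the mass expansion
        col2 = n * dist ** 3        # illustrative second-order correction term

        def nll(beta):
            # Poisson negative log-likelihood (up to a constant) with mean
            # mu(eps) = n * (beta0 * 2 * eps + beta1 * eps**3), beta0 ~ f(xi).
            mu = np.clip(beta[0] * col1 + beta[1] * col2, 1e-12, None)
            return np.sum(mu - counts * np.log(mu))

        f0 = counts[-1] / (n * c1 * dist[-1])  # crude k-NN density start value
        res = minimize(nll, x0=np.array([f0, 0.0]), method="Nelder-Mead")
        logs.append(np.log(max(res.x[0], 1e-12)))
    return -float(np.mean(logs))

rng = np.random.default_rng(0)
sample = rng.standard_normal(1000)
print(entropy_estimate(sample))  # should be near 0.5*log(2*pi*e) ~ 1.419
```

For a standard normal sample the estimate should land close to ½log(2πe) ≈ 1.419; the quality of the fit depends on the choice of k_max, which plays the role of a smoothing parameter.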
Acknowledgement
Part of this work was supported by JSPS KAKENHI No. 25120009, 25120011, and 16K16108.
Copyright information
© 2016 Springer International Publishing AG
About this paper
Cite this paper
Hino, H., Akaho, S., Murata, N. (2016). An Entropy Estimator Based on Polynomial Regression with Poisson Error Structure. In: Hirose, A., Ozawa, S., Doya, K., Ikeda, K., Lee, M., Liu, D. (eds) Neural Information Processing. ICONIP 2016. Lecture Notes in Computer Science, vol. 9948. Springer, Cham. https://doi.org/10.1007/978-3-319-46672-9_2
DOI: https://doi.org/10.1007/978-3-319-46672-9_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-46671-2
Online ISBN: 978-3-319-46672-9
eBook Packages: Computer Science, Computer Science (R0)