
Information-Theoretic Lower Bounds to Error Probability for the Models of Noisy Discrete Source Coding and Object Classification

  • SELECTED PAPERS OF THE 8th INTERNATIONAL WORKSHOP “IMAGE MINING. THEORY AND APPLICATIONS”

Abstract

Probabilistic models of noisy discrete source coding and object classification are studied. For these models, the minimal amounts of processed information are defined as functions of a given admissible error probability, and strictly decreasing lower bounds to these functions are constructed. The defined functions are similar to the rate-distortion function known in information theory, and the lower bounds to these functions yield the minimal error probability attainable for a given amount of processed information. Thus, the obtained bounds serve as bifactor fidelity criteria for source coding and object classification tasks.
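The paper's own bounds are not reproduced in this preview. As a rough illustration of the kind of bifactor trade-off the abstract describes, the following Python sketch uses the classical rate-distortion function of a uniform K-ary source under Hamming distortion (a Fano-type relation) and inverts it numerically to obtain the smallest error probability compatible with a given amount of processed information. This is a minimal sketch under those assumptions, not the construction of the paper; the names rate_hamming and min_error_probability are illustrative only.

import numpy as np
from scipy.optimize import brentq

def rate_hamming(eps, K):
    # Rate-distortion function (in bits) of a uniform K-ary source under
    # Hamming distortion, i.e., error probability eps:
    #   R(eps) = log2 K - H_b(eps) - eps*log2(K - 1),  0 <= eps <= (K-1)/K.
    if eps <= 0:
        return np.log2(K)
    if eps >= (K - 1) / K:
        return 0.0
    h_b = -eps * np.log2(eps) - (1 - eps) * np.log2(1 - eps)
    return np.log2(K) - h_b - eps * np.log2(K - 1)

def min_error_probability(rate, K):
    # Smallest error probability achievable when at most `rate` bits of
    # information per object are processed (numerical inverse of R(eps)).
    if rate >= np.log2(K):
        return 0.0
    if rate <= 0:
        return (K - 1) / K
    return brentq(lambda e: rate_hamming(e, K) - rate,
                  1e-12, (K - 1) / K - 1e-12)

# Example: a 16-class task described with 2 bits of processed information.
print(min_error_probability(2.0, 16))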



Author information


Correspondence to M. M. Lange or A. M. Lange.

Ethics declarations

COMPLIANCE WITH ETHICAL STANDARDS

This article is an original work by its authors; it has not been published previously and will not be submitted to other publications unless the PRIA Editorial Board declines to accept it for publication.

Conflict of Interest

The authors declare that they have no conflicts of interest.

Additional information

Mikhail M. Lange, born in 1945, graduated from the Moscow State Power Institute (1968), Cand. Sci. in information theory and technical cybernetics (1981), Leading Researcher in the Federal Scientific Center “Computer Science and Control” of the Russian Academy of Sciences, author of 137 scientific papers.

Andrey M. Lange, born in 1979, graduated from the Bauman Moscow State Technical University (2002), Cand. Sci. in mathematical modeling and numerical methods (2008), Research Assistant in the Federal Scientific Center “Computer Science and Control” of the Russian Academy of Sciences, author of 30 scientific papers.


About this article


Cite this article

Lange, M.M., Lange, A.M. Information-Theoretic Lower Bounds to Error Probability for the Models of Noisy Discrete Source Coding and Object Classification. Pattern Recognit. Image Anal. 32, 570–574 (2022). https://doi.org/10.1134/S105466182203021X
