
Non-linear Inequalities between Predictive and Kolmogorov Complexities

  • Conference paper
  • First Online:
Algorithmic Learning Theory (ALT 2001)

Part of the book series: Lecture Notes in Computer Science ((LNAI,volume 2225))


Abstract

Predictive complexity is a generalization of Kolmogorov complexity. It corresponds to an "optimal" prediction strategy, which gives a lower bound on the ability of any algorithm to predict the elements of a sequence of outcomes. The variety of loss functions makes it interesting to study the relations between the corresponding predictive complexities. The main subject of this paper is non-linear inequalities (with variable coefficients) between the predictive complexity KG(x) of non-logarithmic type and the Kolmogorov complexity K(x) (which is close to the predictive complexity for the logarithmic loss function). From these inequalities we deduce the asymptotic relation $\sup_{x:\,l(x)=n} K(x)/KG(x) \sim \frac{1}{a}\log n$ as $n \to \infty$, where a is a constant and l(x) is the length of the sequence x. An analogous asymptotic result holds for the relative complexities K(x)/l(x) and KG(x)/l(x). To obtain these inequalities we present estimates of the cardinality of the set of all sequences of given predictive complexity.
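
The abstract's central asymptotic relation, restated as a display equation for readability (a sketch that uses only the notation defined in the abstract: l(x) is the length of the sequence x, K(x) is Kolmogorov complexity, KG(x) is the predictive complexity for the given non-logarithmic loss function, and a is a constant):

% Worst-case ratio of Kolmogorov complexity to predictive complexity
% over all sequences x of length n, as stated in the abstract.
\[
  \sup_{x \,:\, l(x) = n} \frac{K(x)}{KG(x)} \;\sim\; \frac{1}{a}\,\log n
  \qquad (n \to \infty).
\]
% Per the abstract, an analogous asymptotic relation holds for the
% relative complexities K(x)/l(x) and KG(x)/l(x).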




Copyright information

© 2001 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Vyugin, M.V., V'yugin, V.V. (2001). Non-linear Inequalities between Predictive and Kolmogorov Complexities. In: Abe, N., Khardon, R., Zeugmann, T. (eds) Algorithmic Learning Theory. ALT 2001. Lecture Notes in Computer Science (LNAI), vol 2225. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45583-3_16


  • DOI: https://doi.org/10.1007/3-540-45583-3_16

  • Published:

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-42875-6

  • Online ISBN: 978-3-540-45583-7

  • eBook Packages: Springer Book Archive
