DOI: 10.5555/2074094.2074112
Article
Free access

Learning by transduction

Published: 24 July 1998
Abstract

    We describe a method for predicting a classification of an object given classifications of the objects in the training set, assuming that the pairs object/classification are generated by an i.i.d. process from a continuous probability distribution. Our method is a modification of Vapnik's support-vector machine; its main novelty is that it gives not only the prediction itself but also a practicable measure of the evidence found in support of that prediction. We also describe a procedure for assigning degrees of confidence to predictions made by the support vector machine. Some experimental results are presented, and possible extensions of the algorithms are discussed.
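The paper builds its confidence machinery on support-vector machines; as an illustration only, the transductive idea in the abstract — try each candidate label for the new object and measure how typical the completed training set then looks — can be sketched with a much simpler nearest-neighbour strangeness score. Everything below (the function names, the 1-NN nonconformity measure, the toy data) is an assumption for illustration, not the algorithm of the paper.

```python
import numpy as np

def nonconformity(X, y, i):
    # Strangeness of example i: distance to its nearest neighbour with the
    # same label divided by distance to its nearest neighbour with a
    # different label. Large values mean the example sits among the
    # "wrong" class. (Illustrative choice, not the paper's SVM-based score.)
    d = np.linalg.norm(X - X[i], axis=1)
    d[i] = np.inf                       # exclude the example itself
    return d[y == y[i]].min() / d[y != y[i]].min()

def transductive_predict(X_train, y_train, x_new, labels):
    # Complete the training set with each candidate label in turn and
    # compute a p-value: the fraction of examples at least as strange as
    # the new one under that completion. The predicted label is the one
    # whose completion makes the new example look most typical.
    p_values = {}
    for lab in labels:
        X = np.vstack([X_train, x_new])
        y = np.append(y_train, lab)
        scores = np.array([nonconformity(X, y, i) for i in range(len(y))])
        p_values[lab] = float(np.mean(scores >= scores[-1]))
    pred = max(p_values, key=p_values.get)   # label with the largest p-value
    return pred, p_values
```

The gap between the largest and second-largest p-value then serves as the kind of evidence measure the abstract describes: a prediction made with a large p-value for one label and small p-values for all others is well supported.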

    References

    [1] C. Cortes and V. Vapnik. Support-vector networks. Machine Learning 20:1-25, 1995.
    [2] A. P. Dawid. Inference, statistical: I. In Encyclopedia of Statistical Sciences (S. Kotz and N. L. Johnson, editors). Wiley, New York, 1983, vol. 4, pp. 89-105.
    [3] D. A. S. Fraser. Sequentially determined statistically equivalent blocks. Ann. Math. Statist. 22:372, 1951.
    [4] A. Gammerman. Machine learning: progress and prospects. Technical Report CSD-TR-96-21, Department of Computer Science, Royal Holloway, University of London, December 1996.
    [5] A. Gammerman and A. R. Thatcher. Bayesian diagnostic probabilities without assuming independence of symptoms. Yearbook of Medical Informatics, 1992, pp. 323-330.
    [6] Y. LeCun, B. Boser, J. S. Denker, D. Henderson, R. E. Howard, W. Hubbard, and L. J. Jackel. Handwritten digit recognition with a back-propagation network. Advances in Neural Information Processing Systems 2. Morgan Kaufmann, 1990, pp. 396-404.
    [7] V. N. Vapnik. The Nature of Statistical Learning Theory. Springer, New York, 1995.
    [8] V. G. Vovk. On the concept of the Bernoulli property. Russ. Math. Surv. 41:247-248, 1986.
    [9] V. G. Vovk. A logic of probability, with application to the foundations of statistics (with discussion). J. R. Statist. Soc. B 55:317-351, 1993.
    [10] V. G. Vovk and V. V. V'yugin. On the empirical validity of the Bayesian method. J. R. Statist. Soc. B 55:253-266, 1993.



    Published In

    UAI'98: Proceedings of the Fourteenth Conference on Uncertainty in Artificial Intelligence
    July 1998
    538 pages
    ISBN: 155860555X

    Sponsors

    • NEC
    • Hugin Expert A/S
    • Information Extraction and Transportation
    • Microsoft Research
    • AT&T Labs Research

    Publisher

    Morgan Kaufmann Publishers Inc.

    San Francisco, CA, United States



    Article Metrics

    • Downloads (last 12 months): 450
    • Downloads (last 6 weeks): 43

    Cited By

    • (2023) Learning functional transduction. Proceedings of the 37th International Conference on Neural Information Processing Systems, pp. 73852-73865. DOI: 10.5555/3666122.3669352. Online publication date: 10-Dec-2023.
    • (2022) PAC prediction sets for meta-learning. Proceedings of the 36th International Conference on Neural Information Processing Systems, pp. 37920-37931. DOI: 10.5555/3600270.3603018. Online publication date: 28-Nov-2022.
    • (2021) Lost in Transduction: Transductive Transfer Learning in Text Classification. ACM Transactions on Knowledge Discovery from Data 16(1):1-21. DOI: 10.1145/3453146. Online publication date: 20-Jul-2021.
    • (2020) Test-time training with self-supervision for generalization under distribution shifts. Proceedings of the 37th International Conference on Machine Learning, pp. 9229-9248. DOI: 10.5555/3524938.3525794. Online publication date: 13-Jul-2020.
    • (2020) Negative sampling in semi-supervised learning. Proceedings of the 37th International Conference on Machine Learning, pp. 1704-1714. DOI: 10.5555/3524938.3525097. Online publication date: 13-Jul-2020.
    • (2019) MixMatch. Proceedings of the 33rd International Conference on Neural Information Processing Systems, pp. 5049-5059. DOI: 10.5555/3454287.3454741. Online publication date: 8-Dec-2019.
    • (2019) Multi-Modal Curriculum Learning over Graphs. ACM Transactions on Intelligent Systems and Technology 10(4):1-25. DOI: 10.1145/3322122. Online publication date: 20-Jul-2019.
    • (2019) Synthesis and machine learning for heterogeneous extraction. Proceedings of the 40th ACM SIGPLAN Conference on Programming Language Design and Implementation, pp. 301-315. DOI: 10.1145/3314221.3322485. Online publication date: 8-Jun-2019.
    • (2019) Active inductive logic programming for code search. Proceedings of the 41st International Conference on Software Engineering, pp. 292-303. DOI: 10.1109/ICSE.2019.00044. Online publication date: 25-May-2019.
    • (2019) Efficient Venn predictors using random forests. Machine Learning 108(3):535-550. DOI: 10.1007/s10994-018-5753-x. Online publication date: 1-Mar-2019.
