Predicting good probabilities with supervised learning

Published: 07 August 2005

Abstract

We examine the relationship between the predictions made by different learning algorithms and true posterior probabilities. We show that maximum-margin methods such as boosted trees and boosted stumps push probability mass away from 0 and 1, yielding a characteristic sigmoid-shaped distortion in the predicted probabilities. Models such as Naive Bayes, which make unrealistic independence assumptions, push probabilities toward 0 and 1. Other models, such as neural nets and bagged trees, do not have these biases and predict well-calibrated probabilities. We experiment with two ways of correcting the biased probabilities predicted by some learning methods: Platt Scaling and Isotonic Regression. We qualitatively examine what kinds of distortions these calibration methods are suited to, and quantitatively examine how much data they need to be effective. The empirical results show that after calibration boosted trees, random forests, and SVMs predict the best probabilities.
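The two calibration methods the abstract compares can be sketched in a few lines of NumPy. This is a hypothetical illustration, not the code used in the paper: `platt_scale` fits Platt's sigmoid 1/(1+exp(-(a·s+b))) to held-out scores by gradient descent on the log loss, and `isotonic_fit` runs the pool-adjacent-violators algorithm underlying isotonic regression. Both function names and hyperparameters are assumptions made for the example.

```python
import numpy as np

def platt_scale(scores, labels, lr=0.1, iters=2000):
    """Platt scaling: fit sigmoid(a*score + b) to held-out labels
    by gradient descent on the average log loss."""
    a, b = 1.0, 0.0
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(a * scores + b)))
        grad = p - labels                       # dLoss/dlogit per example
        a -= lr * np.mean(grad * scores)
        b -= lr * np.mean(grad)
    return lambda s: 1.0 / (1.0 + np.exp(-(a * s + b)))

def isotonic_fit(scores, labels):
    """Isotonic regression via pool-adjacent-violators: the best
    monotone non-decreasing fit of labels against scores."""
    order = np.argsort(scores)
    vals = list(labels[order].astype(float))    # block means
    wts = list(np.ones(len(vals)))              # block sizes
    i = 0
    while i < len(vals) - 1:
        if vals[i] > vals[i + 1]:               # violation: merge blocks
            merged = (vals[i] * wts[i] + vals[i + 1] * wts[i + 1]) \
                     / (wts[i] + wts[i + 1])
            vals[i:i + 2] = [merged]
            wts[i:i + 2] = [wts[i] + wts[i + 1]]
            i = max(i - 1, 0)                   # re-check previous block
        else:
            i += 1
    # expand blocks back to one fitted value per example
    fitted = np.repeat(vals, np.array(wts, dtype=int))
    xs = scores[order]
    return lambda s: np.interp(s, xs, fitted)
```

In line with the paper's setup, the calibration functions should be fit on data held out from training; Platt Scaling imposes a sigmoid shape (matching the distortion of max-margin methods), while Isotonic Regression only assumes monotonicity and so needs more calibration data.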



    Published In

    ICML '05: Proceedings of the 22nd international conference on Machine learning
    August 2005
    1113 pages
    ISBN:1595931805
    DOI:10.1145/1102351
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Acceptance Rates

    Overall Acceptance Rate 140 of 548 submissions, 26%

