
An improved GLMNET for l1-regularized logistic regression

Published: 21 August 2011

Abstract

GLMNET, proposed by Friedman et al., is an algorithm for generalized linear models with elastic-net regularization. It has been widely applied to solve l1-regularized logistic regression. However, recent experiments indicate that the existing GLMNET implementation may not be stable for large-scale problems. In this paper, we propose an improved GLMNET to address several theoretical and implementation issues. In particular, as a Newton-type method, GLMNET achieves fast local convergence but may fail to quickly obtain a useful solution. By carefully adjusting the effort spent on each iteration, our method is efficient whether the optimization problem is solved loosely or strictly. Experiments demonstrate that the improved GLMNET is more efficient than a state-of-the-art coordinate descent method.
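For readers unfamiliar with the problem class, the objective discussed in the abstract is min_w sum_i log(1 + exp(-y_i x_i^T w)) + lambda ||w||_1. The sketch below solves it with a deliberately simple first-order method, proximal gradient descent (ISTA) with soft-thresholding; this is *not* the paper's improved GLMNET (a Newton-type method), and the synthetic data, step size, and parameter names are illustrative assumptions only:

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: the proximal map of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def l1_logreg_ista(X, y, lam=1.0, step=0.01, iters=500):
    """Minimize sum_i log(1 + exp(-y_i x_i^T w)) + lam * ||w||_1 by ISTA.

    A plain proximal-gradient baseline, not the paper's Newton-type method.
    `step` must be at most 1/L, where L is the Lipschitz constant of the
    logistic-loss gradient (roughly ||X||_2^2 / 4).
    """
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        margins = y * (X @ w)
        # Gradient of the logistic loss: -sum_i y_i x_i / (1 + exp(m_i))
        grad = -(X.T @ (y / (1.0 + np.exp(margins))))
        # Gradient step on the smooth part, proximal step on the l1 part
        w = soft_threshold(w - step * grad, step * lam)
    return w

if __name__ == "__main__":
    # Tiny synthetic problem: only the first 3 of 10 features are relevant.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 10))
    true_w = np.zeros(10)
    true_w[:3] = [2.0, -1.5, 1.0]
    y = np.sign(X @ true_w + 0.1 * rng.standard_normal(200))
    w = l1_logreg_ista(X, y, lam=2.0)
    print("learned weights:", np.round(w, 3))
```

The soft-thresholding step is what produces exact zeros in the solution, which is the point of the l1 penalty; second-order methods like GLMNET keep the same proximal structure but replace the gradient step with a quadratic (Newton) model of the loss.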

References

[1] J. Friedman, T. Hastie, and R. Tibshirani, "Regularization paths for generalized linear models via coordinate descent," Journal of Statistical Software, vol. 33, no. 1, pp. 1--22, 2010.
[2] A. Genkin, D. D. Lewis, and D. Madigan, "Large-scale Bayesian logistic regression for text categorization," Technometrics, vol. 49, no. 3, pp. 291--304, 2007.
[3] K. Koh, S.-J. Kim, and S. Boyd, "An interior-point method for large-scale l1-regularized logistic regression," Journal of Machine Learning Research, vol. 8, pp. 1519--1555, 2007.
[4] G. Andrew and J. Gao, "Scalable training of L1-regularized log-linear models," in Proceedings of the Twenty-Fourth International Conference on Machine Learning (ICML), 2007.
[5] J. Liu, J. Chen, and J. Ye, "Large-scale sparse logistic regression," in Proceedings of the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 547--556, 2009.
[6] M. Schmidt, G. Fung, and R. Rosales, "Fast optimization methods for l1 regularization: A comparative study and two new approaches," in Proceedings of the European Conference on Machine Learning, pp. 286--297, 2007.
[7] G.-X. Yuan, K.-W. Chang, C.-J. Hsieh, and C.-J. Lin, "A comparison of optimization methods and software for large-scale l1-regularized linear classification," Journal of Machine Learning Research, vol. 11, pp. 3183--3234, 2010.
[8] P. Tseng and S. Yun, "A coordinate gradient descent method for nonsmooth separable minimization," Mathematical Programming, vol. 117, pp. 387--423, 2009.
[9] S. Yun and K.-C. Toh, "A coordinate gradient descent method for l1-regularized convex minimization," Computational Optimization and Applications, vol. 48, no. 2, pp. 273--307, 2011.
[10] K.-W. Chang, C.-J. Hsieh, and C.-J. Lin, "Coordinate descent method for large-scale L2-loss linear SVM," Journal of Machine Learning Research, vol. 9, pp. 1369--1398, 2008.
[11] G.-X. Yuan, C.-H. Ho, and C.-J. Lin, "An improved GLMNET for l1-regularized logistic regression and support vector machines," technical report, National Taiwan University, 2011.
[12] H.-F. Yu, H.-Y. Lo, H.-P. Hsieh, J.-K. Lou, T. G. McKenzie, J.-W. Chou, P.-H. Chung, C.-H. Ho, C.-F. Chang, Y.-H. Wei, J.-Y. Weng, E.-S. Yan, C.-W. Chang, T.-T. Kuo, Y.-C. Lo, P. T. Chang, C. Po, C.-Y. Wang, Y.-H. Huang, C.-W. Hung, Y.-X. Ruan, Y.-S. Lin, S.-D. Lin, H.-T. Lin, and C.-J. Lin, "Feature engineering and classifier ensemble for KDD Cup 2010," in JMLR Workshop and Conference Proceedings, 2011. To appear.
[13] R.-E. Fan, K.-W. Chang, C.-J. Hsieh, X.-R. Wang, and C.-J. Lin, "LIBLINEAR: A library for large linear classification," Journal of Machine Learning Research, vol. 9, pp. 1871--1874, 2008.



Published In

KDD '11: Proceedings of the 17th ACM SIGKDD international conference on Knowledge discovery and data mining
August 2011
1446 pages
ISBN:9781450308137
DOI:10.1145/2020408
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. l1 regularization
  2. linear classification
  3. logistic regression

Qualifiers

  • Research-article

Conference

KDD '11

Acceptance Rates

Overall Acceptance Rate 1,133 of 8,635 submissions, 13%

Cited By

  • (2024) "High-Dimensional Distributed Sparse Classification with Scalable Communication-Efficient Global Updates," Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 2037--2047. DOI: 10.1145/3637528.3672038. Online publication date: 25-Aug-2024.
  • (2024) "Communication-Efficient Regret-Optimal Distributed Online Convex Optimization," IEEE Transactions on Parallel and Distributed Systems, vol. 35, no. 11, pp. 2270--2283. DOI: 10.1109/TPDS.2024.3403883. Online publication date: Nov-2024.
  • (2024) "LF-Transformer: Latent Factorizer Transformer for Tabular Learning," IEEE Access, vol. 12, pp. 10690--10698. DOI: 10.1109/ACCESS.2024.3354972. Online publication date: 2024.
  • (2024) "Policy Framework for Realizing Net-Zero Emission in Smart Cities," Archives of Computational Methods in Engineering. DOI: 10.1007/s11831-024-10131-5. Online publication date: 30-Apr-2024.
  • (2024) "An inexact regularized proximal Newton method without line search," Computational Optimization and Applications. DOI: 10.1007/s10589-024-00600-9. Online publication date: 16-Aug-2024.
  • (2024) "Global climate change-driven impacts on the Asian distribution of Limassolla leafhoppers, with implications for biological and environmental conservation," Ecology and Evolution, vol. 14, no. 7. DOI: 10.1002/ece3.70003. Online publication date: 18-Jul-2024.
  • (2023) "A Smoothing Conjugate Gradient Algorithm for Sparse Logistic Regression Problems," Advances in Applied Mathematics, vol. 12, no. 8, pp. 3665--3683. DOI: 10.12677/AAM.2023.128364. Online publication date: 2023.
  • (2023) "SaaSyML: Software as a Service for Machine Learning On-board the OPS-SAT Spacecraft," 2023 IEEE Aerospace Conference, pp. 1--9. DOI: 10.1109/AERO55745.2023.10115531. Online publication date: 4-Mar-2023.
  • (2023) "Striving for Sparsity: On Exact and Approximate Solutions in Regularized Structural Equation Models," Structural Equation Modeling: A Multidisciplinary Journal, pp. 1--18. DOI: 10.1080/10705511.2023.2189070. Online publication date: 11-May-2023.
  • (2023) "Diabetes detection based on machine learning and deep learning approaches," Multimedia Tools and Applications, vol. 83, no. 8, pp. 24153--24185. DOI: 10.1007/s11042-023-16407-5. Online publication date: 10-Aug-2023.
