
Bayesian regression with input noise for high dimensional data

Published: 25 June 2006
DOI: 10.1145/1143844.1143962

Abstract

This paper examines high dimensional regression with noise-contaminated input and output data. Goals of such learning problems include optimal prediction with noiseless query points and optimal system identification. As a first step, we focus on linear regression methods, since these can be easily cast into nonlinear learning problems with locally weighted learning approaches. Standard linear regression algorithms generate biased regression estimates if input noise is present and suffer numerically when the data contains redundancy and irrelevancy. Inspired by Factor Analysis Regression, we develop a variational Bayesian algorithm that is robust to ill-conditioned data, automatically detects relevant features, and identifies input and output noise -- all in a computationally efficient way. We demonstrate the effectiveness of our techniques on synthetic data and on a system identification task for a rigid body dynamics model of a robotic vision head. Our algorithm performs 10 to 70% better than previously suggested methods.
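The abstract's central point -- that standard linear regression is biased when the inputs themselves are noisy -- is easy to reproduce numerically. Below is a minimal NumPy sketch on synthetic data with made-up noise levels; it shows the attenuation bias of ordinary least squares on noise-contaminated inputs and, for contrast, a classical total-least-squares estimate, a standard errors-in-variables remedy. It is not an implementation of the paper's variational Bayesian algorithm.

```python
# Minimal sketch (NOT the paper's algorithm): illustrates the input-noise bias
# the abstract describes. All names and noise levels are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, d = 5000, 3
b_true = np.array([1.0, -2.0, 0.5])            # true regression coefficients

t = rng.normal(size=(n, d))                     # noiseless inputs (unobserved)
y = t @ b_true + 0.5 * rng.normal(size=n)       # outputs with additive noise
x = t + 0.5 * rng.normal(size=(n, d))           # observed inputs, also noisy

# Ordinary least squares on the noisy inputs: coefficients shrink toward zero
# (attenuation bias), roughly by the factor var(t) / (var(t) + var(input noise)).
b_ols, *_ = np.linalg.lstsq(x, y, rcond=None)

# Classical total least squares: smallest right singular vector of [X | y].
# It is consistent here because input and output noise variances are equal.
_, _, vt = np.linalg.svd(np.hstack([x, y[:, None]]), full_matrices=False)
v = vt[-1]
b_tls = -v[:d] / v[d]

print("true:", b_true)
print("OLS :", np.round(b_ols, 3))   # biased toward zero
print("TLS :", np.round(b_tls, 3))   # close to the true coefficients
```

In this toy setting the SVD-based total-least-squares estimate recovers the true coefficients while OLS shrinks them by roughly the noise-to-signal factor; per the abstract, the paper's variational Bayesian approach goes further by also estimating the input and output noise and by detecting redundant or irrelevant input dimensions.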



Published In

ICML '06: Proceedings of the 23rd international conference on Machine learning
June 2006
1154 pages
ISBN: 1595933832
DOI: 10.1145/1143844

Publisher

Association for Computing Machinery

New York, NY, United States


Qualifiers

  • Article

Acceptance Rates

ICML '06 Paper Acceptance Rate 140 of 548 submissions, 26%;
Overall Acceptance Rate 140 of 548 submissions, 26%

Article Metrics

  • Downloads (Last 12 months): 20
  • Downloads (Last 6 weeks): 1
Reflects downloads up to 10 Nov 2024


Cited By

  • (2020) Hybrid Gaussian Process Inference Model for Construction Management Decision Making. International Journal of Information Technology & Decision Making, 19(4), 1015-1036. DOI: 10.1142/S0219622020500212. Online publication date: 17-Jul-2020.
  • (2013) A General Self-Adaptive Reputation System Based on the Kalman Feedback. Proceedings of the 2013 International Conference on Service Sciences, 7-12. DOI: 10.1109/ICSS.2013.28. Online publication date: 11-Apr-2013.
  • (2013) Trust Model for Cloud Systems with Self Variance Evaluation. Security, Privacy and Trust in Cloud Systems, 283-309. DOI: 10.1007/978-3-642-38586-5_10. Online publication date: 4-Sep-2013.
  • (2012) RLM. IEEE Transactions on Services Computing, 5(1), 131-143. DOI: 10.1109/TSC.2010.56. Online publication date: 1-Jan-2012.
  • (2012) Uncertainty Analysis of Neural-Network-Based Aerosol Retrieval. IEEE Transactions on Geoscience and Remote Sensing, 50(2), 409-414. DOI: 10.1109/TGRS.2011.2166120. Online publication date: Feb-2012.
  • (2010) Dimensionality Estimation, Manifold Learning and Function Approximation using Tensor Voting. The Journal of Machine Learning Research, 11, 411-450. DOI: 10.5555/1756006.1756018. Online publication date: 1-Mar-2010.
  • (2010) Efficient learning and feature selection in high-dimensional regression. Neural Computation, 22(4), 831-886. DOI: 10.1162/neco.2009.02-08-702. Online publication date: 1-Apr-2010.
  • (2009) Privacy-Preserving Data Publishing. Foundations and Trends in Databases, 2(1-2), 1-167. DOI: 10.1561/1900000008. Online publication date: 1-Jan-2009.
  • (2009) A reputation inference model based on linear hidden markov process. 2009 ISECS International Colloquium on Computing, Communication, Control, and Management, 354-357. DOI: 10.1109/CCCM.2009.5270424. Online publication date: Aug-2009.
  • (2007) A Kalman filter for robust outlier detection. 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems, 1514-1519. DOI: 10.1109/IROS.2007.4399158. Online publication date: Oct-2007.
