Abstract
To achieve good generalization with neural networks, overfitting must be controlled. Weight penalty factors are one common means of providing this control, but using them creates the additional search problem of finding the optimal penalty factors. MacKay [5] proposed an approximate Bayesian framework for training neural networks in which penalty factors are treated as hyperparameters and found by an iterative search. For classification networks trained with cross-entropy error, however, this search is slow and unstable, and it is not obvious how to improve it. This paper describes and compares several strategies for controlling the search; some of them greatly improve its speed and stability. Test runs on a range of tasks are described.
Previously published in: Orr, G.B. and Müller, K.-R. (Eds.): LNCS 1524, ISBN 978-3-540-65311-0 (1998).
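The iterative search referred to in the abstract alternates between retraining the weights and re-estimating each penalty factor from the evidence approximation. As a rough illustration only (this is not code from the chapter; the NumPy framing and all names are assumptions of this sketch), one evidence-framework re-estimation of a single penalty factor alpha might look like this:

    import numpy as np

    def update_penalty(alpha, data_hessian_eigvals, map_weights):
        # One re-estimation step in MacKay's evidence framework [5]:
        #     alpha <- gamma / (2 * E_W)
        # where gamma is the effective number of well-determined
        # parameters. In the full search this step alternates with
        # retraining the weights under the new penalty.
        e_w = 0.5 * np.sum(map_weights ** 2)            # weight-penalty term E_W
        gamma = np.sum(data_hessian_eigvals /
                       (data_hessian_eigvals + alpha))  # effective parameter count
        return gamma / (2.0 * e_w)

The chapter's subject is how to control this alternation when the error is cross-entropy, where the plain iteration is slow and unstable.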
References
Bishop, C.: Neural Networks for Pattern Recognition. Oxford University Press (1995)
Buntine, W.L., Weigend, A.S.: Computing second derivatives in feed-forward networks: A review. IEEE Transactions on Neural Networks 5(3), 480–488 (1994)
Hastie, T.J., Tibshirani, R.J.: Generalized additive models. Chapman and Hall, London (1990)
Hwang, J.N., Lay, S.-R., Maechler, M., Martin, R.D., Schimert, J.: Regression modeling in back-propagation and projection pursuit learning. IEEE Transactions on Neural Networks 5(3), 342–353 (1994)
MacKay, D.J.C.: A practical Bayesian framework for backpropagation networks. Neural Computation 4(3), 448–472 (1992)
MacKay, D.J.C.: Bayesian methods for backpropagation networks. In: Domany, E., van Hemmen, J.L., Schulten, K. (eds.) Models of Neural Networks III, ch. 6, Springer, New York (1994)
MacKay, D.J.C.: Probable networks and plausible predictions - a review of practical Bayesian methods for supervised neural networks. Network: Computation in Neural Systems 6, 469–505 (1995)
MacKay, D.J.C.: Bayesian non-linear modelling for the 1993 energy prediction competition. In: Heidbreder, G. (ed.) Maximum Entropy and Bayesian Methods, Santa Barbara 1993, pp. 221–234. Kluwer, Dordrecht (1996)
Neal, R.M.: Monte Carlo implementation of Gaussian process models for Bayesian regression and classification. Technical Report TR9702, Dept. of Statistics, University of Toronto (1997). Software: http://www.cs.utoronto.ca/~radford/
Neal, R.M.: Bayesian Learning for Neural Networks. Springer, New York (1996)
Plate, T., Bert, J., Grace, J., Band, P.: A comparison between neural networks and other statistical techniques for modeling the relationship between tobacco and alcohol and cancer. In: Mozer, M.C., Jordan, M.I., Petsche, T. (eds.) Advances in Neural Information Processing Systems 9 (NIPS 1996). MIT Press (1997)
Roosen, C., Hastie, T.: Logistic response projection pursuit. Technical report, AT&T Bell Laboratories (1993)
Thodberg, H.H.: A review of Bayesian neural networks with an application to near infrared spectroscopy. IEEE Transactions on Neural Networks 7(1), 56–72 (1996)
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this chapter
Plate, T. (2012). Controlling the Hyperparameter Search in MacKay’s Bayesian Neural Network Framework. In: Montavon, G., Orr, G.B., Müller, K.-R. (eds.) Neural Networks: Tricks of the Trade. Lecture Notes in Computer Science, vol 7700. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-35289-8_7
DOI: https://doi.org/10.1007/978-3-642-35289-8_7
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-35288-1
Online ISBN: 978-3-642-35289-8