Abstract
Designs of micro-electro-mechanical devices need to be robust against fluctuations in mass production. Computer experiments with tens of parameters are used to explore the behavior of the system and to compute sensitivity measures as expectations over the input distribution. Monte Carlo methods are a simple approach to estimating these integrals, but they are infeasible when the models are computationally expensive. With a Gaussian process prior, expensive simulation runs can be saved: this Bayesian quadrature allows inputs to be selected actively where a simulation run promises to be most valuable, so the number of simulation runs can be reduced further.
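As a minimal sketch of the Bayesian quadrature idea (not the paper's implementation): with a zero-mean GP prior, a squared-exponential kernel, and a standard normal input density, the kernel means are analytic and the integral estimate is a weighted sum of the observed simulator outputs. The function name `bq_estimate` and the one-dimensional toy integrand are illustrative choices.

```python
import numpy as np

def bq_estimate(x, y, ell=1.0, jitter=1e-6):
    """Bayesian quadrature estimate of E[f(X)] for X ~ N(0, 1).

    Assumes a zero-mean GP prior with squared-exponential kernel
    k(a, b) = exp(-(a - b)^2 / (2 ell^2)). For a standard normal
    input density, the kernel mean z_i = E[k(X, x_i)] is analytic,
    and the estimate is z^T K^{-1} y.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    # Gram matrix of the evaluated inputs, with jitter for stability
    K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell**2)
    K += jitter * np.eye(len(x))
    # z_i = \int k(x, x_i) N(x; 0, 1) dx  (closed form for the SE kernel)
    s2 = ell**2 + 1.0
    z = np.sqrt(ell**2 / s2) * np.exp(-0.5 * x**2 / s2)
    return z @ np.linalg.solve(K, y)

# Toy check: E[X^2] = 1 for X ~ N(0, 1); nine "simulation runs" suffice
xs = np.linspace(-3.0, 3.0, 9)
est = bq_estimate(xs, xs**2)
```

The estimate converges much faster in the number of function evaluations than plain Monte Carlo here, which is the motivation for using it with expensive simulators.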
We present an active learning scheme for sensitivity analysis which is rigorously derived from the corresponding Bayesian expected loss. On three fully featured, high-dimensional physical models of electro-mechanical sensors, we show that the learning rate of the active scheme is significantly better than that of passive learning.
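A key property that makes such active schemes cheap is that, for a fixed GP prior, the posterior variance of the quadrature estimate depends only on the input locations, not on the simulator outputs, so the next run can be chosen before it is executed. The sketch below uses a generic greedy variance-reduction criterion as a stand-in for the paper's expected-loss derivation; `integral_variance` and `next_input` are hypothetical names, and the setting is the same one-dimensional SE-kernel/standard-normal model as above.

```python
import numpy as np

def integral_variance(x, ell=1.0, jitter=1e-6):
    """Posterior variance of the BQ integral estimate for inputs x.

    Var = c - z^T K^{-1} z, where c = E[k(X, X')] over independent
    X, X' ~ N(0, 1). Both terms are analytic for the SE kernel and,
    crucially, depend only on where the simulator was run, not on f.
    """
    x = np.asarray(x, float)
    K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell**2)
    K += jitter * np.eye(len(x))
    s2 = ell**2 + 1.0
    z = np.sqrt(ell**2 / s2) * np.exp(-0.5 * x**2 / s2)
    c = np.sqrt(ell**2 / (ell**2 + 2.0))  # E[k(X, X')], closed form
    return c - z @ np.linalg.solve(K, z)

def next_input(x, candidates, ell=1.0):
    """Greedy active choice: the candidate that most shrinks the variance."""
    scores = [integral_variance(np.append(x, c), ell) for c in candidates]
    return candidates[int(np.argmin(scores))]

# With one run at 0, the scheme proposes a point that trades off input
# density against distance from the existing run
x = np.array([0.0])
cands = np.linspace(-3.0, 3.0, 61)
x_new = next_input(x, cands)
```

Passive designs (e.g. Latin hypercube or low-discrepancy sequences) fix the inputs in advance; the active scheme instead spends each run where the expected loss of the current estimate is reduced most.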
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Pfingsten, T. (2006). Bayesian Active Learning for Sensitivity Analysis. In: Fürnkranz, J., Scheffer, T., Spiliopoulou, M. (eds) Machine Learning: ECML 2006. ECML 2006. Lecture Notes in Computer Science(), vol 4212. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11871842_35
DOI: https://doi.org/10.1007/11871842_35
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-45375-8
Online ISBN: 978-3-540-46056-5