Bayesian Calibration of Computer Models with Informative Failures

06/13/2020
by Peter W. Marcy, et al.

There are many practical difficulties in calibrating computer models to experimental data. One such complication is that certain combinations of the calibration inputs can cause the code to output data lacking fundamental properties, or even to produce no output at all. In many cases, researchers want or need to exclude the possibility of these "failures" within their analyses. We propose a Bayesian (meta-)model in which the posterior distribution for the calibration parameters naturally excludes regions of the input space corresponding to failed runs. That is, we define a statistical selection model that rigorously couples the otherwise disjoint problems of binary classification and computer model calibration. We demonstrate the methodology using data from a carbon capture experiment in which the numerics of the computational fluid dynamics are prone to instability.
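As a rough sketch of the kind of coupling the abstract describes (the exact formulation appears in the full paper; the success indicator s and the classifier probability below are assumptions made here for illustration), write s = 1 for the event that the simulator runs successfully at calibration input \theta. A selection-model posterior then conditions on success,

    \[ p(\theta \mid y,\, s = 1) \;\propto\; p(y \mid \theta)\, \pi(\theta)\, P(s = 1 \mid \theta), \]

so that P(s = 1 | \theta), estimated for instance by a probabilistic binary classifier fit to the observed pattern of successful and failed runs, downweights regions of the input space where failures are likely and removes posterior mass where success is effectively impossible.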
