
Comments on ‘Bayesian calibration of mathematical models’ by M. C. Kennedy and A. O’Hagan.

Research output: Contribution to journal › Journal article › peer-review

Published

Standard

Comments on ‘Bayesian calibration of mathematical models’ by M. C. Kennedy and A. O’Hagan. / Wynn, H. P.; Brown, P. J.; Anderson, C.; Rougier, J. C.; Diggle, Peter J.; Goldstein, M.; Kendall, W. S.; Craig, P.

In: Journal of the Royal Statistical Society: Series B (Statistical Methodology), Vol. 63, No. 3, 2001, p. 450-464.


Harvard

Wynn, HP, Brown, PJ, Anderson, C, Rougier, JC, Diggle, PJ, Goldstein, M, Kendall, WS & Craig, P 2001, 'Comments on ‘Bayesian calibration of mathematical models’ by M. C. Kennedy and A. O’Hagan.', Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol. 63, no. 3, pp. 450-464. https://doi.org/10.1111/1467-9868.00294

APA

Wynn, H. P., Brown, P. J., Anderson, C., Rougier, J. C., Diggle, P. J., Goldstein, M., Kendall, W. S., & Craig, P. (2001). Comments on ‘Bayesian calibration of mathematical models’ by M. C. Kennedy and A. O’Hagan. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 63(3), 450-464. https://doi.org/10.1111/1467-9868.00294

Vancouver

Wynn HP, Brown PJ, Anderson C, Rougier JC, Diggle PJ, Goldstein M et al. Comments on ‘Bayesian calibration of mathematical models’ by M. C. Kennedy and A. O’Hagan. Journal of the Royal Statistical Society: Series B (Statistical Methodology). 2001;63(3):450-464. https://doi.org/10.1111/1467-9868.00294

Author

Wynn, H. P. ; Brown, P. J. ; Anderson, C. ; Rougier, J. C. ; Diggle, Peter J. ; Goldstein, M. ; Kendall, W. S. ; Craig, P. / Comments on ‘Bayesian calibration of mathematical models’ by M. C. Kennedy and A. O’Hagan. In: Journal of the Royal Statistical Society: Series B (Statistical Methodology). 2001 ; Vol. 63, No. 3. pp. 450-464.

Bibtex

@article{f27db69342414079af0aa2467f41830d,
title = "Comments on {\textquoteleft}Bayesian calibration of mathematical models{\textquoteright} by M. C. Kennedy and A. O{\textquoteright}Hagan.",
abstract = "We consider prediction and uncertainty analysis for systems which are approximated using complex mathematical models. Such models, implemented as computer codes, are often generic in the sense that by a suitable choice of some of the model's input parameters the code can be used to predict the behaviour of the system in a variety of specific applications. However, in any specific application the values of necessary parameters may be unknown. In this case, physical observations of the system in the specific context are used to learn about the unknown parameters. The process of fitting the model to the observed data by adjusting the parameters is known as calibration. Calibration is typically effected by ad hoc fitting, and after calibration the model is used, with the fitted input values, to predict the future behaviour of the system. We present a Bayesian calibration technique which improves on this traditional approach in two respects. First, the predictions allow for all sources of uncertainty, including the remaining uncertainty over the fitted parameters. Second, they attempt to correct for any inadequacy of the model which is revealed by a discrepancy between the observed data and the model predictions from even the best-fitting parameter values. The method is illustrated by using data from a nuclear radiation release at Tomsk, and from a more complex simulated nuclear accident exercise.",
keywords = "Calibration, Computer experiments, Deterministic models, Gaussian process, Interpolation, Model inadequacy, Sensitivity analysis, Uncertainty analysis",
author = "Wynn, {H. P.} and Brown, {P. J.} and C. Anderson and Rougier, {J. C.} and Diggle, {Peter J.} and M. Goldstein and Kendall, {W. S.} and P. Craig",
year = "2001",
doi = "10.1111/1467-9868.00294",
language = "English",
volume = "63",
pages = "450--464",
journal = "Journal of the Royal Statistical Society: Series B (Statistical Methodology)",
issn = "1369-7412",
publisher = "Wiley-Blackwell",
number = "3",
}

RIS

TY - JOUR

T1 - Comments on ‘Bayesian calibration of mathematical models’ by M. C. Kennedy and A. O’Hagan.

AU - Wynn, H. P.

AU - Brown, P. J.

AU - Anderson, C.

AU - Rougier, J. C.

AU - Diggle, Peter J.

AU - Goldstein, M.

AU - Kendall, W. S.

AU - Craig, P.

PY - 2001

Y1 - 2001

N2 - We consider prediction and uncertainty analysis for systems which are approximated using complex mathematical models. Such models, implemented as computer codes, are often generic in the sense that by a suitable choice of some of the model's input parameters the code can be used to predict the behaviour of the system in a variety of specific applications. However, in any specific application the values of necessary parameters may be unknown. In this case, physical observations of the system in the specific context are used to learn about the unknown parameters. The process of fitting the model to the observed data by adjusting the parameters is known as calibration. Calibration is typically effected by ad hoc fitting, and after calibration the model is used, with the fitted input values, to predict the future behaviour of the system. We present a Bayesian calibration technique which improves on this traditional approach in two respects. First, the predictions allow for all sources of uncertainty, including the remaining uncertainty over the fitted parameters. Second, they attempt to correct for any inadequacy of the model which is revealed by a discrepancy between the observed data and the model predictions from even the best-fitting parameter values. The method is illustrated by using data from a nuclear radiation release at Tomsk, and from a more complex simulated nuclear accident exercise.

AB - We consider prediction and uncertainty analysis for systems which are approximated using complex mathematical models. Such models, implemented as computer codes, are often generic in the sense that by a suitable choice of some of the model's input parameters the code can be used to predict the behaviour of the system in a variety of specific applications. However, in any specific application the values of necessary parameters may be unknown. In this case, physical observations of the system in the specific context are used to learn about the unknown parameters. The process of fitting the model to the observed data by adjusting the parameters is known as calibration. Calibration is typically effected by ad hoc fitting, and after calibration the model is used, with the fitted input values, to predict the future behaviour of the system. We present a Bayesian calibration technique which improves on this traditional approach in two respects. First, the predictions allow for all sources of uncertainty, including the remaining uncertainty over the fitted parameters. Second, they attempt to correct for any inadequacy of the model which is revealed by a discrepancy between the observed data and the model predictions from even the best-fitting parameter values. The method is illustrated by using data from a nuclear radiation release at Tomsk, and from a more complex simulated nuclear accident exercise.

KW - Calibration

KW - Computer experiments

KW - Deterministic models

KW - Gaussian process

KW - Interpolation

KW - Model inadequacy

KW - Sensitivity analysis

KW - Uncertainty analysis

U2 - 10.1111/1467-9868.00294

DO - 10.1111/1467-9868.00294

M3 - Journal article

VL - 63

SP - 450

EP - 464

JO - Journal of the Royal Statistical Society: Series B (Statistical Methodology)

JF - Journal of the Royal Statistical Society: Series B (Statistical Methodology)

SN - 1369-7412

IS - 3

ER -
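The abstract above describes calibration of a computer model to field data, allowing for both parameter uncertainty and model discrepancy. As a loose illustration only (not the authors' method, which places Gaussian-process priors on the model and the discrepancy), the following toy sketch calibrates a single parameter by a grid posterior, with a crude variance inflation standing in for the discrepancy term; the simulator, data, and discrepancy budget are all invented for this example:

```python
import numpy as np

# Toy calibration in the spirit of Kennedy & O'Hagan (2001):
# observations y(x) = f(x, theta) + delta(x) + noise, where f is the
# "computer model", delta is model discrepancy, and theta is unknown.
rng = np.random.default_rng(0)

def model(x, theta):
    # Hypothetical cheap simulator: linear response in theta.
    return theta * x

# Synthetic field data with a systematic discrepancy 0.3*sin(x).
x_obs = np.linspace(0.0, 3.0, 15)
theta_true = 2.0
y_obs = (model(x_obs, theta_true)
         + 0.3 * np.sin(x_obs)
         + rng.normal(0.0, 0.1, x_obs.size))

# Grid posterior over theta under a flat prior. The noise variance is
# inflated to absorb the discrepancy -- a stand-in for the GP term.
theta_grid = np.linspace(0.5, 3.5, 301)
sigma2 = 0.1**2 + 0.3**2  # observation noise + discrepancy budget
log_lik = np.array([
    -0.5 * np.sum((y_obs - model(x_obs, t))**2) / sigma2
    for t in theta_grid
])
post = np.exp(log_lik - log_lik.max())
post /= post.sum()

theta_map = theta_grid[np.argmax(post)]
```

Because the discrepancy is systematic rather than random, the posterior mode is pulled slightly away from the generating value; quantifying and correcting exactly this effect is the point of the discrepancy term in the paper under discussion.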