Calibration, validation, and sensitivity analysis: What's what

T. G. Trucano, L. P. Swiler, T. Igusa, W. L. Oberkampf, M. Pilch

Research output: Contribution to journal › Article › peer-review

Abstract

One very simple interpretation of calibration is to adjust a set of parameters associated with a computational science and engineering code so that agreement between the model and a set of experimental data is maximized. One very simple interpretation of validation is to quantify our belief in the predictive capability of a computational code through comparison with a set of experimental data. Uncertainty in both the data and the code is important and must be mathematically understood for calibration and validation to be performed correctly. Sensitivity analysis, a key methodology within uncertainty analysis, is therefore important to both calibration and validation. In this paper, we intend to clarify the language just used and express some opinions on the associated issues. We will endeavor to identify some technical challenges that must be resolved for successful validation of a predictive modeling capability. One of these challenges is a formal description of a "model discrepancy" term. Another challenge revolves around the general adaptation of abstract learning theory as a formalism that potentially encompasses both calibration and validation in the face of model uncertainty.
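As a rough illustration of the abstract's working definitions, the sketch below calibrates a toy model against synthetic data by least squares and then probes local parameter sensitivities with finite differences. Everything here is an illustrative assumption: the exponential model, the synthetic data, the noise level, and the least-squares criterion are not taken from the paper, which treats calibration and sensitivity analysis at a far more general level.

    import numpy as np
    from scipy.optimize import least_squares

    rng = np.random.default_rng(0)

    def model(theta, x):
        # Hypothetical computational model: exponential decay with
        # rate theta[0] and scale theta[1].
        return theta[1] * np.exp(-theta[0] * x)

    # Synthetic "experimental" data: a true parameter set plus noise.
    x_obs = np.linspace(0.0, 2.0, 20)
    theta_true = np.array([1.5, 2.0])
    y_obs = model(theta_true, x_obs) + rng.normal(scale=0.05, size=x_obs.size)

    # Calibration in the abstract's simple sense: adjust parameters so
    # that agreement between model and data is maximized (here, by
    # minimizing least-squares residuals).
    def residuals(theta):
        return model(theta, x_obs) - y_obs

    fit = least_squares(residuals, x0=np.array([1.0, 1.0]))
    print("calibrated parameters:", fit.x)

    # A crude local sensitivity analysis: finite-difference derivatives
    # of the model output with respect to each parameter, evaluated at
    # the calibrated point.
    eps = 1e-6
    for i in range(fit.x.size):
        dtheta = np.zeros_like(fit.x)
        dtheta[i] = eps
        sens = (model(fit.x + dtheta, x_obs) - model(fit.x, x_obs)) / eps
        print(f"mean |dy/dtheta[{i}]| = {np.abs(sens).mean():.3f}")

A fuller treatment along the lines the abstract raises would add an explicit model discrepancy term to the relation between model output and data, rather than folding all model-data mismatch into parameter adjustment as this sketch does.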

Original language: English (US)
Pages (from-to): 1331-1357
Number of pages: 27
Journal: Reliability Engineering and System Safety
Volume: 91
Issue number: 10-11
State: Published - Oct 2006

ASJC Scopus subject areas

  • Safety, Risk, Reliability and Quality
  • Industrial and Manufacturing Engineering
