This talk was part of the series "Optimal Control and Decision Making Under Uncertainty for Digital Twins."
Robust Bayesian Optimal Experimental Design under Model Misspecification
Tommie Catanach, Sandia Livermore
Tuesday, October 28, 2025
Abstract:
Bayesian Optimal Experimental Design (BOED) has become a powerful tool for improving uncertainty quantification by strategically guiding data collection. However, the reliability of BOED depends critically on the validity of its underlying assumptions and the possibility of model discrepancy. In practice, the chosen data acquisition strategy may inadvertently reinforce prior assumptions or fail to consider model uncertainty, leading to biased inferences. These biases can be particularly severe because BOED often targets extreme parameter regions as the most “informative,” potentially magnifying the impact of model error.
In this talk, we present a new information criterion, Expected Generalized Information Gain (EGIG), that explicitly accounts for model discrepancy in BOED. EGIG augments standard Expected Information Gain by balancing the trade-off between experiment performance (i.e., how much information is gained) and robustness (i.e., how susceptible the design is to model misspecification). Concretely, EGIG measures how poorly inference under an incorrect model might perform relative to inference under a more appropriate model for the experiment. We will discuss the theoretical underpinnings of EGIG by situating it within a broader axiomatic framework for robust information objectives, as well as practical algorithms for incorporating it into BOED for nonlinear inference problems. These methods handle quantifiable discrepancies (e.g., low-fidelity vs. high-fidelity models), unknown discrepancies represented by a distribution of potential errors, and errors due to the inference procedure itself, thereby enhancing the robustness and reliability of BOED in real-world settings.
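For context, the standard Expected Information Gain that EGIG augments is EIG(d) = E_{θ,y}[log p(y | θ, d) − log p(y | d)], commonly estimated by nested Monte Carlo. The sketch below illustrates this baseline estimator on a toy linear-Gaussian model (the model, function names, and sample sizes are illustrative assumptions, not part of the talk; the EGIG criterion itself is not reproduced here):

```python
import numpy as np

def _logsumexp(a, axis):
    # Numerically stable log-sum-exp along an axis.
    m = np.max(a, axis=axis, keepdims=True)
    return np.squeeze(m, axis=axis) + np.log(np.sum(np.exp(a - m), axis=axis))

def eig_nested_mc(design, n_outer=2000, n_inner=2000, sigma=1.0, seed=0):
    """Nested Monte Carlo estimate of EIG(d) for a toy model:
    theta ~ N(0, 1),  y | theta, d ~ N(d * theta, sigma^2).
    EIG(d) = E_{theta,y}[ log p(y|theta,d) - log p(y|d) ]."""
    rng = np.random.default_rng(seed)
    norm_const = np.log(sigma * np.sqrt(2.0 * np.pi))

    # Outer loop: draw (theta, y) pairs from the joint prior predictive.
    theta = rng.standard_normal(n_outer)
    y = design * theta + sigma * rng.standard_normal(n_outer)

    # log p(y_i | theta_i, d) for each simulated pair.
    log_lik = -0.5 * ((y - design * theta) / sigma) ** 2 - norm_const

    # Inner loop: estimate the marginal log p(y_i | d) with fresh prior draws.
    theta_in = rng.standard_normal(n_inner)
    log_lik_in = (-0.5 * ((y[:, None] - design * theta_in[None, :]) / sigma) ** 2
                  - norm_const)
    log_marg = _logsumexp(log_lik_in, axis=1) - np.log(n_inner)

    return float(np.mean(log_lik - log_marg))
```

In this linear-Gaussian case the exact value is 0.5 * log(1 + d²/σ²), which provides a check on the estimator; EGIG-style robustness would enter by additionally penalizing designs whose inferences degrade under a misspecified likelihood.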
SNL is managed and operated by NTESS under DOE NNSA contract DE-NA0003525.