Our paper on Bayesian calibration of multi-level models with unobservable distributed response has been published in the MSSP journal. This is a joint effort with members of the groups of Prof. Todd and Prof. Conte at UCSD and Prof. Parno at Dartmouth College. Get the paper here.
This paper proposes a Bayesian calibration framework for multi-level simulation models, in which an unobservable distributed model is calibrated using measurements of an observable model. In the proposed framework, the model discrepancy of the unobservable model with a distributed response is first represented as a series of orthogonal polynomials, with the polynomial coefficients modelled by surrogate models with unknown hyper-parameters. A two-phase machine learning method is then developed to construct the surrogate models of the polynomial coefficients from measurements of the observable model. The constructed model discrepancy is finally used to update the uncertain model parameters following a modularized Bayesian calibration scheme. The developed framework is applied to the joint Bayesian calibration of an uncertain gap length and an unobservable, distributed boundary-condition model for a miter gate problem. Results of the miter gate application demonstrate the efficacy of the proposed framework.
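To give a rough flavour of the core idea, here is a minimal sketch (not the paper's implementation): a distributed discrepancy over a normalized coordinate is written as a truncated Legendre series, and each coefficient is predicted by a surrogate (here a Gaussian process) as a function of a calibration input. All names, the training data, and the choice of GP surrogate are illustrative assumptions.

```python
# Minimal, hypothetical sketch of the idea: delta(s | theta) ~ sum_i c_i(theta) * P_i(s),
# where P_i are Legendre polynomials and each coefficient c_i(theta) is
# approximated by a Gaussian-process surrogate. Not the authors' code.
import numpy as np
from numpy.polynomial import legendre
from sklearn.gaussian_process import GaussianProcessRegressor

n_terms = 4                       # truncation order of the polynomial series
s = np.linspace(-1.0, 1.0, 200)   # normalized coordinate of the distributed response

# Hypothetical training data: for a few input settings theta, polynomial
# coefficients of the discrepancy (in the paper these would come from
# measurements of the observable model via the two-phase learning method).
theta_train = np.linspace(0.0, 1.0, 10).reshape(-1, 1)
coeffs_train = np.column_stack([
    0.50 * theta_train.ravel(),                 # c0
    0.20 * np.sin(3.0 * theta_train.ravel()),   # c1
    0.10 * theta_train.ravel() ** 2,            # c2
    0.05 * np.ones_like(theta_train.ravel()),   # c3
])

# One surrogate per polynomial coefficient, trained on (theta, c_i) pairs.
surrogates = [
    GaussianProcessRegressor(normalize_y=True).fit(theta_train, coeffs_train[:, i])
    for i in range(n_terms)
]

def discrepancy(theta, s):
    """Predict the distributed discrepancy delta(s | theta) from the surrogates."""
    c = np.array([gp.predict(np.atleast_2d(theta))[0] for gp in surrogates])
    return legendre.legval(s, c)  # sum_i c_i(theta) * P_i(s)

delta = discrepancy(0.35, s)
print(delta.shape)  # (200,)
```

In the actual framework, the predicted discrepancy field would then feed into a modularized Bayesian update of the uncertain model parameters, rather than being used directly as above.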