March 16, 2012, Webb 1100
Caltech, Applied & Computational Mathematics and Control & Dynamical Systems
Although Uncertainty Quantification (UQ) appears to be an umbrella term, interactions with engineers show that behind this term lies a real challenge of practical importance that is not addressed by current methods in classical probability theory and statistics. We show how this challenge can be formulated as an optimization problem over an infinite-dimensional (possibly non-separable) set of functions and probability measures. These problems correspond to extremizing probabilities of failure, or of deviations, subject to the constraints imposed by the scenarios compatible with the assumptions and information. Although these optimization problems are extremely large, we show that under general conditions they admit finite-dimensional reductions. We show how these reduction theorems lead to Optimal Concentration Inequalities of Hoeffding and McDiarmid type, and how they can be applied to complex systems such as the Caltech surrogate model for hypervelocity impact and to the seismic safety assessment of truss structures. The resulting optimal bounds on probabilities of failure/deviations show that uncertainties in input parameters, which propagate to output uncertainties in the classical sensitivity analysis paradigm, may not do so when the transfer functions (or probability distributions) are imperfectly known. When the available information includes legacy data (which need not carry information about the probability distribution of the system in operation), the solutions of these optimization problems depend non-trivially (even discontinuously) upon the specified data. Furthermore, the extreme values are often determined by only a few members of the data set; in our principal physically-motivated example, the optimal bounds are determined by just 2 out of 32 data points, and the remainder carry no information and could be neglected without changing the final answer. When the data carries information about the
probability distribution of the system in operation, the proposed framework can be generalized to lead to the scientific computation of optimal statistical estimators. This generalization poses a major challenge: the optimization variables live in infinite-dimensional spaces of measures over infinite-dimensional spaces of measures and functions, whereas calculus on a computer is necessarily discrete and finite.
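As a point of reference for the concentration inequalities mentioned above, the following sketch evaluates the classical McDiarmid bound (not the talk's optimal variant) for a function whose sensitivity to each input is bounded by a diameter c_i, and checks it against a crude Monte Carlo estimate in the special case of a sample mean, where it reduces to the Hoeffding bound. The function name and the illustrative parameters are our own, not from the talk.

```python
import math
import random

def mcdiarmid_bound(diameters, t):
    """Classical McDiarmid inequality: P(f(X) - E[f(X)] >= t) <= exp(-2 t^2 / sum c_i^2),
    where c_i bounds the change in f when only the i-th input varies."""
    return math.exp(-2.0 * t ** 2 / sum(c * c for c in diameters))

# Example: f is the sample mean of n independent variables in [0, 1],
# so each diameter is c_i = 1/n and the bound reduces to Hoeffding's exp(-2 n t^2).
n, t = 100, 0.1
bound = mcdiarmid_bound([1.0 / n] * n, t)

# Crude Monte Carlo check that the bound dominates the actual tail probability
# for uniform [0, 1] inputs (mean 0.5).
random.seed(0)
trials = 20000
exceed = sum(
    (sum(random.random() for _ in range(n)) / n - 0.5) >= t
    for _ in range(trials)
)
print(f"McDiarmid/Hoeffding bound:  {bound:.4f}")
print(f"Empirical tail frequency:   {exceed / trials:.4f}")
```

The gap between the bound and the empirical frequency is exactly the slack that the optimization framework described in the abstract aims to close, by computing the best possible bound compatible with the stated assumptions.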
Various parts of this talk are joint work with C. Scovel, T. Sullivan, M. McKerns, M. Ortiz, D. Meyer, and F. Theil.
Houman Owhadi, PhD, is a professor of Applied and Computational Mathematics and Control and Dynamical Systems at the California Institute of Technology. His research interests are in homogenization and multiscale analysis, probability theory, stochastic mechanics, molecular dynamics, and uncertainty quantification. He is the head of the uncertainty quantification group of the Caltech Predictive Science Academic Alliance Program.