Event Description
Paul Dupuis, Brown University
Abstract: We consider the problem of characterizing system performance, optimization, and control when parts of a stochastic system cannot or will not be precisely identified. Specifically, we formulate the general problem as quantifying the errors that result when a probabilistic “nominal” model P (which we consider the design model, or the computational model) is used in place of the “true” model Q. The focus is on how these problems may be addressed using information divergences between Q and P, and in particular on how variational formulas may be used to bound the difference between performance measures under Q (the true model) and under P (the design model) in terms of the divergence.
The most natural example of such a variational formula is the one that links exponential integrals, ordinary integrals, and relative entropy (or Kullback-Leibler divergence).
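For orientation, a standard statement of this duality (the notation here is ours, not the abstract's) reads: for a probability measure P and a bounded measurable function g,

\[
\log \int e^{g} \, dP \;=\; \sup_{Q} \left\{ \int g \, dQ \;-\; R(Q \,\|\, P) \right\},
\]

where R(Q \| P) denotes the relative entropy of Q with respect to P and the supremum is taken over all probability measures Q; it is attained at dQ^{*} = e^{g} \, dP / \int e^{g} \, dP.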
We review some useful properties of this duality and discuss how it can be used to obtain sensitivity bounds, to study optimization under model uncertainty, and to address related issues; a sketch of how the duality yields a sensitivity bound is given below.
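As one illustration (a standard manipulation; the precise bounds presented in the talk may differ), applying the formula above with g = c(f - \int f \, dP) for a bounded performance measure f and any c > 0 gives

\[
\int f \, dQ \;-\; \int f \, dP \;\le\; \inf_{c>0} \frac{1}{c} \left[ \log \int e^{\,c\left(f - \int f \, dP\right)} \, dP \;+\; R(Q \,\|\, P) \right],
\]

so the error incurred by computing under the design model P rather than the true model Q is controlled by a moment-generating-function term evaluated under P together with the divergence R(Q \| P); taking c < 0 yields the corresponding lower bound.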
We also recall its relation to H-infinity control, a well-known deterministic method for handling model-form uncertainty. The last part of the talk will focus on limitations of the use of relative entropy in this context and will describe alternative divergences that may overcome some of these limitations.