Speaker: Ralph Tyrrell Rockafellar (University of Washington)
Time: Dec 6, 2019, 10:00-11:30
Location: Conference Room 415, Hui Yuan 3#
Ralph Tyrrell Rockafellar, born in 1935, studied mathematics at Harvard University, earning his bachelor's degree in 1957 and his PhD in 1963. He was Professor at the University of Washington (Seattle) from 1971 to 2002 and has been professor emeritus there since 2003. He was elected a Fellow of the Institute for Operations Research and the Management Sciences (INFORMS). For his contributions to convex optimization, nonsmooth analysis, and stochastic programming, Rockafellar received the George B. Dantzig Prize in 1982 and the Frederick W. Lanchester Prize in 1997, and was awarded the John von Neumann Theory Prize in 1999. He is one of the leading scholars in optimization theory and related fields of analysis and combinatorics. His monographs "Convex Analysis" and "Variational Analysis" are classic textbooks for scholars engaged in nonlinear analysis and optimization theory.
Problems of optimization are concerned with making decisions "optimally". In many situations in management, finance, and engineering, however, decisions must be made without knowing fully how they will play out in the future. When the future is modeled probabilistically, this leads to stochastic optimization, yet the formulation of objectives and constraints can be far from obvious. A future cost or hazard may be a random variable that a present decision can influence to some extent, but perhaps only by shaping its distribution in a limited way. For instance, it may be desirable to keep a hazard below a particular threshold, as in building a bridge to resist earthquakes and floods, and yet it may be impossible or too expensive to guarantee that the threshold will never be breached. One therefore needs a standard according to which a cost or hazard is "adequately" below the desired threshold in line with its probability distribution. That is the role of so-called "measures of risk," which were first developed for purposes like assessing the solvency of banks but are now utilized much more widely. Measures of risk also offer fresh ways of dealing with reliability constraints, such as those traditionally imposed in engineering as bounds on the probability of failure of manufactured components. Probability of failure has troublesome mathematical behavior in an optimization environment. There is now a substitute, called buffered probability of failure, which makes better sense and is much easier to work with computationally.
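To give a concrete feel for the contrast the abstract draws, here is a minimal Monte Carlo sketch. It assumes the minimization formula for buffered probability of failure found in the literature on this topic, bPOE(x) = min over a >= 0 of E[max(0, a(X - x) + 1)], estimated by sample averages; the function names, the standard-normal hazard model, and the use of NumPy/SciPy are illustrative choices, not anything specified by the speaker.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def prob_of_failure(samples: np.ndarray, threshold: float) -> float:
    """Ordinary probability of failure: the fraction of samples above the threshold.
    As a function of the decision this is discontinuous, which is what makes it
    troublesome inside an optimization model."""
    return float(np.mean(samples > threshold))

def buffered_prob_of_failure(samples: np.ndarray, threshold: float) -> float:
    """Buffered probability of failure, estimated via the one-dimensional
    minimization bPOE(x) = min_{a >= 0} E[max(0, a*(X - x) + 1)].
    The inner expectation is convex in a, so a bounded scalar search suffices."""
    def objective(a: float) -> float:
        return float(np.mean(np.maximum(0.0, a * (samples - threshold) + 1.0)))
    res = minimize_scalar(objective, bounds=(0.0, 1e6), method="bounded")
    # a = 0 gives objective value 1, so the estimate never exceeds 1
    return min(res.fun, 1.0)

# Illustrative use: a standard-normal "hazard" with failure threshold 2.
rng = np.random.default_rng(0)
losses = rng.normal(0.0, 1.0, size=100_000)
pof = prob_of_failure(losses, 2.0)
bpof = buffered_prob_of_failure(losses, 2.0)
# bPOE counts not just the tail event but how severe the tail is,
# so it is a conservative upper bound: pof <= bpof <= 1.
```

The point of the sketch is the shape of the computation: the ordinary probability of failure is an average of indicator functions, while its buffered counterpart is the minimum of a convex sample average, which is far friendlier to optimization algorithms.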