Current results in bounding the expectation of convex functions in one and in multiple dimensions are studied in the context of moment problems and semi-infinite programs. This framework allows the introduction of efficient upper bounds. Traditional bounds can be refined by incorporating explicit second-order information into the moment problem. We introduce a class of convex functions in one dimension for which this moment problem reduces to a constrained line search that can be solved using current superlinearly convergent techniques. Some functions in this class can be characterized by sufficient conditions on the first derivative. When the function is piecewise linear with two slopes, we provide a closed-form analytical solution to the moment problem with explicit second-order information. We apply these results to design an upper bounding scheme for general convex functions in one dimension over finite ranges. This upper bound is tighter than the traditional unrefined Edmundson-Madansky upper bound without significantly increasing the computational requirements. We use these results in conjunction with a sublinearization approximation to design an upper bound on the expectation of sublinear functions in multiple dimensions without an independence condition on the random variables. The number of functional evaluations required is linear in the dimension of the problem. A stochastic program with fixed recourse is given as an example of the use of such a bounding method.
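As context for the refinement described above, the classical (unrefined) Edmundson-Madansky bound for a convex function f on a finite range [a, b] with mean mu evaluates the chord of f through its endpoints at the mean: E[f(X)] <= ((b - mu) f(a) + (mu - a) f(b)) / (b - a). The sketch below illustrates only this standard first-order bound, paired with Jensen's lower bound f(mu); it is not the second-order refinement developed in the dissertation, and the function names are illustrative.

```python
# Classical (unrefined) Edmundson-Madansky upper bound for convex f on [a, b]:
# the chord through (a, f(a)) and (b, f(b)), evaluated at the mean mu, lies
# above f at mu-weighted averages, so it upper-bounds E[f(X)].
def edmundson_madansky_bound(f, a, b, mu):
    """Upper bound on E[f(X)] using only the support [a, b] and the mean mu."""
    return ((b - mu) * f(a) + (mu - a) * f(b)) / (b - a)

def jensen_bound(f, mu):
    """Jensen's inequality gives the matching lower bound f(mu) <= E[f(X)]."""
    return f(mu)

# Example: f(x) = x^2 with X uniform on [0, 1], so mu = 0.5 and E[f(X)] = 1/3.
f = lambda x: x * x
lower = jensen_bound(f, 0.5)                        # 0.25
upper = edmundson_madansky_bound(f, 0.0, 1.0, 0.5)  # 0.5
print(lower, upper)  # 0.25 <= 1/3 <= 0.5
```

Incorporating second-order (variance) information, as the dissertation does, tightens the upper value toward the true expectation at modest extra cost.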
Ph.D.
Industrial engineering
University of Michigan
http://deepblue.lib.umich.edu/bitstream/2027.42/161384/1/8712102.pdf