Asymptotic formulas for the derivatives of probability functions and their Monte Carlo estimations
One of the key problems in chance constrained programming for nonlinear optimization problems is the evaluation of derivatives of joint probability functions of the form P(x) = P(g(x, Λ) ≤ c). Here x is the vector of physical parameters, Λ is a random vector describing the uncertainty of the model, g is the constraints mapping, and c is the vector of constraint levels. In this paper specific Monte Carlo tools for the estimation of the gradient and Hessian of P(x) are proposed when the input random vector Λ has a multivariate normal distribution and small variances. Using the small variance hypothesis, approximate expressions for the first- and second-order derivatives are obtained, whose Monte Carlo estimations have low computational costs. The number of calls of the constraints mapping g for the proposed estimators of the gradient and Hessian of P(x) is only 1 + 2N_x + 2N_Λ, where N_x and N_Λ are the dimensions of x and Λ. These tools are implemented in penalized optimization routines adapted to stochastic optimization, and are shown to reduce the computational cost of chance constrained programming substantially.
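For orientation, the probability function P(x) itself admits a crude Monte Carlo estimator by sampling Λ and averaging the indicator of the joint event g(x, Λ) ≤ c. This is a baseline sketch only, not the paper's low-cost asymptotic estimators; the constraint mapping g, the toy levels c, and all parameter values below are hypothetical illustrations.

```python
import numpy as np

def probability_estimate(x, g, c, mean, cov, n_samples=10_000, seed=None):
    """Crude Monte Carlo estimate of P(x) = P(g(x, Lambda) <= c)
    for a multivariate normal Lambda ~ N(mean, cov).

    g(x, lam) must accept a batch of samples lam with shape
    (n_samples, dim) and return constraint values whose last axis
    matches the levels c; the joint event requires every component
    to lie below its level.
    """
    rng = np.random.default_rng(seed)
    lam = rng.multivariate_normal(mean, cov, size=n_samples)
    # Indicator of the joint event g(x, Lambda) <= c, componentwise.
    inside = np.all(g(x, lam) <= c, axis=-1)
    return inside.mean()

# Toy example: one linear constraint g(x, lam) = lam - x with level 0,
# small variance as in the paper's setting; the true value is 0.5.
g = lambda x, lam: lam - x
p = probability_estimate(np.array([0.0]), g, c=np.array([0.0]),
                         mean=np.zeros(1), cov=0.01 * np.eye(1),
                         n_samples=50_000, seed=0)
```

Each evaluation of this estimator costs n_samples calls of g; the point of the paper's small-variance asymptotic formulas is that the gradient and Hessian can instead be estimated with only 1 + 2N_x + 2N_Λ calls.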
Year of publication: 
2009


Authors: Garnier, Josselin; Omrane, Abdennebi; Rouchdy, Youssef
Published in: 
European Journal of Operational Research. Elsevier, ISSN 0377-2217. Vol. 198 (2009), No. 3, pp. 848-858

Publisher: 
Elsevier 
Keywords: Applied probability; Monte Carlo methods; Stochastic programming; Optimization with constraints; Random constraints
Similar items by person

Asymptotic formulas for the derivatives of probability functions and their Monte Carlo estimations
Garnier, Josselin, (2009)

Anticipations effects in endogeneous probability-migration models
Garçon, Manuel, (2010)