Solving Stochastic Optimization by Newton-type methods with Dimension-Adaptive Sparse Grid Quadrature

02/21/2022
by Yuancheng Zhou, et al.

Stochastic optimisation problems minimise expectations of random cost functions. We use the 'optimise then discretise' approach to solve them. In this approach, accurate quadrature methods are required to evaluate the objective, gradient, and Hessian, which are in fact integrals over the random parameters. We apply dimension-adaptive sparse grid quadrature to approximate these integrals when the problem is high-dimensional. Dimension-adaptive sparse grid quadrature achieves high accuracy and efficiency for integrals with smooth integrands; it generalises the classical sparse grid method by refining each dimension according to its importance. We show that dimension-adaptive sparse grid quadrature performs better within the 'optimise then discretise' approach than within the 'discretise then optimise' approach.
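To illustrate the idea of refining dimensions by importance, here is a minimal Python sketch of a Gerstner–Griebel-style dimension-adaptive quadrature. It is not the authors' implementation: for simplicity it uses a nested trapezoid rule on [0, 1] per dimension (production codes typically use Clenshaw–Curtis or Gauss–Patterson rules), and all function and variable names are illustrative. The algorithm maintains an index set, evaluates the hierarchical (difference) contribution of each multi-index, and greedily refines the index with the largest contribution.

```python
import itertools
import math

def trapezoid_rule(level):
    """Nested 1D rule on [0, 1]: level 0 is the midpoint,
    level l uses 2**l + 1 equispaced points (trapezoid weights)."""
    if level == 0:
        return [0.5], [1.0]
    n = 2 ** level + 1
    h = 1.0 / (n - 1)
    nodes = [i * h for i in range(n)]
    weights = [h] * n
    weights[0] = weights[-1] = h / 2
    return nodes, weights

def delta_rule(level):
    """Difference rule Delta_l = Q_l - Q_{l-1} as a node -> weight map.
    Nodes are exact binary fractions, so nested nodes merge exactly."""
    nodes, w = trapezoid_rule(level)
    d = dict(zip(nodes, w))
    if level > 0:
        for x, wx in zip(*trapezoid_rule(level - 1)):
            d[x] = d.get(x, 0.0) - wx
    return d

def index_contribution(f, idx):
    """Tensor-product contribution of one multi-index idx."""
    rules = [delta_rule(l) for l in idx]
    total = 0.0
    for combo in itertools.product(*(r.items() for r in rules)):
        point = [x for x, _ in combo]
        weight = math.prod(wx for _, wx in combo)
        total += weight * f(point)
    return total

def adaptive_sparse_quad(f, dim, tol=1e-6, max_iter=300):
    """Greedy dimension-adaptive sparse grid quadrature of f on [0,1]^dim."""
    start = (0,) * dim
    old, active = set(), {start}
    contrib = {start: index_contribution(f, start)}
    integral = contrib[start]
    for _ in range(max_iter):
        if sum(abs(contrib[i]) for i in active) < tol:
            break
        # select the active index with the largest error indicator
        idx = max(active, key=lambda i: abs(contrib[i]))
        active.remove(idx)
        old.add(idx)
        for d in range(dim):
            fwd = tuple(idx[k] + (k == d) for k in range(dim))
            # admissible if every backward neighbour is already refined
            back_ok = all(
                tuple(fwd[k] - (k == j) for k in range(dim)) in old
                for j in range(dim) if fwd[j] > 0
            )
            if back_ok and fwd not in contrib:
                contrib[fwd] = index_contribution(f, fwd)
                active.add(fwd)
                integral += contrib[fwd]
    return integral
```

For an anisotropic integrand the greedy loop spends most of its budget on the important dimensions; for a smooth isotropic one it recovers a classical sparse grid pattern.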
