Sequential Optimization in Locally Important Dimensions
Optimizing a black-box function is challenging when the underlying function is non-linear and depends on a large number of input variables. Expected improvement (EI) algorithms balance exploration of the design space with identification of the global maximizer, but they struggle in high dimensions. Reducing the dimension of the design space to include only the most important variables improves estimation of the optimum and leads to faster identification of the global maximizer. Current variable selection techniques for computer experiments are global, meaning a variable is either included in or excluded from the fitted model describing the unknown function. In local neighborhoods around the global maximizer, however, only a few of the globally active variables are important. In this paper, we incorporate Bayesian global and local variable selection procedures within a sequential design and analysis approach called Sequential Optimization in Locally Important Dimensions (SOLID) to efficiently identify the global maximizer of a high-dimensional function. A simulation study across various test functions shows that SOLID outperforms standard applications of EI and finds better estimates of the global maximizer in a fixed number of sequential evaluations.
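To make the baseline concrete, the sketch below illustrates one standard EI loop with a Gaussian process surrogate (the approach SOLID improves on, not SOLID itself). The toy objective, candidate-set size, and kernel choice are assumptions for illustration only.

```python
# Illustrative sketch of standard EI-based sequential optimization (NOT the
# paper's SOLID algorithm). The toy objective and settings are assumptions.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expected_improvement(X_cand, gp, y_best, xi=0.01):
    """EI for maximization: E[max(f(x) - y_best - xi, 0)] under the GP posterior."""
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-12)           # guard against zero predictive variance
    z = (mu - y_best - xi) / sigma
    return (mu - y_best - xi) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
f = lambda X: -np.sum((X - 0.3) ** 2, axis=1)  # toy objective with maximum at x = 0.3

d = 5                                          # assumed input dimension
X = rng.uniform(size=(10, d))                  # initial space-filling-style design
y = f(X)

for _ in range(20):                            # sequential EI evaluations
    gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X, y)
    X_cand = rng.uniform(size=(2000, d))       # random candidate set
    x_next = X_cand[np.argmax(expected_improvement(X_cand, gp, y.max()))]
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next.reshape(1, -1)))

print("best value found:", y.max())
```

As the input dimension grows, the GP surrogate and the EI search both degrade; SOLID addresses this by restricting attention to locally important dimensions near the current estimate of the maximizer.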