Correcting boundary over-exploration deficiencies in Bayesian optimization with virtual derivative sign observations

04/04/2017
by Eero Siivola, et al.

Bayesian optimization (BO) is a global optimization strategy designed to find the minimum of an expensive black-box function, typically defined on a continuous subset of R^d, by using a Gaussian process (GP) as a surrogate model for the objective. Although currently available acquisition functions address this goal with different degrees of success, an over-exploration effect of the boundary of the search space is typically observed. However, in problems like the configuration of machine learning algorithms, the function domain is conservatively large and, with high probability, the global minimum does not sit on the boundary. We propose a method to incorporate this knowledge into the search process by adding virtual derivative observations to the GP at the borders of the search space. We use the properties of GPs to impose conditions on the partial derivatives of the objective. The method is applicable with any acquisition function, is easy to use, and consistently reduces the number of evaluations required to optimize the objective, irrespective of the acquisition used. We illustrate the benefits of our approach in an extensive experimental comparison.
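To make the core idea concrete, here is a minimal 1-D sketch of conditioning a GP on derivative information at the borders. It is not the paper's method: the paper uses virtual derivative *sign* observations with a non-Gaussian (probit) likelihood handled by approximate inference, whereas this sketch approximates them as noisy Gaussian observations of the partial derivative itself, which is enough to show the mechanism. The kernel derivative formulas are the standard ones for the squared-exponential kernel; all function names, data points, and hyperparameter values are illustrative assumptions.

```python
import numpy as np

def k(x1, x2, s2=1.0, ell=0.3):
    # Squared-exponential kernel k(x, x') = s2 * exp(-(x - x')^2 / (2 ell^2)).
    return s2 * np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2 / ell**2)

def k_df(x1, x2, s2=1.0, ell=0.3):
    # Cross-covariance cov(f(x), f'(x')) = dk/dx' = k(x, x') * (x - x') / ell^2.
    return k(x1, x2, s2, ell) * (x1[:, None] - x2[None, :]) / ell**2

def k_dfdf(x1, x2, s2=1.0, ell=0.3):
    # cov(f'(x), f'(x')) = d^2 k / (dx dx') = k * (1/ell^2 - (x - x')^2 / ell^4).
    d = x1[:, None] - x2[None, :]
    return k(x1, x2, s2, ell) * (1.0 / ell**2 - d**2 / ell**4)

def posterior_mean(xs, X, y, Xv, g, noise=1e-4, gnoise=0.1):
    # Joint GP over function values at X and derivative values at Xv.
    Kff = k(X, X) + noise * np.eye(len(X))
    Kfg = k_df(X, Xv)                       # cov(f(X), f'(Xv))
    Kgg = k_dfdf(Xv, Xv) + gnoise * np.eye(len(Xv))
    K = np.block([[Kff, Kfg], [Kfg.T, Kgg]])
    # Cross-covariance of test points with both observation types.
    Ks = np.hstack([k(xs, X), k_df(xs, Xv)])
    return Ks @ np.linalg.solve(K, np.concatenate([y, g]))

# Two ordinary function evaluations inside [0, 1] ...
X = np.array([0.35, 0.6]); y = np.array([0.2, -0.1])
# ... plus virtual derivative observations at the borders: if the minimum lies
# in the interior, the objective slopes down into the domain, so f'(0) < 0 and
# f'(1) > 0 (the magnitudes here are arbitrary placeholders).
Xv = np.array([0.0, 1.0]); g = np.array([-1.0, 1.0])
xs = np.linspace(0.0, 1.0, 5)
print(posterior_mean(xs, X, y, Xv, g))
```

Because the virtual observations pull the posterior mean upward near the borders, any acquisition function built on this posterior becomes less inclined to spend evaluations there, which is the behavior the method exploits.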
