Few-shots Parameter Tuning via Co-evolution

07/01/2020
by   Ke Tang, et al.

Generalization, i.e., the ability to address problem instances that were not available during the system design and development phase, is a critical goal for intelligent systems. A typical way to achieve good generalization is to exploit a large amount of data to train a model. In the context of heuristic search, such a paradigm is termed parameter tuning or algorithm configuration, i.e., configuring the parameters of a search method based on a set of "training" problem instances. However, compared to its counterpart in machine learning, parameter tuning more often suffers from a lack of training instances, and the obtained configuration may fail to generalize. This paper suggests competitive co-evolution as a remedy to this challenge and proposes a framework named Co-Evolution of Parameterized Search (CEPS). By alternately evolving a configuration population and an instance population, CEPS is capable of obtaining generalizable configurations with few training instances. The advantage of CEPS in improving generalization is shown analytically. Two concrete instantiations, namely CEPS-TSP and CEPS-VRPSPDTW, are also presented for the Traveling Salesman Problem (TSP) and the Vehicle Routing Problem with Simultaneous Pickup-Delivery and Time Windows (VRPSPDTW), respectively. Computational results on the two problems confirm the advantages of CEPS over state-of-the-art parameter tuning methods.
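To make the alternating scheme concrete, the following is a minimal Python sketch of a CEPS-style competitive co-evolution loop. It is not the authors' implementation: the toy parameterized search (a fixed-budget hill-climber whose step size is the tunable parameter), the toy instances (shifted one-dimensional quadratics), and all hyper-parameters are assumptions made purely for illustration.

```python
"""Minimal sketch of a CEPS-style competitive co-evolution loop.

Assumptions (not from the paper): the "search method" is a fixed-budget
hill-climber tuned by a single step-size parameter, an "instance" is the
shift of a 1-D quadratic, and all population sizes / budgets are arbitrary.
"""
import random

BUDGET = 50        # evaluations given to the hill-climber per run
POP_CONF = 8       # size of the configuration population
POP_INST = 8       # size of the instance population
GENERATIONS = 20

def run_heuristic(step_size, instance):
    """Toy parameterized search: hill-climb f(x) = (x - shift)^2."""
    shift = instance
    x, best = 0.0, (0.0 - shift) ** 2
    for _ in range(BUDGET):
        cand = x + random.uniform(-step_size, step_size)
        val = (cand - shift) ** 2
        if val < best:
            x, best = cand, val
    return best    # lower is better

def perf(config, instances):
    """Average solution quality of a configuration over an instance set."""
    return sum(run_heuristic(config, i) for i in instances) / len(instances)

configs = [random.uniform(0.01, 5.0) for _ in range(POP_CONF)]
instances = [random.uniform(-10.0, 10.0) for _ in range(POP_INST)]

for gen in range(GENERATIONS):
    # Phase 1: evolve configurations against the current instance population.
    offspring = [max(0.01, c + random.gauss(0, 0.5)) for c in configs]
    configs = sorted(configs + offspring, key=lambda c: perf(c, instances))[:POP_CONF]

    # Phase 2: evolve instances to be hard for the current best configuration.
    hard_offspring = [i + random.gauss(0, 2.0) for i in instances]
    best_cfg = configs[0]
    instances = sorted(instances + hard_offspring,
                       key=lambda i: -run_heuristic(best_cfg, i))[:POP_INST]

print("tuned step size:", round(configs[0], 3))
```

The key design point the sketch tries to capture is the alternation: configurations are selected for good performance on the current instance population, while instances are selected to be adversarially hard for the current configurations, which pushes the tuned configuration toward generalizing beyond the initial training set.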
