Fast Hyperparameter Tuning for Ising Machines
In this paper, we propose a novel technique to accelerate hyperparameter tuning for Ising machines. First, we define Ising machine performance and explain the goal of hyperparameter tuning with respect to this performance definition. Second, we compare well-known hyperparameter tuning techniques, namely random sampling and the Tree-structured Parzen Estimator (TPE), on different combinatorial optimization problems. Third, we propose a new convergence acceleration method for TPE, which we call "FastConvergence". It aims to limit the number of TPE trials required to reach the best-performing combination of hyperparameter values. We compare FastConvergence to the previously mentioned hyperparameter tuning techniques to show its effectiveness. For the experiments, well-known Traveling Salesman Problem (TSP) and Quadratic Assignment Problem (QAP) instances are used as input, and the Ising machine used is Fujitsu's third-generation Digital Annealer (DA). Results show that, in most cases, FastConvergence reaches results similar to those of TPE alone in less than half the number of trials.
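To make the comparison between random sampling and TPE concrete, the sketch below shows how such a tuning loop might be set up with the Optuna library. This is an illustrative assumption, not the paper's setup: the toy objective, the hyperparameter names (num_sweeps, temperature), and their ranges are hypothetical stand-ins for actual Ising machine runs on TSP/QAP instances.

```python
# Hypothetical sketch (not the paper's code): comparing random sampling and TPE
# for tuning solver hyperparameters with the Optuna library. The toy "solver"
# below is a stand-in for an Ising machine call; the paper itself uses
# Fujitsu's third-generation Digital Annealer on TSP/QAP instances.
import random
import optuna


def toy_ising_solve(num_sweeps, temperature, seed=0):
    """Stand-in for an Ising machine run: returns a pseudo 'best energy'
    that improves with more sweeps and has a sweet spot in temperature."""
    rng = random.Random(seed)
    penalty = (temperature - 1.5) ** 2           # fictitious optimum near 1.5
    noise = rng.gauss(0, 0.05)
    return penalty + 10.0 / num_sweeps + noise   # lower is better


def objective(trial):
    # Hyperparameters to tune; names and ranges are illustrative only.
    num_sweeps = trial.suggest_int("num_sweeps", 100, 10_000, log=True)
    temperature = trial.suggest_float("temperature", 0.1, 10.0, log=True)
    return toy_ising_solve(num_sweeps, temperature)


# TPE study (Optuna's default sampler is TPE; made explicit here).
tpe_study = optuna.create_study(direction="minimize",
                                sampler=optuna.samplers.TPESampler(seed=42))
tpe_study.optimize(objective, n_trials=100)

# Random-sampling baseline with the same trial budget.
rnd_study = optuna.create_study(direction="minimize",
                                sampler=optuna.samplers.RandomSampler(seed=42))
rnd_study.optimize(objective, n_trials=100)

print("TPE best:   ", tpe_study.best_value, tpe_study.best_params)
print("Random best:", rnd_study.best_value, rnd_study.best_params)
```

In this framing, a convergence-acceleration scheme such as the proposed FastConvergence would aim to stop or steer the TPE study well before the full trial budget is spent, once further trials are unlikely to improve the best hyperparameter combination found so far.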