Model-free Neural Counterfactual Regret Minimization with Bootstrap Learning

12/03/2020
by Weiming Liu, et al.

Counterfactual Regret Minimization (CFR) has achieved many fascinating results in solving large-scale Imperfect Information Games (IIGs). Neural CFR is one of the promising techniques that can effectively reduce the computation and memory consumption of CFR by generalizing decision information between similar states. However, current neural CFR algorithms have to approximate the cumulative variables across iterations with neural networks, which usually results in large estimation variance given the huge complexity of IIGs. Moreover, model-based sampling and inefficient training make current neural CFR algorithms computationally expensive. In this paper, a new model-free neural CFR algorithm with bootstrap learning is proposed, in which a Recursive Substitute Value (RSV) network is trained to replace the cumulative variables in CFR. The RSV is defined recursively and can be estimated independently in every iteration using bootstrapping, so there is no longer any need to track or approximate the cumulative variables. Based on the RSV, the new neural CFR algorithm is model-free and has higher training efficiency. Experimental results show that the new algorithm matches the performance of state-of-the-art neural CFR algorithms at a lower training cost.
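
The sketch below is an illustration of the general ideas the abstract refers to, not the authors' implementation: it contrasts the cumulative variables that tabular CFR tracks (cumulative regrets and the cumulative strategy, with the current strategy obtained by regret matching) with a per-iteration, bootstrapped target in the spirit of a recursively defined substitute value. The toy information set, the scalar stand-in for a network, and the `bootstrap_target` helper are all assumptions made for illustration.

```python
# Minimal sketch (assumptions noted above), not the paper's algorithm.
import numpy as np

def regret_matching(cum_regret):
    """Standard CFR: derive the current strategy from cumulative regrets."""
    positive = np.maximum(cum_regret, 0.0)
    total = positive.sum()
    if total > 0:
        return positive / total
    return np.full_like(cum_regret, 1.0 / len(cum_regret))

# --- Tabular CFR bookkeeping: the cumulative variables neural CFR must approximate ---
num_actions = 3
cum_regret = np.zeros(num_actions)      # accumulated across iterations
cum_strategy = np.zeros(num_actions)    # accumulated across iterations

rng = np.random.default_rng(0)
for t in range(100):
    strategy = regret_matching(cum_regret)
    # Hypothetical counterfactual action values at one information set.
    action_values = rng.normal(size=num_actions)
    node_value = strategy @ action_values
    cum_regret += action_values - node_value   # cumulative regret update
    cum_strategy += strategy                   # cumulative strategy update

# --- Bootstrapped per-iteration target (illustrative only) ---
# Instead of tracking cumulative sums, a recursively defined target can be
# re-estimated each iteration by combining an immediate quantity with the
# current estimate at successor states; a function approximator would regress
# toward this target. Here a single scalar stands in for the network.
def bootstrap_target(immediate, successor_estimate, decay=0.99):
    """One-step bootstrapped target; `decay` is a hypothetical mixing weight."""
    return immediate + decay * successor_estimate

current_estimate = 0.0
for t in range(100):
    immediate = rng.normal()  # stands in for a per-iteration regret/value signal
    target = bootstrap_target(immediate, current_estimate)
    current_estimate += 0.1 * (target - current_estimate)  # regression step

print("average strategy:", cum_strategy / cum_strategy.sum())
print("bootstrapped estimate:", current_estimate)
```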
