Retrosynthesis with Attention-Based NMT Model and Chemical Analysis of the "Wrong" Predictions

08/02/2019
by Hongliang Duan, et al.

We cast retrosynthesis as a machine translation problem by introducing Tensor2Tensor, an entirely attention-based and fully data-driven model. Given a data set comprising about 50,000 diverse reactions extracted from USPTO patents, the model significantly outperforms the seq2seq model (34.7% top-1 accuracy) by achieving 54.1%. The effects of hyperparameters such as batch size and training time are thoroughly investigated to train the model. Additionally, we offer a novel insight into the causes of grammatically invalid SMILES, and conduct a test in which experienced chemists pick out and analyze the "wrong" predictions that may be chemically plausible but differ from the ground truth. In fact, the effectiveness of our model is underestimated, and the "true" top-1 accuracy can reach 64.6%.
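The abstract describes framing retrosynthesis as translation between SMILES strings. As an illustration only, the following is a minimal sketch of that framing: the product SMILES is treated as the source sentence and the reactant SMILES as the target sentence, with a regex-based tokenizer of the kind commonly used in the NMT-for-chemistry literature. The tokenizer pattern and the aspirin example are assumptions for illustration, not the authors' preprocessing pipeline or their Tensor2Tensor training code.

```python
import re

# Regex-based SMILES tokenizer (a pattern commonly used for reaction SMILES
# in NMT-for-chemistry work; an illustrative assumption, not the paper's code).
SMILES_TOKEN_PATTERN = re.compile(
    r"(\[[^\]]+]|Br?|Cl?|N|O|S|P|F|I|b|c|n|o|s|p|\(|\)|\.|=|#|-|\+|\\|\/|:"
    r"|~|@|\?|>|\*|\$|%[0-9]{2}|[0-9])"
)

def tokenize_smiles(smiles: str) -> list:
    """Split a SMILES string into tokens suitable for a seq2seq/Transformer model."""
    tokens = SMILES_TOKEN_PATTERN.findall(smiles)
    # Sanity check: tokenization must be lossless.
    assert "".join(tokens) == smiles, f"tokenization lost characters: {smiles}"
    return tokens

# Retrosynthesis framed as translation: product SMILES is the "source sentence",
# reactant SMILES the "target sentence" (hypothetical aspirin example).
product = "CC(=O)Oc1ccccc1C(=O)O"              # aspirin
reactants = "CC(=O)OC(C)=O.Oc1ccccc1C(=O)O"    # acetic anhydride + salicylic acid

print(" ".join(tokenize_smiles(product)))      # source tokens
print(" ".join(tokenize_smiles(reactants)))    # target tokens
```

Such token sequences are what an attention-based translation model consumes and emits; grammatically invalid SMILES arise when the decoded token sequence cannot be parsed back into a valid molecule.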

