OpenSeq2Seq: extensible toolkit for distributed and mixed precision training of sequence-to-sequence models

05/25/2018
by Oleksii Kuchaiev, et al.

We present OpenSeq2Seq -- an open-source toolkit for training sequence-to-sequence models. The main goal of our toolkit is to let researchers explore different sequence-to-sequence architectures as effectively as possible. This efficiency is achieved through full support for distributed and mixed-precision training. OpenSeq2Seq provides building blocks for training encoder-decoder models for neural machine translation and automatic speech recognition. We plan to extend it to other modalities in the future.
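The core idea behind mixed-precision training, which the abstract cites as the source of the toolkit's efficiency, is to compute in float16 while keeping a float32 master copy of the weights and applying loss scaling so that small gradients survive the float16 cast. The following is a minimal NumPy sketch of the loss-scaling step only; it is illustrative and not OpenSeq2Seq's actual (TensorFlow-based) implementation, and the constant `LOSS_SCALE` is a hypothetical choice.

```python
import numpy as np

# Illustrative sketch of loss scaling in mixed-precision training.
# A gradient smaller than float16's smallest subnormal (~6e-8)
# underflows to zero when cast to float16. Multiplying the loss
# (and hence every gradient) by a scale factor before the cast
# keeps it representable; the scale is divided out in float32
# before the weight update.

LOSS_SCALE = 1024.0                      # hypothetical scale factor

grad_fp32 = np.float32(1e-8)             # a tiny gradient value

naive = np.float16(grad_fp32)            # direct cast: underflows to 0.0

scaled = np.float16(grad_fp32 * LOSS_SCALE)   # 1.024e-5: representable in float16
recovered = np.float32(scaled) / LOSS_SCALE   # unscale in float32 for the update

print(float(naive))      # gradient lost to underflow
print(float(recovered))  # gradient preserved (up to float16 rounding)
```

The scale factor must be large enough to lift small gradients out of the underflow range but not so large that the scaled gradients overflow float16's maximum (65504); practical systems often adjust it dynamically.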
