Applying Reinforcement Learning (RL) to sequence generation models enabl...
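(As an aside, a minimal sketch of what "applying RL to sequence generation" typically looks like in practice: a REINFORCE-style sequence-level loss. All names below, such as reinforce_loss, are placeholders introduced for illustration and are not the paper's implementation.)

import torch

def reinforce_loss(token_logp, rewards, baseline=None):
    # token_logp: (batch, seq_len) log-probabilities of the sampled tokens
    # rewards:    (batch,) sequence-level rewards, e.g. sentence-level BLEU
    # baseline:   optional baseline subtracted from the reward to reduce variance
    advantage = rewards if baseline is None else rewards - baseline
    # Policy gradient: maximize E[advantage * log p(sampled sequence)],
    # i.e. minimize its negative.
    return -(advantage.detach().unsqueeze(1) * token_logp).sum(dim=1).mean()

# Toy usage with random tensors standing in for decoder outputs.
logits = torch.randn(4, 7, 100, requires_grad=True)   # (batch, len, vocab)
samples = torch.randint(0, 100, (4, 7))               # sampled token ids
log_probs = torch.log_softmax(logits, dim=-1)
token_logp = log_probs.gather(-1, samples.unsqueeze(-1)).squeeze(-1)
rewards = torch.rand(4)                                # stand-in for a reward such as BLEU
loss = reinforce_loss(token_logp, rewards, baseline=rewards.mean())
loss.backward()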
Two-Tower Vision-Language (VL) models have shown promising improvements ...
Using translation memories (TMs) as prompts is a promising approach to i...
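(For illustration only, a minimal sketch of the idea of using TM fuzzy matches as a prompt; build_tm_prompt and the example strings are hypothetical and not taken from the paper.)

def build_tm_prompt(source, tm_pairs):
    # tm_pairs: retrieved (source, target) fuzzy matches from a translation memory.
    # The matches serve as in-context examples, then the new source is appended.
    blocks = [f"Source: {s}\nTranslation: {t}" for s, t in tm_pairs]
    blocks.append(f"Source: {source}\nTranslation:")
    return "\n\n".join(blocks)

print(build_tm_prompt(
    "The contract ends on 31 December.",
    [("The contract ends on 30 June.", "Le contrat prend fin le 30 juin.")],
))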
Learning multiscale Transformer models has been shown to be a viable ap...
In this paper, we propose a novel architecture, the Enhanced Interactive...
Multiscale feature hierarchies have seen success in the co...
Previous work on multimodal machine translation (MMT) has focused on the...
As graph databases become widespread, JTC1 – the committee in joint char...
This paper describes the NiuTrans neural machine translation systems for the ...
This paper describes the submissions of the NiuTrans Team to the WNGT 20...
This paper describes the NiuTrans system for the WMT21 translation effic...
It has been found that residual networks are an Euler discretization of ...
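(For context, this is the standard correspondence between a residual block and one forward Euler step; the notation below is mine, and I assume the truncated sentence refers to ordinary differential equations.)

\[
  y_{l+1} = y_l + F(y_l, \theta_l)
  \quad\Longleftrightarrow\quad
  y(t + \Delta t) = y(t) + \Delta t \, F\bigl(y(t), \theta(t)\bigr), \qquad \Delta t = 1,
\]
which is one forward Euler step for the ODE $\frac{dy}{dt} = F(y(t), \theta(t))$.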
Recently, deep models have shown tremendous improvements in neural machi...
Deep encoders have been proven to be effective in improving neural machi...
Knowledge distillation has been proven to be effective in model accelera...
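(A minimal sketch of word-level knowledge distillation, for illustration: the student is trained toward the teacher's softened output distribution. The temperature and all names are my own choices, not the paper's.)

import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, temperature=2.0):
    # Both logits: (batch, seq_len, vocab). Returns a scalar KL-divergence loss.
    t = temperature
    student_logp = F.log_softmax(student_logits / t, dim=-1)
    teacher_prob = F.softmax(teacher_logits / t, dim=-1)
    # KL(teacher || student), scaled by t^2 as is conventional for distillation.
    return F.kl_div(student_logp, teacher_prob, reduction="batchmean") * (t * t)

# Toy usage: random logits standing in for teacher and student decoder outputs.
student = torch.randn(2, 5, 50, requires_grad=True)
teacher = torch.randn(2, 5, 50)
kd_loss(student, teacher).backward()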
In encoder-decoder neural models, multiple encoders are generally used ...
The Transformer is the state-of-the-art model in recent machine translation ...