Sequence-Level Training for Non-Autoregressive Neural Machine Translation
Shao, Chenze1; Feng, Yang1; Zhang, Jinchao2; Meng, Fandong2; Zhou, Jie2
2021-12-01
Journal: COMPUTATIONAL LINGUISTICS
ISSN: 0891-2017
Volume: 47, Issue: 4, Pages: 891-925
Abstract: In recent years, Neural Machine Translation (NMT) has achieved notable results in various translation tasks. However, the word-by-word generation manner determined by the autoregressive mechanism leads to high translation latency and restricts the low-latency applications of NMT. Non-Autoregressive Neural Machine Translation (NAT) removes the autoregressive mechanism and achieves significant decoding speedup by generating target words independently and simultaneously. Nevertheless, NAT still takes the word-level cross-entropy loss as the training objective, which is not optimal because the output of NAT cannot be properly evaluated due to the multimodality problem. In this article, we propose using sequence-level training objectives to train NAT models, which evaluate the NAT outputs as a whole and correlate well with the real translation quality. First, we propose training NAT models to optimize sequence-level evaluation metrics (e.g., BLEU) based on several novel reinforcement algorithms customized for NAT, which outperform the conventional method by reducing the variance of gradient estimation. Second, we introduce a novel training objective for NAT models, which aims to minimize the Bag-of-N-grams (BoN) difference between the model output and the reference sentence. The BoN training objective is differentiable and can be calculated efficiently without any approximation. Finally, we apply a three-stage training strategy to combine these two methods to train the NAT model. We validate our approach on four translation tasks (WMT14 En<->De, WMT16 En<->Ro), which shows that our approach largely outperforms NAT baselines and achieves remarkable performance on all translation tasks. The source code is available at https://github.com/ictnlp/Seq-NAT.
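The BoN objective described above lends itself to a short illustration. The Python sketch below is not the authors' implementation (the official code is in the linked Seq-NAT repository); it assumes a NAT model that outputs an independent distribution over the vocabulary at each target position, and it penalizes only the expected-count mismatch on n-grams that occur in the reference, a simplification of the full BoN difference. The function names expected_ngram_count and bon_l1_loss are hypothetical.

# Hedged sketch of the Bag-of-N-grams idea: because a NAT model predicts every
# target position independently, the expected count of an n-gram in the output
# factorizes into a product of per-position probabilities, so it can be computed
# without sampling or approximation.
from collections import Counter
import torch

def expected_ngram_count(probs: torch.Tensor, ngram: tuple) -> torch.Tensor:
    # probs: (T, V) per-position token distributions of a NAT model.
    # Returns the expected number of occurrences of `ngram` in the output.
    T, _ = probs.shape
    n = len(ngram)
    count = probs.new_zeros(())
    for t in range(T - n + 1):
        term = probs.new_ones(())
        for i, token in enumerate(ngram):
            term = term * probs[t + i, token]
        count = count + term
    return count

def bon_l1_loss(probs: torch.Tensor, reference: list, n: int = 2) -> torch.Tensor:
    # L1 gap between the model's expected n-gram counts and the reference counts,
    # restricted to n-grams of the reference (a simplification of the full distance).
    ref_bag = Counter(tuple(reference[i:i + n]) for i in range(len(reference) - n + 1))
    loss = probs.new_zeros(())
    for ngram, ref_count in ref_bag.items():
        loss = loss + (expected_ngram_count(probs, ngram) - ref_count).abs()
    return loss

Because the expected counts are exact under the independence assumption, the resulting loss is differentiable with respect to the per-position probabilities, which is the property the abstract highlights.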
DOI: 10.1162/COLI_a_00421
Indexed by: SCI
Language: English
WOS Research Areas: Computer Science ; Linguistics
WOS Categories: Computer Science, Artificial Intelligence ; Computer Science, Interdisciplinary Applications ; Linguistics ; Language & Linguistics
WOS Accession Number: WOS:000753228200006
Publisher: MIT PRESS
Citation Statistics
Times Cited (WOS): 7
Document Type: Journal article
Identifier: http://119.78.100.204/handle/2XEOYT63/19009
Collection: Institute of Computing Technology, Chinese Academy of Sciences - Journal Papers (English)
Corresponding Author: Shao, Chenze
Affiliations: 1. Chinese Acad Sci, Inst Comp Technol, Key Lab Intelligent Informat Proc, Beijing, Peoples R China
2. Tencent Inc, WeChat AI, Pattern Recognit Ctr, Shenzhen, Peoples R China
Recommended Citation
GB/T 7714
Shao, Chenze, Feng, Yang, Zhang, Jinchao, et al. Sequence-Level Training for Non-Autoregressive Neural Machine Translation[J]. COMPUTATIONAL LINGUISTICS, 2021, 47(4): 891-925.
APA: Shao, Chenze, Feng, Yang, Zhang, Jinchao, Meng, Fandong, & Zhou, Jie. (2021). Sequence-Level Training for Non-Autoregressive Neural Machine Translation. COMPUTATIONAL LINGUISTICS, 47(4), 891-925.
MLA: Shao, Chenze, et al. "Sequence-Level Training for Non-Autoregressive Neural Machine Translation". COMPUTATIONAL LINGUISTICS 47.4 (2021): 891-925.