Exploiting Pre-Ordering for Neural Machine Translation
Zhao, Yang; Zhang, Jiajun; Zong, Chengqing
2018
Conference Name: LREC
Conference Date: 2018-05
Conference Location: Japan
Abstract

Neural Machine Translation (NMT) has drawn much attention due to its promising translation performance in recent years. However, the under-translation and over-translation problems remain a significant challenge. Through error analysis, we find that under-translation is much more prevalent than over-translation and that source words which need to be reordered during translation are more likely to be ignored. To address the under-translation problem, we explore the pre-ordering approach for NMT. Specifically, we pre-order the source sentences to approximate the target language word order. We then combine the pre-ordering model with position embedding to enhance monotone translation. Finally, we augment our model with the coverage mechanism to tackle the over-translation problem. Experimental results on Chinese-to-English translation show that our method can significantly improve translation quality by up to 2.43 BLEU points. Furthermore, a detailed analysis demonstrates that our approach substantially reduces the number of under-translation cases, by 30.4% (compared to 17.4% using the coverage model).
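The pre-ordering idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the permutation below is a hypothetical stand-in for the output of a pre-ordering component, and the sketch only shows how a source sentence is reordered toward target-language word order while each token keeps its original position index, which could then feed a position-embedding table in the NMT encoder.

```python
# Minimal illustrative sketch (not the paper's implementation): apply a
# hypothetical pre-ordering permutation to a source sentence and keep each
# token's ORIGINAL position index for a position embedding.

from typing import List, Tuple

def preorder(tokens: List[str], permutation: List[int]) -> List[Tuple[str, int]]:
    """Reorder `tokens` into approximate target-language word order.

    `permutation[i]` is the original index of the token placed at position i
    after pre-ordering. Returns (token, original_position) pairs; the original
    position can index an additional position-embedding table.
    """
    assert sorted(permutation) == list(range(len(tokens)))
    return [(tokens[j], j) for j in permutation]

if __name__ == "__main__":
    # Toy Chinese-to-English example: "他 昨天 买了 一本 书"
    # ("He yesterday bought a book") pre-ordered toward English word order
    # "他 买了 一本 书 昨天" ("He bought a book yesterday").
    src = ["他", "昨天", "买了", "一本", "书"]
    perm = [0, 2, 3, 4, 1]  # hypothetical output of a pre-ordering model
    for tok, orig_pos in preorder(src, perm):
        print(tok, orig_pos)
```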

Document Type: Conference Paper
Identifier: http://ir.ia.ac.cn/handle/173211/23205
Collection: National Laboratory of Pattern Recognition_Natural Language Processing
Affiliation: Institute of Automation, Chinese Academy of Sciences
First Author Affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended Citation
GB/T 7714
Zhao, Yang, Zhang, Jiajun, Zong, Chengqing. Exploiting Pre-Ordering for Neural Machine Translation[C], 2018.
Files in This Item:
There are no files associated with this item.