Effectively training neural machine translation models with monolingual data
Yang, Zhen1,2; Chen, Wei2; Wang, Feng2; Xu, Bo2
Journal: NEUROCOMPUTING
ISSN: 0925-2312
Publication date: 2019-03-14
Volume: 333, Pages: 240-247
Corresponding author: Yang, Zhen (yangzhen2014@ia.ac.cn)
Abstract: Improving neural machine translation (NMT) models with monolingual data has attracted growing interest, and back-translation for monolingual data augmentation (Sennrich et al., 2016) has recently been regarded as a promising direction. While the naive back-translation approach improves translation performance substantially, we notice that its use of monolingual data is not fully effective, because traditional NMT models make no distinction between the true parallel corpus and the back-translated synthetic parallel corpus. This paper proposes a gate-enhanced NMT model that makes use of monolingual data more effectively. The central idea is to separate the data flows of monolingual and parallel data into different channels via an elegantly designed gate, which enables the model to perform different transformations according to the type of the input sequence, i.e., monolingual data or parallel data. Experiments on Chinese-English and English-German translation tasks show that our approach achieves substantial improvements over strong baselines and that the gate-enhanced NMT model can utilize source-side and target-side monolingual data at the same time. (C) 2018 Elsevier B.V. All rights reserved.
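The channel-separation idea in the abstract can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the gate form, the hidden size, and the `is_synthetic` flag marking back-translated inputs are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # illustrative hidden size

# Two transformation channels: one intended for true parallel data,
# one for back-translated synthetic (monolingual-derived) data.
W_parallel = rng.standard_normal((d, d))
W_synthetic = rng.standard_normal((d, d))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_transform(h, is_synthetic, w_gate, b_gate):
    """Blend the two channels with a scalar gate.

    In the spirit of the paper, the gate routes true parallel and
    back-translated inputs through different transformations; here
    it is conditioned on the hidden state plus a flag shifting the
    gate toward the synthetic channel (an assumed mechanism).
    """
    g = sigmoid(h @ w_gate + b_gate + (2.0 if is_synthetic else -2.0))
    return g * (h @ W_synthetic) + (1.0 - g) * (h @ W_parallel)

w_gate = rng.standard_normal(d)
h = rng.standard_normal(d)

out_parallel = gated_transform(h, False, w_gate, 0.0)
out_synthetic = gated_transform(h, True, w_gate, 0.0)
```

For the same hidden state, the two data types yield different outputs because the gate weights the two channels differently, which is the separation the abstract describes.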
Keywords: Neural machine translation; Monolingual data; Gate-enhanced; Source-side and target-side; Effectively
DOI: 10.1016/j.neucom.2018.12.032
Keywords (WOS): NETWORK
Indexed by: SCI
Language: English
Funding project: National Program on Key Basic Research Project of China (973 Program) [2013CB329302]
Funder: National Program on Key Basic Research Project of China (973 Program)
WOS research area: Computer Science
WOS category: Computer Science, Artificial Intelligence
WOS accession number: WOS:000456834100022
Publisher: ELSEVIER SCIENCE BV
Sub-direction classification: Natural language processing
Citation statistics: cited 10 times (WOS)
Document type: Journal article
Identifier: http://ir.ia.ac.cn/handle/173211/25314
Collection: Laboratory of Cognition and Decision for Complex Systems / Auditory Models and Cognitive Computing
Affiliations:
1. Univ Chinese Acad Sci, Beijing, Peoples R China
2. Chinese Acad Sci, Inst Automat, 95 ZhongGuanCun East Rd, Beijing 100190, Peoples R China
First author affiliation: Institute of Automation, Chinese Academy of Sciences
Corresponding author affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended citation:
GB/T 7714
Yang, Zhen, Chen, Wei, Wang, Feng, et al. Effectively training neural machine translation models with monolingual data[J]. NEUROCOMPUTING, 2019, 333: 240-247.
APA
Yang, Zhen, Chen, Wei, Wang, Feng, & Xu, Bo. (2019). Effectively training neural machine translation models with monolingual data. NEUROCOMPUTING, 333, 240-247.
MLA
Yang, Zhen, et al. "Effectively training neural machine translation models with monolingual data". NEUROCOMPUTING 333 (2019): 240-247.