Knowledge Commons of Institute of Automation, CAS
Title | Effectively training neural machine translation models with monolingual data
Authors | Yang, Zhen (1,2); Chen, Wei (2); Wang, Feng (2); Xu, Bo (2)
Journal | NEUROCOMPUTING
ISSN | 0925-2312 |
Publication Date | 2019-03-14
Volume | 333
Pages | 240-247
Corresponding Author | Yang, Zhen (yangzhen2014@ia.ac.cn)
Abstract | Improving neural machine translation (NMT) models with monolingual data has attracted increasing interest, and back-translation for monolingual data augmentation (Sennrich et al., 2016) has recently been taken as a promising direction. While the naive back-translation approach improves translation performance substantially, we notice that it does not use monolingual data as effectively as it could, because traditional NMT models make no distinction between the true parallel corpus and the back-translated synthetic parallel corpus. This paper proposes a gate-enhanced NMT model that makes use of monolingual data more effectively. The central idea is to separate the data flows of monolingual data and parallel data into different channels via an elegantly designed gate, which enables the model to perform different transformations according to the type of the input sequence, i.e., monolingual data or parallel data. Experiments on Chinese-English and English-German translation tasks show that our approach achieves substantial improvements over strong baselines and that the gate-enhanced NMT model can utilize source-side and target-side monolingual data at the same time. (C) 2018 Elsevier B.V. All rights reserved.
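The gating idea described in the abstract — routing true parallel data and back-translated synthetic data through different transformation channels — can be sketched as below. This is a minimal illustrative example under assumed names, not the authors' actual architecture: the function `gated_encoder_step` and the parameters `W_par`, `W_mono`, `w_g`, `b_g` are all hypothetical, and here the gate is simply conditioned on the input vector plus a data-type flag.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gated_encoder_step(x, data_type, params):
    """One illustrative gated transformation (hypothetical sketch).

    x         : (d,) input embedding
    data_type : 0 for true parallel data, 1 for back-translated synthetic data
    params    : dict of weights (names are assumptions, not from the paper)

    A scalar sigmoid gate, conditioned on the input and the data-type
    flag, mixes two candidate transformations so the two data channels
    can be treated differently.
    """
    g = sigmoid(params["w_g"] @ x + params["b_g"] * data_type)  # scalar gate in (0, 1)
    h_par = np.tanh(params["W_par"] @ x)    # channel for true parallel data
    h_mono = np.tanh(params["W_mono"] @ x)  # channel for synthetic (monolingual) data
    return g * h_mono + (1.0 - g) * h_par   # convex combination of the two channels

d = 4
params = {
    "w_g": rng.normal(size=d),
    "b_g": 1.0,
    "W_par": rng.normal(size=(d, d)),
    "W_mono": rng.normal(size=(d, d)),
}
x = rng.normal(size=d)
h = gated_encoder_step(x, data_type=1, params=params)
print(h.shape)  # (4,)
```

Because the output is a gate-weighted convex combination of two tanh activations, each component stays in (-1, 1); flipping `data_type` shifts the gate and hence the mixture, which is the mechanism the abstract credits with separating the two data flows.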
Keywords | Neural machine translation; Monolingual data; Gate-enhanced; Source-side and target-side; Effectively
DOI | 10.1016/j.neucom.2018.12.032 |
Keywords [WOS] | NETWORK
Indexed By | SCI
Language | English
Funding Project | National Program on Key Basic Research Project of China (973 Program) [2013CB329302]
Funder | National Program on Key Basic Research Project of China (973 Program)
WOS Research Area | Computer Science
WOS Subject | Computer Science, Artificial Intelligence
WOS Accession No. | WOS:000456834100022
Publisher | ELSEVIER SCIENCE BV
Sub-direction Classification | Natural Language Processing
Document Type | Journal Article
Identifier | http://ir.ia.ac.cn/handle/173211/25314
Collection | Laboratory of Cognition and Decision Making for Complex Systems: Auditory Models and Cognitive Computing
Author Affiliations | 1. Univ Chinese Acad Sci, Beijing, Peoples R China; 2. Chinese Acad Sci, Inst Automat, 95 ZhongGuanCun East Rd, Beijing 100190, Peoples R China
First Author Affiliation | Institute of Automation, Chinese Academy of Sciences
Corresponding Author Affiliation | Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714) | Yang, Zhen, Chen, Wei, Wang, Feng, et al. Effectively training neural machine translation models with monolingual data[J]. NEUROCOMPUTING, 2019, 333: 240-247.
APA | Yang, Zhen, Chen, Wei, Wang, Feng, & Xu, Bo. (2019). Effectively training neural machine translation models with monolingual data. NEUROCOMPUTING, 333, 240-247.
MLA | Yang, Zhen, et al. "Effectively training neural machine translation models with monolingual data." NEUROCOMPUTING 333 (2019): 240-247.
Files in This Item | There are no files associated with this item.