One Sentence One Model for Neural Machine Translation
Li, Xiaoqing; Zhang, Jiajun; Zong, Chengqing
2018
Conference Name: LREC
Conference Date: 2018-5
Conference Location: Japan
Abstract

Neural machine translation (NMT) has become the new state of the art, achieving promising translation performance with a simple encoder-decoder neural network. This network is trained once on the parallel corpus, and the resulting fixed network is then used to translate all test sentences. We argue that general, fixed network parameters cannot best fit each specific test sentence. In this paper, we propose dynamic NMT, which learns a general network as usual and then fine-tunes that network for each test sentence. The fine-tuning is done on a small set of bilingual training data obtained through similarity search based on the test sentence. Extensive experiments demonstrate that this method can significantly improve translation performance, especially when highly similar sentences are available.
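The two-step procedure described in the abstract (retrieve similar training pairs, then adapt the general model before translating) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a character-level edit-similarity metric (`difflib.SequenceMatcher`) for the retrieval step, and leaves the actual NMT fine-tuning as a caller-supplied function, since the paper's retrieval metric and training details may differ.

```python
import difflib


def retrieve_similar(test_src, corpus, k=2):
    """Return the k bilingual pairs whose source side is most similar to
    the test sentence. Similarity here is difflib's edit-based ratio; the
    paper's actual similarity-search method may differ."""
    return sorted(
        corpus,
        key=lambda pair: difflib.SequenceMatcher(None, test_src, pair[0]).ratio(),
        reverse=True,
    )[:k]


def translate_dynamic(test_src, corpus, general_model, fine_tune, k=2):
    """One sentence, one model: adapt a copy of the general model on the
    retrieved pairs, then translate this single test sentence with it.
    `general_model` and `fine_tune` are hypothetical placeholders for a
    real NMT system and its training step."""
    similar_pairs = retrieve_similar(test_src, corpus, k)
    adapted_model = fine_tune(general_model, similar_pairs)
    return adapted_model(test_src)
```

Note that each test sentence gets its own adapted copy of the parameters; the general network itself is never overwritten, so translations of different sentences do not interfere with one another.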

Language: English
Document Type: Conference Paper
Identifier: http://ir.ia.ac.cn/handle/173211/23204
Collection: National Laboratory of Pattern Recognition_Natural Language Processing
Affiliation: Institute of Automation, Chinese Academy of Sciences
First Author Affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended Citation
GB/T 7714
Li, Xiaoqing, Zhang, Jiajun, Zong, Chengqing. One Sentence One Model for Neural Machine Translation[C], 2018.
Files in This Item
There are no files associated with this item.
 

Unless otherwise specified, all content in this system is protected by copyright, with all rights reserved.