Exploiting Curriculum Learning in Unsupervised Neural Machine Translation
Lu JL(陆金梁)1,2; Zhang JJ(张家俊)1,2
2021-11
Conference Name: Findings of the Association for Computational Linguistics: EMNLP 2021
Conference Date: November 7–11, 2021
Conference Location: Online
Publisher: Association for Computational Linguistics
Rights Order: 1
Abstract

Back-translation (BT) has become one of the de facto components of unsupervised neural machine translation (UNMT), as it is what explicitly gives UNMT its translation ability. However, all pseudo bi-texts generated by BT are treated equally as clean data during optimization, without regard to their diverse quality, leading to slow convergence and limited translation performance. To address this problem, we propose a curriculum learning method that gradually exploits pseudo bi-texts based on their quality at multiple granularities. Specifically, we first apply cross-lingual word embeddings to estimate the potential translation difficulty (quality) of the monolingual sentences. The sentences are then fed into UNMT from easy to hard, batch by batch. Furthermore, since the quality of sentences/tokens within a particular batch is also diverse, we further use the model itself to compute fine-grained quality scores, which serve as learning factors to balance the contributions of different parts when computing the loss and encourage the UNMT model to focus on pseudo data of higher quality. Experimental results on the WMT 14 En-Fr, WMT 16 En-De, WMT 16 En-Ro, and LDC En-Zh translation tasks demonstrate that the proposed method achieves consistent improvements with faster convergence.
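The two granularities described in the abstract can be sketched in a few lines. The following is a minimal, hypothetical NumPy illustration (not the authors' released code): coarse-grained difficulty is approximated by the similarity of a sentence's word embeddings to their nearest neighbours in a cross-lingual target space, sentences are then ordered easy to hard, and a fine-grained loss weights each token's negative log-likelihood by the model's own confidence in that pseudo token. All function and variable names here are illustrative assumptions.

```python
import numpy as np

def difficulty_scores(src_sents, tgt_vocab_embs):
    """Coarse granularity: score each monolingual sentence by the mean cosine
    similarity of its word embeddings to their nearest neighbours in the
    cross-lingual target embedding space. Higher score = easier to translate.
    (A simplified proxy for the paper's embedding-based difficulty measure.)"""
    tv = tgt_vocab_embs / np.linalg.norm(tgt_vocab_embs, axis=1, keepdims=True)
    scores = []
    for sent in src_sents:                      # sent: (n_tokens, dim) array
        sv = sent / np.linalg.norm(sent, axis=1, keepdims=True)
        sims = sv @ tv.T                        # cosine sims to target vocab
        scores.append(float(sims.max(axis=1).mean()))
    return scores

def order_easy_to_hard(sentences, scores):
    """Curriculum ordering: feed sentences to UNMT from easy to hard."""
    return [s for s, _ in sorted(zip(sentences, scores),
                                 key=lambda p: -p[1])]

def weighted_token_nll(token_probs):
    """Fine granularity: weight each pseudo-target token's negative
    log-likelihood by the model's normalized confidence in that token, so
    low-quality parts of a pseudo bi-text contribute less to the loss."""
    p = np.asarray(token_probs, dtype=float)
    w = p / p.sum()                             # learning factors
    return float(-(w * np.log(p)).sum())
```

In practice the confidence-based weights would be computed on the fly from the UNMT model's output distribution at each BT step, rather than from a fixed probability vector as in this toy version.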

Indexed By: EI
Language: English
Representative Paper:
Seven Major Directions — Sub-direction: Natural Language Processing
State Key Laboratory Planned Direction: Speech and Language Processing
Associated Dataset to Deposit:
Document Type: Conference Paper
Identifier: http://ir.ia.ac.cn/handle/173211/57388
Collection: Zidong Taichu Large Model Research Center
Corresponding Author: Zhang JJ(张家俊)
Affiliations:
1. National Laboratory of Pattern Recognition, Institute of Automation, CAS
2. School of Artificial Intelligence, University of Chinese Academy of Sciences
First Author Affiliation: National Laboratory of Pattern Recognition
Corresponding Author Affiliation: National Laboratory of Pattern Recognition
Recommended Citation (GB/T 7714):
Lu JL, Zhang JJ. Exploiting Curriculum Learning in Unsupervised Neural Machine Translation[C]. Association for Computational Linguistics, 2021.
Files in This Item:
2021.findings-emnlp.79.pdf (866KB) | Document Type: Conference Paper | Open Access | License: CC BY-NC-SA
File Name: 2021.findings-emnlp.79.pdf
Format: Adobe PDF

Unless otherwise specified, all content in this system is protected by copyright, with all rights reserved.