Knowledge Commons of Institute of Automation, CAS
Attention Calibration for Transformer in Neural Machine Translation
Yu, Lu 1,2
2021-08
Conference Name | The 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021) |
Conference Date | 2021-08 |
Conference Venue | Online |
Abstract | Attention mechanisms have achieved substantial improvements in neural machine translation by dynamically selecting relevant inputs for different predictions. However, recent studies have questioned the attention mechanisms' capability for discovering decisive inputs. In this paper, we propose to calibrate the attention weights by introducing a mask perturbation model that automatically evaluates each input's contribution to the model outputs. We increase the attention weights assigned to the indispensable tokens, whose removal leads to a dramatic performance decrease. Extensive experiments on Transformer-based translation demonstrate the effectiveness of our model. We further find that the calibrated attention weights are more uniform at lower layers, collecting diverse information, while more concentrated on specific inputs at higher layers. Detailed analyses also show a great need for calibration in attention weights with high entropy, where the model is unconfident about its decision. (An illustrative code sketch of this calibration idea follows the record table below.) |
Keywords | Neural Machine Translation |
Subject Category | Engineering ; Engineering::Computer Science and Technology (degrees may be conferred in Engineering or Science) |
URL | View Full Text |
Indexed By | EI |
Language | English |
Sub-direction Classification (Seven Major Directions) | Natural Language Processing |
State Key Laboratory Planned Research Direction | Explainable Artificial Intelligence |
Associated Dataset Requiring Deposit | No |
Document Type | Conference Paper |
Identifier | http://ir.ia.ac.cn/handle/173211/51839 |
Collection | State Key Laboratory of Multimodal Artificial Intelligence Systems_Natural Language Processing |
Corresponding Author | Zhang, Jiajun |
Author Affiliations | 1. National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences; 2. School of Artificial Intelligence, University of Chinese Academy of Sciences; 3. Tencent Cloud Xiaowei |
First Author Affiliation | National Laboratory of Pattern Recognition |
Corresponding Author Affiliation | National Laboratory of Pattern Recognition |
Recommended Citation (GB/T 7714) | Yu Lu, Jiali Zeng, Jiajun Zhang, et al. Attention Calibration for Transformer in Neural Machine Translation[C], 2021. |
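
The abstract describes estimating each source token's contribution by masking it and observing how the model output degrades, then boosting the attention assigned to indispensable tokens. The snippet below is a minimal, hypothetical NumPy sketch of that leave-one-out idea; the names `calibrate_attention` and `score_fn` and the interpolation weight `alpha` are illustrative assumptions, not the authors' mask perturbation model from the paper.

```python
# Minimal, illustrative sketch of mask-perturbation attention calibration.
# All names here (calibrate_attention, score_fn, alpha) are hypothetical
# and do NOT reproduce the paper's implementation.
import numpy as np

def calibrate_attention(attn, inputs, score_fn, alpha=0.5):
    """Boost attention on tokens whose removal hurts the output score most.

    attn     : (T,) attention weights over T source tokens (sums to 1).
    inputs   : list of T source tokens.
    score_fn : callable(tokens) -> float, a proxy for output quality
               (e.g. log-probability of the reference under the model).
    alpha    : interpolation weight between the original attention and the
               contribution-based distribution.
    """
    base = score_fn(inputs)
    # Contribution of token i = drop in score when token i is masked out.
    drops = []
    for i in range(len(inputs)):
        masked = inputs[:i] + ["<mask>"] + inputs[i + 1:]
        drops.append(max(base - score_fn(masked), 0.0))
    drops = np.asarray(drops)

    if drops.sum() == 0.0:          # no token looks indispensable; keep attn
        return attn
    contrib = drops / drops.sum()   # normalized contribution distribution
    calibrated = (1 - alpha) * attn + alpha * contrib
    return calibrated / calibrated.sum()   # renormalize to a valid distribution


if __name__ == "__main__":
    # Toy example: the score rewards keeping "not", so masking it causes the
    # largest drop and its attention weight is increased after calibration.
    tokens = ["this", "is", "not", "good"]
    attn = np.array([0.25, 0.25, 0.25, 0.25])
    score = lambda toks: 1.0 if "not" in toks else 0.2
    print(calibrate_attention(attn, tokens, score))
```

Interpolating the contribution distribution with the original weights and renormalizing keeps the result a valid attention distribution; in the paper this contribution evaluation is performed by a dedicated mask perturbation model rather than the brute-force masking loop shown here.
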
Files in This Item | Download All Files |
File Name/Size | Document Type | Version Type | Open Access Type | License |
2021.acl-long.103.pd (749KB) | Conference Paper | | Open Access | CC BY-NC-SA |