Knowledge Commons of Institute of Automation, CAS
Hardware Acceleration of Fully Quantized BERT for Efficient Natural Language Processing
Liu, Zejian1,2; Li, Gang; Cheng, Jian
2021-02
Conference Name | Design, Automation and Test in Europe Conference and Exhibition
Proceedings Title | Proceedings of the 2021 Design, Automation and Test in Europe, DATE 2021
Pages | 513-516
Conference Date | 2021-02
Conference Location | Virtual, Online
Abstract | BERT is the most recent Transformer-based model that achieves state-of-the-art performance in various NLP tasks. In this paper, we investigate the hardware acceleration of BERT on FPGA for edge computing. To tackle the issue of huge computational complexity and memory footprint, we propose to fully quantize BERT (FQ-BERT), including weights, activations, softmax, layer normalization, and all the intermediate results. Experiments demonstrate that FQ-BERT can achieve 7.94× compression for weights with negligible performance loss. We then propose an accelerator tailored for the FQ-BERT and evaluate it on Xilinx ZCU102 and ZCU111 FPGAs. It can achieve a performance-per-watt of 3.18 fps/W, which is 28.91× and 12.72× higher than an Intel(R) Core(TM) i7-8700 CPU and an NVIDIA K80 GPU, respectively.
Discipline | Engineering :: Computer Science and Technology (degrees conferrable in Engineering or Science)
DOI | 10.23919/DATE51398.2021.9474043
URL | View Full Text
Indexed By | EI
Language | English
Representative Paper | Yes
Seven Research Directions (Sub-direction) | AI Chips and Intelligent Computing
State Key Laboratory Planning Direction | Other
Associated Dataset to Be Deposited | No
Citation Statistics |
Document Type | Conference Paper
Identifier | http://ir.ia.ac.cn/handle/173211/52034
Collection | Laboratory of Cognition and Decision-Making in Complex Systems: Efficient Intelligent Computing and Learning
Corresponding Author | Cheng, Jian
Author Affiliations | 1. National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences; 2. School of Future Technology, University of Chinese Academy of Sciences
First Author Affiliation | National Laboratory of Pattern Recognition
Corresponding Author Affiliation | National Laboratory of Pattern Recognition
Recommended Citation (GB/T 7714) | Liu, Zejian, Li, Gang, Cheng, Jian. Hardware Acceleration of Fully Quantized BERT for Efficient Natural Language Processing[C], 2021: 513-516.
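The abstract describes fully quantizing BERT's weights, activations, softmax, layer normalization, and intermediate results. As a rough illustration of the underlying idea, below is a minimal sketch of symmetric uniform quantization in NumPy; the 4-bit width, per-tensor scale, and rounding scheme here are assumptions for illustration only, not the FQ-BERT design from the paper.

```python
import numpy as np

def quantize(x: np.ndarray, num_bits: int = 8):
    """Symmetric uniform quantization: map float values onto signed integers."""
    qmax = 2 ** (num_bits - 1) - 1                 # e.g. 7 for 4-bit, 127 for 8-bit
    scale = max(np.abs(x).max() / qmax, 1e-8)      # per-tensor scale (an assumption)
    q = np.clip(np.round(x / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Map integers back to approximate float values."""
    return q.astype(np.float32) * scale

# A BERT-base-sized weight matrix (768x768). Storing 4-bit integers plus one
# scale instead of FP32 gives roughly a 32/4 = 8x size reduction, the same
# ballpark as the 7.94x weight compression reported in the abstract.
rng = np.random.default_rng(0)
w = rng.standard_normal((768, 768)).astype(np.float32)
q, s = quantize(w, num_bits=4)
w_hat = dequantize(q, s)
print("mean abs error:", float(np.abs(w - w_hat).mean()))
```

Applying such a scheme to every tensor in the network, including the normally floating-point softmax and layer normalization, is what lets an accelerator operate entirely on low-bit integer arithmetic.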
Files in This Item |
File Name/Size | Document Type | Version | Access | License
Hardware_Acceleratio(593KB) | Conference Paper | | Open Access | CC BY-NC-SA
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.