CASIA OpenIR
(Results based on the user's claimed works)

Browse/Search results: 2 items in total, showing 1-2

Explanation Guided Knowledge Distillation for Pre-trained Language Model Compression (Journal Article)
ACM Transactions on Asian and Low-Resource Language Information Processing, 2024, Volume: 23, Issue: 2, Pages: 1-19
Authors: Zhao Yang; Yuanzhe Zhang; Dianbo Sui; Yiming Ju; Jun Zhao; Kang Liu
Adobe PDF (1250 KB)  |  Views/Downloads: 60/21  |  Submitted: 2024/05/30
Keywords: Explanation; knowledge distillation; model compression
Unsupervised Dialogue State Tracking for End-to-End Task-Oriented Dialogue with a Multi-Span Prediction Network (Journal Article)
JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2023, Volume: 38, Issue: 4, Pages: 834-852
Authors: Liu, Qing-Bin; He, Shi-Zhu; Liu, Cao; Liu, Kang; Zhao, Jun
Views/Downloads: 63/0  |  Submitted: 2024/02/22
Keywords: end-to-end task-oriented dialogue; dialogue state tracking (DST); unsupervised learning; reinforcement learning