CASIA OpenIR

Browse/Search Results: 3 records found, displaying 1-3

Explanation Guided Knowledge Distillation for Pre-trained Language Model Compression [Journal Article]
ACM Transactions on Asian and Low-Resource Language Information Processing, 2024, Volume: 23, Issue: 2, Pages: 1-19
Authors: Zhao Yang; Yuanzhe Zhang; Dianbo Sui; Yiming Ju; Jun Zhao; Kang Liu
Adobe PDF (1250 KB)  |  Views/Downloads: 51/18  |  Submitted: 2024/05/30
Keywords: Explanation; Knowledge distillation; Model compression
Cross-Architecture Knowledge Distillation [Conference Paper]
INTERNATIONAL JOURNAL OF COMPUTER VISION, Macau SAR, China, 2022.12.4-2022.12.8
Authors: Yufan Liu; Jiajiong Cao; Bing Li; Weiming Hu; Jingting Ding; Liang Li
Adobe PDF (1020 KB)  |  Views/Downloads: 174/50  |  Submitted: 2023/04/23
Keywords: Knowledge distillation; Cross architecture; Model compression; Deep learning
CADN: A weakly supervised learning-based category-aware object detection network for surface defect detection [Journal Article]
Pattern Recognition, 2020, Volume: 109, Issue: 0, Pages: 10
Authors: Zou W (邹伟)
Adobe PDF (1839 KB)  |  Views/Downloads: 178/52  |  Submitted: 2020/10/22
Keywords: Weakly supervised learning; Automated surface inspection; Defect detection; Knowledge distillation