CASIA OpenIR

Browse/Search Results: 1-10 of 22

Target-Embedding Autoencoder With Knowledge Distillation for Multi-Label Classification (Journal article)
IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2024, Pages: 12
Authors:  Ma, Ying;  Zou, Xiaoyan;  Pan, Qizheng;  Yan, Ming;  Li, Guoqi
View/Download:6/0  |  Submit date:2024/07/03
Multi-label classification  knowledge distillation  autoencoder  label embedding  
Explanation Guided Knowledge Distillation for Pre-trained Language Model Compression (Journal article)
ACM Transactions on Asian and Low-Resource Language Information Processing, 2024, Volume: 23, Issue: 2, Pages: 1-19
Authors:  Zhao Yang;  Yuanzhe Zhang;  Dianbo Sui;  Yiming Ju;  Jun Zhao;  Kang Liu
Adobe PDF(1250Kb)  |  View/Download:48/18  |  Submit date:2024/05/30
Explanation  knowledge distillation  model compression  
Reward Estimation with Scheduled Knowledge Distillation for Dialogue Policy Learning (Journal article)
Connection Science, 2023, Volume: 35, Issue: 1, Pages: 2174078
Authors:  Qiu JY(邱俊彦);  Haidong Zhang;  Yiping Yang
Adobe PDF(831Kb)  |  View/Download:50/18  |  Submit date:2024/05/29
reinforcement learning  dialogue policy learning  curriculum learning  knowledge distillation  
A Novel Tensor Decomposition-Based Efficient Detector for Low-Altitude Aerial Objects With Knowledge Distillation Scheme (Journal article)
IEEE/CAA Journal of Automatica Sinica, 2024, Volume: 11, Issue: 2, Pages: 487-501
Authors:  Nianyin Zeng;  Xinyu Li;  Peishu Wu;  Han Li;  Xin Luo
Adobe PDF(12478Kb)  |  View/Download:93/23  |  Submit date:2024/01/23
Attention mechanism  knowledge distillation (KD)  object detection  tensor decomposition (TD)  unmanned aerial vehicles (UAVs)  
Balanced knowledge distillation for long-tailed learning (Journal article)
NEUROCOMPUTING, 2023, Volume: 527, Pages: 36-46
Authors:  Zhang, Shaoyu;  Chen, Chen;  Hu, Xiyuan;  Peng, Silong
Adobe PDF(2184Kb)  |  View/Download:171/11  |  Submit date:2023/11/17
Long-tailed learning  Knowledge distillation  Vision and text classification  
Attention Weighted Local Descriptors (Journal article)
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, Volume: 45, Issue: 9, Pages: 10632-10649
Authors:  Wang, Changwei;  Xu, Rongtao;  Lu, Ke;  Xu, Shibiao;  Meng, Weiliang;  Zhang, Yuyang;  Fan, Bin;  Zhang, Xiaopeng
Adobe PDF(8075Kb)  |  View/Download:171/8  |  Submit date:2023/11/17
Local features detection and description  consistent attention mechanism  context augmentation  lightweight local descriptors  knowledge distillation  
A Closer Look at Self-Supervised Lightweight Vision Transformers (Conference paper)
, Honolulu, Hawaii, USA, 2023-7
Authors:  Wang, Shaoru;  Gao, Jin;  Li, Zeming;  Zhang, Xiaoqin;  Hu, Weiming
Adobe PDF(3478Kb)  |  View/Download:256/74  |  Submit date:2023/09/20
Vision Transformer  Self-supervised Learning  Lightweight Networks  Knowledge Distillation  
PKD: General Distillation Framework for Object Detectors via Pearson Correlation Coefficient (Conference paper)
, New Orleans, USA, November 28 through December 9
Authors:  Cao, Weihan;  Zhang, Yifan;  Gao, Jianfei;  Cheng, Anda;  Cheng, Ke;  Cheng, Jian
Adobe PDF(2614Kb)  |  View/Download:122/34  |  Submit date:2023/06/21
Knowledge Distillation  Model Compression  Object Detection  
Recovering Generalization via Pre-training-like Knowledge Distillation for Out-of-Distribution Visual Question Answering (Journal article)
IEEE Transactions on Multimedia, 2023, Volume: 26, Pages: 1-15
Authors:  Song, Yaguang;  Yang, Xiaoshan;  Wang, Yaowei;  Xu, Changsheng
Adobe PDF(2397Kb)  |  View/Download:199/50  |  Submit date:2023/06/12
Multi-modal Foundation Model  Out-of-Distribution Generalization  Visual Question Answering  Knowledge Distillation  
Cross-Architecture Knowledge Distillation (Conference paper)
INTERNATIONAL JOURNAL OF COMPUTER VISION, Macau SAR, China, 2022.12.4-2022.12.8
Authors:  Yufan Liu;  Jiajiong Cao;  Bing Li;  Weiming Hu;  Jingting Ding;  Liang Li
Adobe PDF(1020Kb)  |  View/Download:173/50  |  Submit date:2023/04/23
Knowledge distillation  Cross architecture  Model compression  Deep learning