CASIA OpenIR

Browse/Search Results: 20 items total, showing 1–10

Multi-granularity Distillation Scheme Towards Lightweight Semi-supervised Semantic Segmentation [Conference Paper]
Tel Aviv, Israel, October 23–27
Authors: Jie Qin; Jie Wu; Ming Li; Xuefeng Xiao; Min Zheng; Xingang Wang
Adobe PDF (4167 KB)  |  Views/Downloads: 26/10  |  Submitted: 2024/06/04
CKDF: Cascaded Knowledge Distillation Framework for Robust Incremental Learning [Journal Article]
IEEE TRANSACTIONS ON IMAGE PROCESSING, 2022, Vol. 31, pp. 3825–3837
Authors: Li KC (李焜炽); Wan J (万军); Yu S (余山)
Adobe PDF (3813 KB)  |  Views/Downloads: 64/13  |  Submitted: 2024/05/28
Rethinking Confidence Calibration for Failure Prediction [Conference Paper]
Virtual, October 23–27, 2022
Authors: Fei Zhu; Zhen Cheng; Xu-Yao Zhang; Cheng-Lin Liu
Adobe PDF (10583 KB)  |  Views/Downloads: 318/209  |  Submitted: 2023/09/12
Second-Order Global Attention Networks for Graph Classification and Regression [Conference Paper]
Beijing, China, August 27–28, 2022
Authors: Hu Fenyu; Cui Zeyu; Wu Shu; Liu Qiang; Wu Jinlin; Wang Liang; Tan Tieniu
Adobe PDF (69424 KB)  |  Views/Downloads: 224/73  |  Submitted: 2023/07/06
PKD: General Distillation Framework for Object Detectors via Pearson Correlation Coefficient [Conference Paper]
New Orleans, USA, November 28 – December 9
Authors: Weihan Cao; Yifan Zhang; Jianfei Gao; Anda Cheng; Ke Cheng; Jian Cheng
Adobe PDF (2614 KB)  |  Views/Downloads: 123/34  |  Submitted: 2023/06/21
Keywords: Knowledge Distillation; Model Compression; Object Detection
Improving Extreme Low-bit Quantization with Soft Threshold [Journal Article]
IEEE Transactions on Circuits and Systems for Video Technology, 2022, pp. 1549–1563
Authors: Xu WX (许伟翔); Wang PS (王培松); Cheng J (程健)
Adobe PDF (2414 KB)  |  Views/Downloads: 84/31  |  Submitted: 2023/06/20
Research on Uncertainty in Neural Machine Translation Based on Interpretability Analysis [Dissertation]
2022
Author: 卢宇
Adobe PDF (3886 KB)  |  Views/Downloads: 135/12  |  Submitted: 2023/06/02
Keywords: Neural Machine Translation; Data Uncertainty; Attention Mechanism Uncertainty; Prediction Uncertainty
Contrastive Knowledge Transfer for Deepfake Detection with Limited Data [Conference Paper]
Montreal, QC, Canada, 2022.08.21–2022.08.25
Authors: Li, Dongze; Zhuo, Wenqi; Wang, Wei; Dong, Jing
Adobe PDF (1186 KB)  |  Views/Downloads: 204/52  |  Submitted: 2023/05/31
Learning to Explore Distillability and Sparsability: A Joint Framework for Model Compression [Journal Article]
IEEE Transactions on Pattern Analysis and Machine Intelligence (T-PAMI), 2022, Vol. 45, No. 3, pp. 3378–3395
Authors: Yufan Liu; Jiajiong Cao; Bing Li; Weiming Hu; Stephen Maybank
Adobe PDF (3314 KB)  |  Views/Downloads: 173/46  |  Submitted: 2023/04/24
Cross-Architecture Knowledge Distillation [Conference Paper]
INTERNATIONAL JOURNAL OF COMPUTER VISION, Macau SAR, China, 2022.12.4–2022.12.8
Authors: Yufan Liu; Jiajiong Cao; Bing Li; Weiming Hu; Jingting Ding; Liang Li
Adobe PDF (1020 KB)  |  Views/Downloads: 174/50  |  Submitted: 2023/04/23
Keywords: Knowledge Distillation; Cross Architecture; Model Compression; Deep Learning