CASIA OpenIR
(These results are based on the user's claimed works)

Browse/Search Results: 6 items in total, showing items 1-6

Model Pruning Based on Dynamic Sparsity and Feature Learning Enhancement (基于动态稀疏和特征学习增强的模型剪枝)  Journal article
中国科学:技术科学 (Scientia Sinica Technologica), 2022, Volume: 52, Issue: 5, Pages: 667-681
Authors:  Ruan, Xiaofeng;  Hu, Weiming;  Liu, Yufan;  Li, Bing
Adobe PDF (2674 KB)  |  Views/Downloads: 5/0  |  Submitted: 2024/06/05
Learning from the Raw Domain: Cross Modality Distillation for Compressed Video Action Recognition  Conference paper
, Rhodes, Greece, 2023.6
作者:  Yufan Liu;  Jiajiong Cao;  Weiming Bai;  Bing Li;  Weiming Hu
Adobe PDF (411 KB)  |  Views/Downloads: 306/97  |  Submitted: 2023/05/06
Learning to Explore Distillability and Sparsability: A Joint Framework for Model Compression  Journal article
IEEE Transactions on Pattern Analysis and Machine Intelligence (T-PAMI), 2022, Volume: 45, Issue: 3, Pages: 3378-3395
作者:  Yufan Liu;  Jiajiong Cao;  Bing Li;  Weiming Hu;  Stephen Maybank
Adobe PDF (3314 KB)  |  Views/Downloads: 147/40  |  Submitted: 2023/04/24
Cross-Architecture Knowledge Distillation  Conference paper
INTERNATIONAL JOURNAL OF COMPUTER VISION, Macau SAR, China, 2022.12.4-2022.12.8
作者:  Yufan Liu;  Jiajiong Cao;  Bing Li;  Weiming Hu;  Jingting Ding;  Liang Li
Adobe PDF (1020 KB)  |  Views/Downloads: 144/43  |  Submitted: 2023/04/23
Keywords: Knowledge distillation;  Cross architecture;  Model compression;  Deep learning
EDP: An Efficient Decomposition and Pruning Scheme for Convolutional Neural Network Compression  Journal article
IEEE Transactions on Neural Networks and Learning Systems, 2021, Volume: 32, Issue: 10, Pages: 4499-4513
作者:  Ruan, Xiaofeng;  Liu, Yufan;  Yuan, Chunfeng;  Li, Bing;  Hu, Weiming;  Li, Yangxi;  Maybank, Stephen
Adobe PDF (3625 KB)  |  Views/Downloads: 322/43  |  Submitted: 2021/06/17
Keywords: Data-driven;  Low-rank decomposition;  Model compression and acceleration;  Structured pruning
DPFPS: Dynamic and Progressive Filter Pruning for Compressing Convolutional Neural Networks from Scratch  Conference paper
, virtual conference, 2021.2.2-2021.2.9
作者:  Ruan, Xiaofeng;  Liu, Yufan;  Li, Bing;  Yuan, Chunfeng;  Hu, Weiming
Adobe PDF (652 KB)  |  Views/Downloads: 279/56  |  Submitted: 2021/06/17