Learning Compression from Limited Unlabeled Data
He, Xiangyu [1,2]; Cheng, Jian [1,2,3]
2018
Conference: European Conference on Computer Vision (ECCV)
Conference Dates: September 8 – 14, 2018
Conference Location: Munich, Germany
Abstract

Convolutional neural networks (CNNs) have dramatically advanced the state of the art in a number of domains. However, most models are both computation and memory intensive, which has spurred interest in network compression. While existing compression methods achieve good performance, they suffer from three limitations: 1) the inevitable retraining on enormous amounts of labeled data; 2) the massive GPU hours required for retraining; 3) the training tricks needed for model compression. In particular, the requirement of retraining on the original dataset makes these methods difficult to apply in many real-world scenarios where the training data are not publicly available. In this paper, we show that re-normalization is a practical and effective way to alleviate these limitations. Through quantization or pruning, most methods compress a large number of parameters but overlook a core cause of the performance degradation: the Gaussian conjugate prior induced by batch normalization. By employing re-estimated statistics in batch normalization, we significantly improve the accuracy of compressed CNNs. Extensive experiments on ImageNet show that our approach outperforms the baselines by a large margin and is comparable to label-based methods. Moreover, the fine-tuning process takes less than 5 minutes on a CPU, using only 1000 unlabeled images.
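In practice, the "re-normalization" described in the abstract amounts to recomputing the batch-normalization running mean and variance of the compressed network by forwarding a small set of unlabeled images, with no labels and no gradient updates. The following is a minimal PyTorch sketch of that idea; the renormalize helper, its arguments, and the unlabeled_loader are illustrative assumptions, not the authors' released code.

import torch
import torch.nn as nn

def renormalize(model: nn.Module, unlabeled_loader, device="cpu"):
    """Re-estimate BatchNorm running statistics of a compressed model
    using a small set of unlabeled images (hypothetical helper)."""
    # Reset the stale statistics inherited from the uncompressed model.
    for m in model.modules():
        if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d)):
            m.reset_running_stats()
            m.momentum = None  # cumulative moving average over all batches

    model.to(device)
    model.train()             # train mode so BN layers update running mean/var
    with torch.no_grad():     # no labels and no weight updates are needed
        for batch in unlabeled_loader:
            images = batch[0] if isinstance(batch, (list, tuple)) else batch
            model(images.to(device))
    model.eval()
    return model

Because this is a forward-only pass over roughly 1000 images, it requires neither labels nor backpropagation, which is consistent with the sub-5-minute CPU fine-tuning cost reported in the abstract.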

Indexed By: EI
Language: English
Document Type: Conference Paper
Identifier: http://ir.ia.ac.cn/handle/173211/48584
Collection: Laboratory of Cognition and Decision Intelligence for Complex Systems - Efficient Intelligent Computing and Learning
Corresponding Author: Cheng, Jian
Affiliations:
1. National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences
2. University of Chinese Academy of Sciences
3. Center for Excellence in Brain Science and Intelligence Technology
First Author Affiliation: National Laboratory of Pattern Recognition
Corresponding Author Affiliation: National Laboratory of Pattern Recognition
Recommended Citation (GB/T 7714):
He, Xiangyu, Cheng, Jian. Learning Compression from Limited Unlabeled Data[C], 2018.
Files in This Item:
File Name / Size | Document Type | Open Access Type | License
Xiangyu_He_Learning_ (504 KB) | Conference Paper | Open Access | CC BY-NC-SA
File Name: Xiangyu_He_Learning_Compression_from_ECCV_2018_paper.pdf
Format: Adobe PDF

Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.