CASIA OpenIR

Browse/Search Results: 2 items total, showing items 1-2

Deep Neural Network Self-Distillation Exploiting Data Representation Invariance (journal article)
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, Volume: 33, Issue: 1, Pages: 257-269
Authors: Xu, Ting-Bing; Liu, Cheng-Lin
Views/Downloads: 178/0  |  Submitted: 2022/02/16
Keywords: Training; Nonlinear distortion; Data models; Neural networks; Knowledge engineering; Network architecture; Generalization error; network compression; representation invariance; self-distillation (SD)
Dynamical Channel Pruning by Conditional Accuracy Change for Deep Neural Networks (journal article)
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, Volume: n/a, Issue: n/a, Pages: n/a
Authors: Chen, Zhiqiang; Xu, Ting-Bing; Du, Changde; Liu, Cheng-Lin; He, Huiguang
Adobe PDF (4352 KB)  |  Views/Downloads: 267/62  |  Submitted: 2021/01/27
Keywords: Conditional accuracy change (CAC); direct criterion; dynamical channel pruning; neural network compression; structure shaping