CASIA OpenIR

Browse/Search Results: 5 items total, showing 1-5

Robust Multitask Learning With Sample Gradient Similarity (Journal Article)
IEEE TRANSACTIONS ON SYSTEMS MAN CYBERNETICS-SYSTEMS, 2023, Pages: 10
Authors: Peng, Xinyu; Chang, Cheng; Wang, Fei-Yue; Li, Li
Views/Downloads: 71/0  |  Submitted: 2023/12/21
Keywords: Deep learning; Automation; multitask learning; sample gradient; sample reweighting; task reweighting
MixGradient: A gradient-based re-weighting scheme with mixup for imbalanced data streams (Journal Article)
NEURAL NETWORKS, 2023, Volume: 161, Pages: 525-534
Authors: Peng, Xinyu; Wang, Fei-Yue; Li, Li
Views/Downloads: 46/0  |  Submitted: 2023/11/17
Keywords: Deep learning; Imbalanced data streams; Sample gradient; Typical samples; Mixup
Towards Better Generalization of Deep Neural Networks via Non-Typicality Sampling Scheme (Journal Article)
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, Pages: 11
Authors: Peng, Xinyu; Wang, Fei-Yue; Li, Li
Views/Downloads: 163/0  |  Submitted: 2022/06/06
Keywords: Training; Estimation; Deep learning; Standards; Optimization; Noise measurement; Convergence; generalization performance; nontypicality sampling scheme; stochastic gradient descent (SGD)
Drill the Cork of Information Bottleneck by Inputting the Most Important Data (Journal Article)
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2021, Pages: 13
Authors: Peng, Xinyu; Zhang, Jiawei; Wang, Fei-Yue; Li, Li
Views/Downloads: 187/0  |  Submitted: 2022/01/27
Keywords: Training; Signal to noise ratio; Mutual information; Optimization; Convergence; Deep learning; Tools; Information bottleneck (IB) theory; machine learning; minibatch stochastic gradient descent (SGD); typicality sampling
Accelerating Minibatch Stochastic Gradient Descent Using Typicality Sampling (Journal Article)
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, Volume: 31, Issue: 11, Pages: 4649-4659
Authors: Peng, Xinyu; Li, Li; Wang, Fei-Yue
Views/Downloads: 222/0  |  Submitted: 2021/01/06
Keywords: Training; Convergence; Approximation algorithms; Stochastic processes; Estimation; Optimization; Acceleration; Batch selection; machine learning; minibatch stochastic gradient descent (SGD); speed of convergence