CASIA OpenIR

Browse/Search Results: 2 items total, showing items 1-2

Accelerating Minibatch Stochastic Gradient Descent Using Typicality Sampling [Journal Article]
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, Volume: 31, Issue: 11, Pages: 4649-4659
Authors: Peng, Xinyu; Li, Li; Wang, Fei-Yue
Views/Downloads: 206/0  |  Submitted: 2021/01/06
Keywords: Training; Convergence; Approximation algorithms; Stochastic processes; Estimation; Optimization; Acceleration; Batch selection; machine learning; minibatch stochastic gradient descent (SGD); speed of convergence
Stability-Based Generalization Analysis of Distributed Learning Algorithms for Big Data [Journal Article]
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, Volume: 31, Issue: 3, Pages: 801-812
Authors: Wu, Xinxing; Zhang, Junping; Wang, Fei-Yue
Views/Downloads: 188/0  |  Submitted: 2020/06/02
Keywords: Big data; distributed learning algorithms; distributed simulations; generalization