CASIA OpenIR
The Strength of Nesterov's Extrapolation in the Individual Convergence of Nonsmooth Optimization
Tao, Wei1; Pan, Zhisong1; Wu, Gaowei2; Tao, Qing2,3
Source Publication: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
ISSN: 2162-237X
Publication Date: 2020-07-01
Volume: 31  Issue: 7  Pages: 2557-2568
Corresponding Author: Tao, Qing (taoqing@gmail.com)
Abstract: The extrapolation strategy introduced by Nesterov, which can accelerate the convergence rate of gradient descent methods by orders of magnitude on smooth convex objectives, has led to tremendous success in training machine learning models. In this article, the convergence of the individual iterates of projected subgradient (PSG) methods for nonsmooth convex optimization problems is theoretically studied on the basis of Nesterov's extrapolation, which we name individual convergence. We prove that Nesterov's extrapolation has the strength to make the individual convergence of PSG optimal for nonsmooth problems. In light of this, a direct modification of the subgradient evaluation suffices to achieve optimal individual convergence for strongly convex problems, which can be regarded as an interesting step toward the open question about stochastic gradient descent (SGD) posed by Shamir. Furthermore, we extend the derived algorithms to solve regularized learning tasks with nonsmooth losses in stochastic settings. Compared with other state-of-the-art nonsmooth methods, the derived algorithms can serve as an alternative to basic SGD, especially for machine learning problems in which an individual output is needed to guarantee the regularization structure while keeping an optimal rate of convergence. Typically, our method is applicable as an efficient tool for solving large-scale l1-regularized hinge-loss learning problems. Several comparison experiments demonstrate that our individual output not only achieves an optimal convergence rate but also guarantees better sparsity than the averaged solution.
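The abstract describes adding Nesterov's extrapolation step to a subgradient method and applying it to l1-regularized hinge-loss learning. The following is a minimal illustrative sketch of that general idea, not the paper's exact algorithm: the synthetic data, the 1/sqrt(t) step size, and the (t-1)/(t+2) momentum coefficient are all assumptions made for the example.

```python
import numpy as np

# Sketch (not the paper's exact scheme): subgradient descent with a
# Nesterov-style extrapolation step on an l1-regularized hinge loss,
#   min_w  (1/n) sum_i max(0, 1 - y_i <x_i, w>) + lam * ||w||_1
rng = np.random.default_rng(0)
n, d = 200, 20
X = rng.normal(size=(n, d))
w_true = np.zeros(d)
w_true[:5] = 1.0                                # sparse ground truth
y = np.sign(X @ w_true + 0.1 * rng.normal(size=n))
lam = 0.01

def obj(w):
    return np.mean(np.maximum(0.0, 1.0 - y * (X @ w))) + lam * np.abs(w).sum()

def subgrad(w):
    margins = y * (X @ w)
    active = margins < 1.0                      # samples where the hinge is active
    g_hinge = -(X[active].T @ y[active]) / n    # subgradient of the averaged hinge loss
    return g_hinge + lam * np.sign(w)           # plus a subgradient of lam*||w||_1

w = np.zeros(d)
w_prev = np.zeros(d)
for t in range(1, 2001):
    beta = (t - 1) / (t + 2)                    # assumed momentum coefficient
    v = w + beta * (w - w_prev)                 # Nesterov-style extrapolation
    w_prev = w
    w = v - subgrad(v) / np.sqrt(t)             # subgradient step at the extrapolated point

print("final objective:", obj(w))
```

The iterate returned here is the individual (last) iterate rather than an average of the trajectory, which is exactly the output the abstract argues preserves sparsity under l1 regularization.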
Keyword: Convergence; Extrapolation; Optimization; Acceleration; Machine learning; Task analysis; Machine learning algorithms; Individual convergence; machine learning; Nesterov's extrapolation; nonsmooth optimization; sparsity
DOI: 10.1109/TNNLS.2019.2933452
Indexed By: SCI
Language: English
Funding Project: NSFC [61673394]; National Key Research and Development Program of China [2016QY03D0501]
Funding Organization: NSFC; National Key Research and Development Program of China
WOS Research Area: Computer Science; Engineering
WOS Subject: Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods; Engineering, Electrical & Electronic
WOS ID: WOS:000546986600027
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Document Type: Journal article
Identifier: http://ir.ia.ac.cn/handle/173211/40051
Collection: Institute of Automation, Chinese Academy of Sciences
Affiliation:
1. Army Engn Univ PLA, Command & Control Engn Coll, Nanjing 210007, Peoples R China
2. Chinese Acad Sci, Inst Automat, Beijing, Peoples R China
3. Army Acad Artillery & Air Def, Hefei 230031, Peoples R China
Corresponding Author Affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended Citation
GB/T 7714: Tao, Wei, Pan, Zhisong, Wu, Gaowei, et al. The Strength of Nesterov's Extrapolation in the Individual Convergence of Nonsmooth Optimization[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31(7): 2557-2568.
APA: Tao, Wei, Pan, Zhisong, Wu, Gaowei, & Tao, Qing. (2020). The Strength of Nesterov's Extrapolation in the Individual Convergence of Nonsmooth Optimization. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 31(7), 2557-2568.
MLA: Tao, Wei, et al. "The Strength of Nesterov's Extrapolation in the Individual Convergence of Nonsmooth Optimization." IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 31.7 (2020): 2557-2568.
Files in This Item:
There are no files associated with this item.
Google Scholar
Similar articles in Google Scholar
[Tao, Wei]'s Articles
[Pan, Zhisong]'s Articles
[Wu, Gaowei]'s Articles
Baidu academic
Similar articles in Baidu academic
[Tao, Wei]'s Articles
[Pan, Zhisong]'s Articles
[Wu, Gaowei]'s Articles
Bing Scholar
Similar articles in Bing Scholar
[Tao, Wei]'s Articles
[Pan, Zhisong]'s Articles
[Wu, Gaowei]'s Articles

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.