CASIA OpenIR
EA-LSTM: Evolutionary attention-based LSTM for time series prediction
Li, Youru1,2; Zhu, Zhenfeng1,2; Kong, Deqiang3; Han, Hua4,5; Zhao, Yao1,2
Source Publication: KNOWLEDGE-BASED SYSTEMS
ISSN: 0950-7051
Date Issued: 2019-10-01
Volume: 181    Pages: 8
Corresponding Author: Zhu, Zhenfeng (zhfzhu@bjtu.edu.cn)
Abstract: Time series prediction with deep learning methods, especially the Long Short-Term Memory neural network (LSTM), has scored significant achievements in recent years. Although LSTM can help to capture long-term dependencies, its ability to pay different degrees of attention to sub-window features within multiple time-steps is insufficient. To address this issue, an evolutionary attention-based LSTM trained with competitive random search is proposed for multivariate time series prediction. By transferring shared parameters, an evolutionary attention learning approach is introduced to LSTM. Thus, as in biological evolution, the pattern for importance-based attention sampling can be confirmed during temporal relationship mining. To avoid being trapped in local optima, as can happen with traditional gradient-based methods, a competitive random search method inspired by evolutionary computation is proposed, which can effectively configure the parameters in the attention layer. Experimental results illustrate that the proposed model achieves competitive prediction performance compared with other baseline methods. (C) 2019 Elsevier B.V. All rights reserved.
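The abstract sketches two ingredients: importance-based attention over sub-windows of the input, and a gradient-free competitive random search that configures the attention parameters. Below is a minimal illustrative sketch of that second idea, not the authors' implementation: the toy objective, function names, population scheme, and hyperparameters are all assumptions, and a linear attention-pooled predictor stands in for the full LSTM.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def competitive_random_search(loss_fn, dim, pop_size=24, n_iters=100, sigma=0.3, seed=0):
    """Toy competitive random search: candidates compete on the loss; the
    better half survives each round, and the losing half is replaced by
    perturbed copies of randomly chosen winners (a crude stand-in for the
    shared-parameter transfer described in the abstract)."""
    rng = np.random.default_rng(seed)
    pop = rng.normal(size=(pop_size, dim))
    for _ in range(n_iters):
        losses = np.array([loss_fn(p) for p in pop])
        winners = pop[np.argsort(losses)[: pop_size // 2]]
        picks = rng.integers(0, len(winners), size=pop_size - len(winners))
        challengers = winners[picks] + sigma * rng.normal(
            size=(pop_size - len(winners), dim))
        pop = np.vstack([winners, challengers])
    losses = np.array([loss_fn(p) for p in pop])
    return pop[np.argmin(losses)]

# Toy data: a window of T time-steps where only step 3 drives the target,
# so a good attention pattern should concentrate its weight on step 3.
T = 8
rng = np.random.default_rng(1)
X = rng.normal(size=(200, T))
y = X[:, 3]

def loss_fn(params):
    w = softmax(params)   # importance-based attention over the window
    pred = X @ w          # attention-pooled prediction (linear surrogate)
    return float(np.mean((pred - y) ** 2))

best = competitive_random_search(loss_fn, dim=T)
attn = softmax(best)
print(np.round(attn, 3))
```

In this toy setup the selection pressure alone, with no gradients, should drive the attention mass toward the informative time-step, which is the point the abstract makes about escaping local optima of gradient-based training.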
Keywords: Evolutionary computation; Deep neural network; Time series prediction
DOI: 10.1016/j.knosys.2019.05.028
Indexed By: SCI
Language: English
Funding Project: National Key Research and Development of China [2016YFB0800404]; National Natural Science Foundation of China [61572068]; National Natural Science Foundation of China [61532005]; Special Program of Beijing Municipal Science & Technology Commission [Z181100000118002]; Strategic Priority Research Program of Chinese Academy of Science [XDB32030200]; Fundamental Research Funds for the Central Universities of China [2018YJS032]
Funding Organization: National Key Research and Development of China; National Natural Science Foundation of China; Special Program of Beijing Municipal Science & Technology Commission; Strategic Priority Research Program of Chinese Academy of Science; Fundamental Research Funds for the Central Universities of China
WOS Research Area: Computer Science
WOS Subject: Computer Science, Artificial Intelligence
WOS ID: WOS:000484873600005
Publisher: ELSEVIER
Document Type: Journal article
Identifier: http://ir.ia.ac.cn/handle/173211/27224
Collection: Institute of Automation, Chinese Academy of Sciences
Affiliation:
1. Beijing Jiaotong Univ, Inst Informat Sci, Beijing 100044, Peoples R China
2.Beijing Key Lab Adv Informat Sci & Network Techno, Beijing 100044, Peoples R China
3.Microsoft Multimedia, Beijing 100080, Peoples R China
4.Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
5.CAS Ctr Excellence Brain Sci & Intelligence Techn, Shanghai 200031, Peoples R China
Recommended Citation
GB/T 7714: Li, Youru, Zhu, Zhenfeng, Kong, Deqiang, et al. EA-LSTM: Evolutionary attention-based LSTM for time series prediction[J]. KNOWLEDGE-BASED SYSTEMS, 2019, 181: 8.
APA: Li, Youru, Zhu, Zhenfeng, Kong, Deqiang, Han, Hua, & Zhao, Yao. (2019). EA-LSTM: Evolutionary attention-based LSTM for time series prediction. KNOWLEDGE-BASED SYSTEMS, 181, 8.
MLA: Li, Youru, et al. "EA-LSTM: Evolutionary attention-based LSTM for time series prediction". KNOWLEDGE-BASED SYSTEMS 181 (2019): 8.
Files in This Item:
There are no files associated with this item.
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.