DE2: Dynamic ensemble of ensembles for learning nonstationary data
Authors: Yin, Xu-Cheng (1); Huang, Kaizhu (2); Hao, Hong-Wei (3)
Source Publication: NEUROCOMPUTING
Date Issued: 2015-10-01
Volume: 165, Pages: 14-22
Subtype: Article
Abstract: Learning nonstationary data with concept drift has received much attention in machine learning and is an active topic in ensemble learning. In particular, batch growing ensemble methods represent one important direction for dealing with concept drift in nonstationary data. However, current batch growing ensemble methods combine only the available component classifiers, each trained independently from a batch of nonstationary data. They simply discard the interim ensembles and hence may lose useful information obtained from these fine-tuned interim ensembles. In contrast, we introduce a comprehensive hierarchical approach called Dynamic Ensemble of Ensembles (DE2). The method dynamically combines classifiers as an ensemble of all the interim ensembles built from consecutive batches of nonstationary data. DE2 has two key stages: component classifiers and interim ensembles are trained dynamically; the final ensemble is then learned by exponentially-weighted averaging over the available experts, i.e., the interim ensembles. Moreover, we employ Sparsity Learning to select component classifiers intelligently. We also incorporate the techniques of Dynamic Weighted Majority and Learn++.NSE to better integrate different classifiers dynamically. We perform experiments with two benchmark test sets in real nonstationary environments and compare our DE2 method to other competitive ensemble methods. Experimental results confirm that our approach consistently leads to better performance and has promising generalization ability for learning in nonstationary environments. (C) 2015 Elsevier B.V. All rights reserved.
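For illustration, below is a minimal sketch of the second stage described in the abstract: a new interim ensemble is fitted on each incoming batch, and all interim ensembles are then combined by exponentially-weighted averaging based on their error on the latest batch. This is not the authors' implementation (which additionally uses Sparsity Learning, Dynamic Weighted Majority, and Learn++.NSE); the class name `DE2Sketch`, the method `partial_fit_batch`, and the learning rate `eta` are assumptions, and scikit-learn-style classifiers are assumed as components.

```python
# Minimal illustrative sketch (NOT the authors' implementation): combine
# interim ensembles by exponentially-weighted averaging of their votes,
# in the spirit of DE2's second stage. Names and parameters are hypothetical.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import BaggingClassifier

class DE2Sketch:
    def __init__(self, eta=0.5):
        self.eta = eta      # learning rate of the exponential weighting
        self.experts = []   # interim ensembles trained so far
        self.weights = []   # one weight per interim ensemble

    def partial_fit_batch(self, X, y):
        """Fit a new interim ensemble on the incoming batch and re-weight
        all existing experts by their error on that batch."""
        for i, expert in enumerate(self.experts):
            err = np.mean(expert.predict(X) != y)
            self.weights[i] *= np.exp(-self.eta * err)
        # One fresh interim ensemble per batch (here: bagged decision trees).
        new_expert = BaggingClassifier(DecisionTreeClassifier(max_depth=3),
                                       n_estimators=10).fit(X, y)
        self.experts.append(new_expert)
        self.weights.append(1.0)
        total = sum(self.weights)
        self.weights = [w / total for w in self.weights]  # convex combination

    def predict(self, X):
        """Weighted vote over all interim ensembles."""
        scores = [dict() for _ in range(len(X))]
        for w, expert in zip(self.weights, self.experts):
            for i, label in enumerate(expert.predict(X)):
                scores[i][label] = scores[i].get(label, 0.0) + w
        return np.array([max(s, key=s.get) for s in scores])
```

In use, one would call `partial_fit_batch` on each consecutive batch of the stream and `predict` between batches; interim ensembles that keep performing well on recent batches accumulate higher weight as the concept drifts.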
Keywords: Ensemble of Ensembles; Growing Ensemble; Sparsity Learning; Nonstationary Environment; Concept Drift; Incremental Learning
WOS Headings: Science & Technology; Technology
WOS Keywords: WEIGHTED-MAJORITY; NEURAL-NETWORKS; CLASSIFIER ENSEMBLES; CONCEPT DRIFT; ENVIRONMENTS; ALGORITHM; TRACKING; SUPPORT; MEMORY
Indexed By: SCI
Language: English
WOS Research Area: Computer Science
WOS Subject: Computer Science, Artificial Intelligence
WOS ID: WOS:000356747700003
Citation Statistics: Cited Times (WOS): 9
Document Type: Journal Article
Identifier: http://ir.ia.ac.cn/handle/173211/7904
Collection: Digital Content Technology and Services Research Center / Auditory Models and Cognitive Computing
Affiliation:
1. Univ Sci & Technol Beijing, Sch Comp & Commun Engn, Dept Comp Sci & Technol, Beijing 100083, Peoples R China
2. Xian Jiaotong Liverpool Univ, Dept Elect & Elect Engn, Suzhou 215123, Peoples R China
3. Chinese Acad Sci, Inst Automat, Beijing 100190, Peoples R China
Recommended Citation:
GB/T 7714: Yin, Xu-Cheng, Huang, Kaizhu, Hao, Hong-Wei. DE2: Dynamic ensemble of ensembles for learning nonstationary data[J]. NEUROCOMPUTING, 2015, 165: 14-22.
APA: Yin, Xu-Cheng, Huang, Kaizhu, & Hao, Hong-Wei. (2015). DE2: Dynamic ensemble of ensembles for learning nonstationary data. NEUROCOMPUTING, 165, 14-22.
MLA: Yin, Xu-Cheng, et al. "DE2: Dynamic ensemble of ensembles for learning nonstationary data". NEUROCOMPUTING 165 (2015): 14-22.
Files in This Item:
There are no files associated with this item.
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.