Boosted Multifeature Learning for Cross-Domain Transfer
Authors: Yang, Xiaoshan 1,2; Zhang, Tianzhu 1,2; Xu, Changsheng 1,2; Yang, Ming-Hsuan 3
Source Publication: ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS
Publication Year: 2015
Volume: 11  Issue: 3  Pages: 35:1-18
Subtype: Article
Abstract: Conventional learning algorithms assume that the training data and test data share a common distribution. This assumption, however, greatly hinders the practical application of learned models to cross-domain data analysis in multimedia. Transfer learning addresses this issue, and domain adaptation, a typical form of transfer learning, has been studied extensively in recent years for both its theoretical value and its practical interest. In this article, we propose a boosted multifeature learning (BMFL) approach that iteratively learns multiple representations within a boosting procedure for unsupervised domain adaptation. The proposed BMFL method has a number of properties. (1) It reuses all instances, with weights assigned by the previous boosting iteration, and thus avoids discarding labeled instances as conventional methods do. (2) It models the instance weight distribution effectively by considering both the classification error and the domain similarity, which facilitates learning a new feature representation that corrects previously misclassified instances. (3) It learns multiple different feature representations to effectively bridge the source and target domains. We evaluate BMFL on three applications: image classification, sentiment classification, and spam filtering. Extensive experimental results demonstrate that the proposed BMFL algorithm performs favorably against state-of-the-art domain adaptation methods.
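The abstract describes the method only at a high level. As a rough, hypothetical sketch of the kind of boosting loop it outlines (not the authors' published algorithm: the helper functions train_dae, train_clf, and domain_similarity, the AdaBoost-style weight update, and the {-1, +1} label convention are all assumptions introduced here for illustration), one might write the following Python:

    import numpy as np

    def boosted_multifeature_learning(X_src, y_src, X_tgt, n_rounds,
                                      train_dae, train_clf, domain_similarity):
        """Illustrative BMFL-style boosting loop (a sketch, not the paper's
        exact algorithm). train_dae, train_clf, and domain_similarity are
        hypothetical helpers supplied by the caller; y_src is in {-1, +1}."""
        n = len(X_src)
        w = np.full(n, 1.0 / n)              # uniform initial instance weights
        learners, alphas = [], []

        for _ in range(n_rounds):
            # Learn a new feature representation from both domains, e.g. a
            # denoising auto-encoder (per the paper's keywords); train_dae is
            # assumed to return a callable mapping inputs to features.
            encode = train_dae(np.vstack([X_src, X_tgt]))
            Z_src = encode(X_src)

            # Fit a weighted base classifier on the new representation.
            clf = train_clf(Z_src, y_src, sample_weight=w)
            miss = clf.predict(Z_src) != y_src   # boolean error indicator

            # Weighted error and an AdaBoost-style learner weight.
            err = np.clip(np.dot(w, miss), 1e-10, 1.0 - 1e-10)
            alpha = 0.5 * np.log((1.0 - err) / err)

            # Update the instance-weight distribution: upweight misclassified
            # instances, tempered by how similar each source instance is to
            # the target domain (a guess at the paper's similarity term).
            sim = domain_similarity(Z_src, encode(X_tgt))   # values in (0, 1]
            w = w * np.exp(alpha * miss) * sim
            w /= w.sum()

            learners.append((encode, clf))
            alphas.append(alpha)

        def predict(X):
            # Weighted vote over the per-round (representation, classifier) pairs.
            votes = sum(a * clf.predict(enc(X))
                        for a, (enc, clf) in zip(alphas, learners))
            return np.sign(votes)

        return predict

The sketch tries to mirror property (2) from the abstract: each round's weight update combines the weighted classification error with a per-instance domain-similarity term, so the next learned representation concentrates on previously misclassified source instances that still resemble the target domain.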
Keywords: Algorithms; Experimentation; Performance; Domain Adaptation; Multifeature Boosting; Denoising Auto-encoder
WOS Headings: Science & Technology; Technology
DOI: 10.1145/2700286
WOS Keyword: ADAPTATION
Indexed By: SCI
Language: English
WOS Research Area: Computer Science
WOS Subject: Computer Science, Information Systems; Computer Science, Software Engineering; Computer Science, Theory & Methods
WOS ID: WOS:000349852500003
Document Type: Journal Article
Identifier: http://ir.ia.ac.cn/handle/173211/8044
Collection: National Laboratory of Pattern Recognition / Multimedia Computing and Graphics
Corresponding Author: Xu, Changsheng (徐常胜)
Affiliation:
1. National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences
2. China-Singapore Institute of Digital Media, Singapore 119613, Singapore
3. Department of Electrical Engineering and Computer Science, University of California, Merced, CA 95334, USA
Recommended Citation:
GB/T 7714: Yang, Xiaoshan, Zhang, Tianzhu, Xu, Changsheng, et al. Boosted Multifeature Learning for Cross-Domain Transfer[J]. ACM Transactions on Multimedia Computing, Communications, and Applications, 2015, 11(3): 35:1-18.
APA: Yang, Xiaoshan, Zhang, Tianzhu, Xu, Changsheng, & Yang, Ming-Hsuan. (2015). Boosted Multifeature Learning for Cross-Domain Transfer. ACM Transactions on Multimedia Computing, Communications, and Applications, 11(3), 35:1-18.
MLA: Yang, Xiaoshan, et al. "Boosted Multifeature Learning for Cross-Domain Transfer." ACM Transactions on Multimedia Computing, Communications, and Applications 11.3 (2015): 35:1-18.
Files in This Item:
File Name: Boosted Multifeature Learning for Cross-Domain Transfer(论文).pdf (1139KB)
Format: Adobe PDF
DocType: Journal Article
Version: Author's Accepted Manuscript
Access: Open Access
License: CC BY-NC-SA
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.