CASIA OpenIR > Center for Research on Intelligent Perception and Computing
Aggregating Randomized Clustering-Promoting Invariant Projections for Domain Adaptation
Jian Liang1,2; Ran He1,2,3; Zhenan Sun1,2,3; Tieniu Tan1,2,3
Source Publication: IEEE Trans. Pattern Anal. Machine Intell.
Year: 2019
Volume: 41, Issue: 5, Pages: 1027-1042
Subtype: regular paper
Abstract

    Unsupervised domain adaptation aims to leverage labeled source data to learn from unlabeled target data. Previous transductive methods tackle it by iteratively seeking a low-dimensional projection that extracts invariant features and obtaining pseudo target labels by building a classifier on the source data. However, they concentrate solely on minimizing the cross-domain distribution divergence, while ignoring the intra-domain structure, especially that of the target domain. Even after projection, risk factors such as an imbalanced data distribution may still hinder target label inference. In this paper, we propose a simple yet effective domain-invariant projection ensemble approach that tackles these two issues together. Specifically, we seek the optimal projection via a novel relaxed domain-irrelevant clustering-promoting term that jointly bridges the cross-domain semantic gap and increases intra-class compactness in both domains. To further enhance target label inference, we first develop a 'sampling-and-fusion' framework, under which multiple projections are independently learned from various randomized coupled domain subsets. Subsequently, aggregation models such as majority voting combine the multiple projections to classify the unlabeled target data.
    Extensive experimental results on six visual benchmarks covering object, face, and digit images demonstrate that the proposed methods achieve remarkable margins over state-of-the-art unsupervised domain adaptation methods.
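The 'sampling-and-fusion' framework described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's method: `learn_projection` is a hypothetical stand-in (a random orthonormal basis) for the clustering-promoting invariant projection, and nearest-source-centroid classification stands in for the paper's target label inference. Only the outer structure — learning multiple projections on randomized coupled source/target subsets and fusing their predictions by majority voting — follows the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def learn_projection(Xs_sub, ys_sub, Xt_sub, dim=2):
    """Hypothetical stand-in for the clustering-promoting invariant
    projection: a random orthonormal basis obtained via QR."""
    d = Xs_sub.shape[1]
    Q, _ = np.linalg.qr(rng.standard_normal((d, dim)))
    return Q

def nearest_centroid_predict(P, Xs, ys, Xt):
    """Label projected target samples by the nearest source class centroid."""
    Zs, Zt = Xs @ P, Xt @ P
    classes = np.unique(ys)
    centroids = np.stack([Zs[ys == c].mean(axis=0) for c in classes])
    dists = np.linalg.norm(Zt[:, None, :] - centroids[None, :, :], axis=2)
    return classes[dists.argmin(axis=1)]

def sample_and_fuse(Xs, ys, Xt, n_models=5, rate=0.8):
    """Learn one projection per randomized coupled subset of the two
    domains, then fuse the target predictions by majority voting."""
    votes = []
    for _ in range(n_models):
        s_idx = rng.choice(len(Xs), int(rate * len(Xs)), replace=False)
        t_idx = rng.choice(len(Xt), int(rate * len(Xt)), replace=False)
        P = learn_projection(Xs[s_idx], ys[s_idx], Xt[t_idx])
        votes.append(nearest_centroid_predict(P, Xs[s_idx], ys[s_idx], Xt))
    votes = np.stack(votes)  # shape: (n_models, n_target)
    # majority vote across the ensemble for each target sample
    return np.array([np.bincount(col).argmax() for col in votes.T])
```

Randomizing the coupled subsets decorrelates the individual projections, so the majority vote can average out errors made by any single projection.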

Keywords: Unsupervised Domain Adaptation; Domain-invariant Projection; Class-clustering; Sampling-and-fusion
Indexed By: SCI; SCIE; SSCI; EI
Language: English
WOS ID: WOS:000463607400001
Citation Statistics: Cited Times: 2 (WOS)
Document Type: Journal Article
Identifier: http://ir.ia.ac.cn/handle/173211/23803
Collection: Center for Research on Intelligent Perception and Computing
Corresponding Author: Zhenan Sun; Tieniu Tan
Affiliation:
1. Center for Research on Intelligent Perception and Computing, National Laboratory of Pattern Recognition (NLPR), Institute of Automation, Chinese Academy of Sciences (CASIA)
2. University of Chinese Academy of Sciences
3. CAS Center for Excellence in Brain Science and Intelligence Technology
First Author Affiliation: Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
Corresponding Author Affiliation: Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
Recommended Citation
GB/T 7714: Jian Liang, Ran He, Zhenan Sun, et al. Aggregating Randomized Clustering-Promoting Invariant Projections for Domain Adaptation[J]. IEEE Trans. Pattern Anal. Machine Intell., 2019, 41(5): 1027-1042.
APA: Jian Liang, Ran He, Zhenan Sun, & Tieniu Tan. (2019). Aggregating Randomized Clustering-Promoting Invariant Projections for Domain Adaptation. IEEE Trans. Pattern Anal. Machine Intell., 41(5), 1027-1042.
MLA: Jian Liang, et al. "Aggregating Randomized Clustering-Promoting Invariant Projections for Domain Adaptation". IEEE Trans. Pattern Anal. Machine Intell. 41.5 (2019): 1027-1042.
Files in This Item:
File: final.pdf (865KB) | DocType: Journal Article | Version: Author's Accepted Manuscript | Access: Open Access | License: CC BY-NC-SA

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.