Two-Step Greedy Subspace Clustering
Lingxiao Song1,2; Man Zhang1,2; Zhenan Sun1,2; Jian Liang1,2; Ran He (赫然)1,2
2015
Conference Name: Pacific-Rim Conference on Multimedia (PCM)
Proceedings Title: Lecture Notes in Computer Science
Conference Date: 2015-09
Conference Venue: Gwangju, Korea
Abstract: Greedy subspace clustering methods provide an efficient way to cluster large-scale multimedia datasets. However, these methods do not guarantee a global optimum, and their clustering performance depends heavily on their initializations. To alleviate this initialization problem, this paper proposes a two-step greedy strategy that explores proper neighbors to span an initial subspace. First, for each data point, we seek a sparse representation with respect to its nearest neighbors. The data points corresponding to nonzero entries in the learned representation form an initial subspace, which potentially rejects bad or redundant data points. Second, the subspace is updated by adding an orthogonal basis involving the newly added data points. Experimental results on real-world applications demonstrate that our method can significantly improve the clustering accuracy of greedy subspace clustering methods without sacrificing much computational time.
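The abstract describes the two-step strategy only at a high level. The following is a minimal, hypothetical Python sketch of how such an initialization could look, assuming a k-nearest-neighbor dictionary for each point, orthogonal matching pursuit as a stand-in for the sparse coding step, and a QR factorization for the orthogonal basis update; the function name two_step_greedy_init and the parameters k_neighbors and sparsity are illustrative and are not taken from the paper.

import numpy as np
from sklearn.linear_model import orthogonal_mp
from sklearn.neighbors import NearestNeighbors

def two_step_greedy_init(X, k_neighbors=10, sparsity=5):
    """Hypothetical sketch of the two-step initialization described in the abstract.

    X : (n_samples, n_features) data matrix.
    Step 1: for each point, compute a sparse representation over its nearest
            neighbors; neighbors with nonzero coefficients seed the subspace.
    Step 2: grow an orthonormal basis from the point and its selected neighbors.
    """
    n = X.shape[0]
    nn = NearestNeighbors(n_neighbors=k_neighbors + 1).fit(X)
    _, idx = nn.kneighbors(X)           # idx[i, 0] is the point itself

    bases = []
    for i in range(n):
        neighbors = idx[i, 1:]          # drop the point itself
        D = X[neighbors].T              # dictionary whose columns are the neighbors
        # Step 1: sparse coding of x_i over its neighbors (OMP used as a stand-in)
        coef = orthogonal_mp(D, X[i], n_nonzero_coefs=sparsity)
        selected = neighbors[np.nonzero(coef)[0]]
        # Step 2: orthonormal basis spanning the point and its selected neighbors
        S = np.vstack([X[i], X[selected]]).T
        Q, _ = np.linalg.qr(S)          # columns of Q form the initial subspace basis
        bases.append(Q)
    return bases

In a full greedy subspace clustering pipeline, each of these initial bases would then be grown by repeatedly adding the data point best explained by the current subspace; the sketch above covers only the initialization step that the paper targets.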
Keywords: Greedy Subspace Clustering; Sparse Representation; Subspace Neighbor
Indexed By: EI
Document Type: Conference Paper
Identifier: http://ir.ia.ac.cn/handle/173211/11620
Collection: Center for Research on Intelligent Perception and Computing
Corresponding Author: Song, Lingxiao
Affiliations: 1. Center for Research on Intelligent Perception and Computing
2. Institute of Automation, Chinese Academy of Sciences, Beijing, China
First Author Affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714):
Lingxiao Song, Man Zhang, Zhenan Sun, et al. Two-Step Greedy Subspace Clustering[C], 2015.
Files in This Item:
Two-step Greedy Subspace Clustering.pdf (254 KB), Conference Paper, Open Access, CC BY-NC-SA license