Local Hypersphere Coding Based on Edges between Visual Words
Weiqiang Ren; Yongzhen Huang; Kaiqi Huang
2012
Conference Name: ACCV 2012
Source Publication: Springer Berlin Heidelberg, 2012
Pages: 190-213
Conference Date: 2012
Conference Place: China
Abstract: Local feature coding has drawn much attention in recent years, and many coding algorithms have been proposed to improve the bag-of-words model. This paper proposes a new local feature coding method, local hypersphere coding (LHC), which differs from traditional coding methods in two respects. First, local features are described by the edges between visual words. Second, the reconstruction center is moved from the origin to the nearest visual word, so that feature coding is performed on a hypersphere in feature space. We evaluate the method on several benchmark datasets for image classification. Experimental results show that the proposed method outperforms several state-of-the-art coding methods, demonstrating its effectiveness.
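The abstract names LHC's two ingredients (describing features by edges between visual words, and moving the reconstruction center to the nearest visual word) but not the exact coding objective, which appears only in the full paper. The sketch below is therefore a minimal illustration of the idea, assuming an LLC-style regularized least-squares reconstruction over edge vectors; the function name local_hypersphere_coding and the parameters k and lam are our placeholders, not the paper's notation.

```python
import numpy as np

def local_hypersphere_coding(x, B, k=5, lam=1e-4):
    """Hypothetical sketch of local hypersphere coding (LHC).

    x : (d,) local feature descriptor
    B : (M, d) codebook of visual words
    Returns the index of the center word and an M-dim code whose
    nonzero entries weight edges from the center to its neighbors.
    The least-squares objective below is an assumption; the paper's
    exact formulation may differ.
    """
    # Step 1 (from the abstract): move the reconstruction center
    # from the origin to the nearest visual word.
    dists = np.linalg.norm(B - x, axis=1)
    c = int(np.argmin(dists))

    # Step 2 (from the abstract): describe the feature by edges
    # between visual words -- here, edges from the center word to
    # its k nearest other words (k is our assumed locality choice).
    order = np.argsort(dists)
    nbrs = order[order != c][:k]
    E = B[nbrs] - B[c]          # (k, d) edge vectors
    r = x - B[c]                # residual relative to the center

    # Assumed objective: min_w ||r - E^T w||^2 + lam * ||w||^2,
    # solved in closed form via the regularized Gram matrix.
    G = E @ E.T + lam * np.eye(len(nbrs))
    w = np.linalg.solve(G, E @ r)

    # Scatter the edge weights into a sparse code indexed by the
    # neighbor words (a simplification of a full edge-indexed code).
    code = np.zeros(B.shape[0])
    code[nbrs] = w
    return c, code

# Toy usage with random data.
rng = np.random.default_rng(0)
B = rng.standard_normal((128, 64))   # 128 visual words, 64-dim descriptors
x = rng.standard_normal(64)
center, code = local_hypersphere_coding(x, B)
```

In a complete image classification pipeline, such per-descriptor codes would typically be pooled (e.g., by max pooling) over an image to form the final representation, as in standard bag-of-words systems.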
Keyword: Local Hypersphere Coding
Language: English
Document Type: Conference Paper
Identifier: http://ir.ia.ac.cn/handle/173211/12691
Collection: Center for Research on Intelligent Perception and Computing (CRIPAC)
Corresponding Author: Kaiqi Huang
Affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714):
Weiqiang Ren, Yongzhen Huang, Kaiqi Huang. Local Hypersphere Coding Based on Edges between Visual Words[C], 2012: 190-213.
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.