CASIA OpenIR > Research Center for Brain-Inspired Intelligence
GIFT: Towards Scalable 3D Shape Retrieval
Bai, Song1; Bai, Xiang1; Zhou, Zhichao1; Zhang, Zhaoxiang2; Tian, Qi3; Latecki, Longin Jan4
Abstract: Projective analysis is an important solution in three-dimensional (3D) shape retrieval, since human visual perception of 3D shapes relies on various 2D observations from different viewpoints. Although multiple informative and discriminative views are utilized, most projection-based retrieval systems suffer from heavy computational cost and thus cannot satisfy the basic scalability requirement of search engines. In the past three years, the Shape Retrieval Contest (SHREC) has paid much attention to the scalability of 3D shape retrieval algorithms and has organized several large-scale tracks accordingly [1]-[3]. However, the experimental results indicate that conventional algorithms cannot be directly applied to large datasets. In this paper, we present a real-time 3D shape search engine based on the projective images of 3D shapes. The real-time property of our search engine results from the following aspects: 1) efficient projection and view feature extraction using GPU acceleration; 2) a first inverted file, called F-IF, is utilized to speed up the procedure of multi-view matching; and 3) a second inverted file, which captures the local distribution of 3D shapes in the feature manifold, is adopted for efficient context-based reranking. As a result, each query can be completed within one second despite the necessary IO overhead. We name the proposed 3D shape search engine, which combines GPU acceleration and inverted files (twice), GIFT. Besides its high efficiency, GIFT also significantly outperforms state-of-the-art methods in retrieval accuracy on various shape benchmarks (the ModelNet40, ModelNet10, PSB, and McGill datasets) and competitions (SHREC14LSGTB, ShapeNet Core55, and WM-SHREC07).
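The abstract's second point, using an inverted file to speed up multi-view matching, can be illustrated with a minimal sketch. The code below is a generic inverted-file index, not the paper's F-IF: it assumes each view feature has already been quantized to a codeword ID (the quantizer, the shape IDs, and the vote-counting ranking are all simplified stand-ins for illustration). The benefit is that a query only touches shapes sharing at least one codeword, instead of matching against every shape in the database.

```python
from collections import defaultdict

def build_inverted_file(shape_codewords):
    """shape_codewords: {shape_id: set of codeword IDs from its projected views}.
    Returns {codeword ID: set of shape IDs containing that codeword}."""
    index = defaultdict(set)
    for shape_id, words in shape_codewords.items():
        for w in words:
            index[w].add(shape_id)
    return index

def query(index, query_words):
    """Score only the candidates that share a codeword with the query's views,
    then rank them by the number of shared codewords (descending)."""
    votes = defaultdict(int)
    for w in query_words:
        for shape_id in index.get(w, ()):
            votes[shape_id] += 1
    return sorted(votes, key=votes.get, reverse=True)

# Hypothetical toy database: three shapes, each with a few view codewords.
db = {"chair_1": {3, 7, 9}, "table_2": {7, 11}, "lamp_3": {1, 2}}
idx = build_inverted_file(db)
print(query(idx, {7, 9}))  # chair_1 shares two codewords, table_2 one
```

The real system ranks with view-feature similarities rather than raw vote counts, but the access pattern is the same: lookup cost grows with the number of matching postings, not with the database size.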
Keywords: 3D Shape Retrieval; CNN; Shape Retrieval Contest (SHREC)
WOS Headings: Science & Technology; Technology
Indexed By: SCI
Funding Organization: National Natural Science Foundation of China (61231010, 61573160, 61429201); China Scholarship Council; National Science Foundation (IIS-1302164); ARO (W911NF-15-1-0290); Faculty Research Gift Awards by NEC Laboratories of America and Blippar
WOS Research Area: Computer Science; Telecommunications
WOS Subject: Computer Science, Information Systems; Computer Science, Software Engineering; Telecommunications
WOS ID: WOS:000404059400012
Citation statistics
Cited Times (WOS): 9
Document Type: Journal Article
Affiliations:
1.Huazhong Univ Sci & Technol, Sch Elect Informat & Commun, Wuhan 430074, Peoples R China
2.Chinese Acad Sci, Inst Automat, Ctr Brain Inspired Intelligence, CAS Ctr Excellence Brain Sci & Intelligence Techn, Beijing 100190, Peoples R China
3.Univ Texas San Antonio, Dept Comp Sci, San Antonio, TX 78249 USA
4.Temple Univ, Dept Comp & Informat Sci, Philadelphia, PA 19122 USA
Recommended Citation
GB/T 7714: Bai, Song, Bai, Xiang, Zhou, Zhichao, et al. GIFT: Towards Scalable 3D Shape Retrieval[J]. IEEE TRANSACTIONS ON MULTIMEDIA, 2017, 19(6): 1257-1271.
APA: Bai, Song, Bai, Xiang, Zhou, Zhichao, Zhang, Zhaoxiang, Tian, Qi, & Latecki, Longin Jan. (2017). GIFT: Towards Scalable 3D Shape Retrieval. IEEE TRANSACTIONS ON MULTIMEDIA, 19(6), 1257-1271.
MLA: Bai, Song, et al. "GIFT: Towards Scalable 3D Shape Retrieval". IEEE TRANSACTIONS ON MULTIMEDIA 19.6 (2017): 1257-1271.
Files in This Item:
There are no files associated with this item.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.