PSNet: Perspective-sensitive convolutional network for object detection
Zhang, Xin1,2; Liu, Yicheng3; Huo, Chunlei1,2; Xu, Nuo1,2; Wang, Lingfeng1,4; Pan, Chunhong1,2
Source Publication: NEUROCOMPUTING
ISSN: 0925-2312
Publication Date: 2022-01-11
Volume: 468 Pages: 384-395
Abstract

Multi-view object detection is challenging because differences in view angle reduce intra-class similarity. The uniform feature representation of traditional detectors couples an object's perspective attribute with its semantic feature, so perspective variations introduce intra-class differences. In this paper, a robust perspective-sensitive network (PSNet) is proposed to overcome this problem. The uniform feature is replaced by a perspective-specific structural feature, which makes the network perspective-sensitive. The essence of the approach is to learn multiple perspective spaces; in each perspective space, the semantic feature is decoupled from the perspective attribute and is robust to perspective variations. A perspective-sensitive RoI pooling and a perspective-sensitive loss function are proposed for perspective-sensitive learning. Experiments on Pascal3D+ and SpaceNet MOVI show the effectiveness and superiority of PSNet. (c) 2021 Elsevier B.V. All rights reserved.
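
The mechanism the abstract describes (several parallel perspective spaces plus a perspective-sensitive RoI pooling that keeps semantics decoupled from view angle) can be illustrated with a short, hypothetical PyTorch sketch. This is not the authors' implementation: the branch structure, the number of perspective bins, the soft gating over bins, and every name here (PerspectiveSensitiveHead, persp_cls, obj_cls) are assumptions for illustration, and torchvision's roi_align stands in for the paper's perspective-sensitive RoI pooling.

# Hypothetical illustration only -- NOT the authors' PSNet code.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.ops import roi_align

class PerspectiveSensitiveHead(nn.Module):
    # K parallel "perspective spaces": each branch projects the shared
    # backbone feature into a perspective-specific space; a classifier
    # predicts a distribution over K view-angle bins and gates the
    # per-branch RoI features, so the pooled semantic feature is
    # (approximately) decoupled from the object's perspective.
    def __init__(self, in_channels=256, num_perspectives=4,
                 num_classes=12, pool_size=7):
        super().__init__()
        self.pool_size = pool_size
        self.branches = nn.ModuleList(
            nn.Conv2d(in_channels, in_channels, kernel_size=1)
            for _ in range(num_perspectives))
        flat_dim = in_channels * pool_size * pool_size
        self.persp_cls = nn.Linear(flat_dim, num_perspectives)  # view-angle bin
        self.obj_cls = nn.Linear(flat_dim, num_classes)         # object class

    def forward(self, feat, rois, spatial_scale=1.0 / 16):
        # feat: [B, C, H, W]; rois: [R, 5] = (batch_idx, x1, y1, x2, y2)
        pooled = [roi_align(branch(feat), rois, self.pool_size,
                            spatial_scale=spatial_scale, aligned=True)
                  for branch in self.branches]        # K tensors of [R, C, P, P]
        flat = torch.stack(pooled, dim=1).flatten(2)  # [R, K, C*P*P]
        persp_logits = self.persp_cls(flat.mean(dim=1))  # [R, K]
        gate = F.softmax(persp_logits, dim=1)            # soft perspective selection
        fused = (gate.unsqueeze(-1) * flat).sum(dim=1)   # [R, C*P*P]
        return self.obj_cls(fused), persp_logits

# Toy usage: a 16x-downsampled feature map for one image, two RoIs.
feat = torch.randn(1, 256, 32, 32)
rois = torch.tensor([[0, 10.0, 10.0, 200.0, 200.0],
                     [0, 50.0, 80.0, 300.0, 260.0]])
head = PerspectiveSensitiveHead()
cls_logits, persp_logits = head(feat, rois)
print(cls_logits.shape, persp_logits.shape)  # torch.Size([2, 12]) torch.Size([2, 4])

A perspective-sensitive loss in the spirit of the abstract would then supervise persp_logits with discretized view-angle labels (available, for example, from Pascal3D+ pose annotations) alongside the usual detection losses, pushing each branch to specialize to one perspective space.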

Keywords: Object detection; Perspective-sensitive; Structural neural network
DOI: 10.1016/j.neucom.2021.10.068
WOS Keyword: CHALLENGE
Indexed By: SCI
Language: English
Funding Project: National Natural Science Foundation of China [62071466]; National Natural Science Foundation of China [62076242]; National Natural Science Foundation of China [61976208]; National Key Research and Development Program of China [2018AAA0100400]
Funding Organization: National Natural Science Foundation of China; National Key Research and Development Program of China
WOS Research Area: Computer Science
WOS Subject: Computer Science, Artificial Intelligence
WOS ID: WOS:000719336800002
Publisher: ELSEVIER
Sub-direction Classification: Object detection, tracking and recognition
Citation Statistics
Cited Times (WOS): 8
Document Type: Journal article
Identifier: http://ir.ia.ac.cn/handle/173211/46468
Collection: National Laboratory of Pattern Recognition_Advanced Spatio-temporal Data Analysis and Learning
Corresponding Author: Huo, Chunlei
Affiliation:
1. Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
2. Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing 101408, Peoples R China
3. Chinese Univ Hong Kong, Hong Kong 999077, Peoples R China
4. Beijing Univ Chem Technol, Coll Informat Sci & Technol, Beijing 100029, Peoples R China
First Author Affiliation: Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
Corresponding Author Affiliation: Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
Recommended Citation
GB/T 7714: Zhang, Xin, Liu, Yicheng, Huo, Chunlei, et al. PSNet: Perspective-sensitive convolutional network for object detection[J]. NEUROCOMPUTING, 2022, 468: 384-395.
APA: Zhang, Xin, Liu, Yicheng, Huo, Chunlei, Xu, Nuo, Wang, Lingfeng, & Pan, Chunhong. (2022). PSNet: Perspective-sensitive convolutional network for object detection. NEUROCOMPUTING, 468, 384-395.
MLA: Zhang, Xin, et al. "PSNet: Perspective-sensitive convolutional network for object detection". NEUROCOMPUTING 468 (2022): 384-395.
Files in This Item:
File Name/Size: PSNet__Perspective_Sensitive_Convolutional_Network_for_Object_Detection_NeroComputing_final_.pdf (8656KB)
DocType: Journal article
Version: Author's accepted manuscript
Access: Open access
License: CC BY-NC-SA
Format: Adobe PDF

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.