Knowledge Commons of Institute of Automation, CAS
Multi-scale self-attention-based feature enhancement for detection of targets with small image sizes
Deng, Ying (1,2); Hu, Xingliang (3); Li, Bing (3)
Source Publication | PATTERN RECOGNITION LETTERS
ISSN | 0167-8655 |
Date Issued | 2023-02-01
Volume | 166
Pages | 46-52
Corresponding Author | Li, Bing(bli@nlpr.ia.ac.cn) |
Abstract | In this paper, we propose a feature enhancement method based on multi-scale self-attention, consisting mainly of a multi-scale feature combination module and a self-attention module. The multi-scale feature combination module integrates the multi-layer features extracted from the backbone network in both the top-down and bottom-up directions, after which the shallow and deep features are combined. The self-attention module enhances the feature representation by assigning attention weights to features that have an intrinsic connection to the target features. By combining deep with shallow features and local with global features, the method improves detection performance for targets with small image sizes in complex scenes. Experimental results demonstrate the effectiveness of the proposed feature enhancement method. (c) 2023 Elsevier B.V. All rights reserved. |
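For readers who want a concrete picture of the two components the abstract names, below is a minimal PyTorch sketch. All class names (`MultiScaleCombination`, `SelfAttention2d`), channel widths, and wiring are illustrative assumptions, not the paper's exact architecture: the fusion follows a generic FPN/PAN-style top-down then bottom-up pattern, and the attention follows the standard non-local (SAGAN-style) formulation.

```python
# Illustrative sketch only; module names, channel sizes, and fusion details
# are assumptions, not the architecture published in the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiScaleCombination(nn.Module):
    """Fuses backbone features top-down, then bottom-up (FPN/PAN-style)."""
    def __init__(self, in_channels, out_channels=256):
        super().__init__()
        # 1x1 lateral convs project each backbone level to a common width.
        self.lateral = nn.ModuleList(
            nn.Conv2d(c, out_channels, kernel_size=1) for c in in_channels)
        # 3x3 convs smooth the fused maps.
        self.smooth = nn.ModuleList(
            nn.Conv2d(out_channels, out_channels, 3, padding=1)
            for _ in in_channels)

    def forward(self, feats):  # feats ordered shallow -> deep
        lat = [conv(f) for conv, f in zip(self.lateral, feats)]
        # Top-down pass: propagate deep semantics into shallower maps.
        for i in range(len(lat) - 2, -1, -1):
            lat[i] = lat[i] + F.interpolate(
                lat[i + 1], size=lat[i].shape[-2:], mode="nearest")
        # Bottom-up pass: feed shallow detail back into deeper maps.
        for i in range(1, len(lat)):
            lat[i] = lat[i] + F.adaptive_max_pool2d(
                lat[i - 1], lat[i].shape[-2:])
        return [conv(f) for conv, f in zip(self.smooth, lat)]

class SelfAttention2d(nn.Module):
    """Reweights each position by its affinity to all other positions."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // reduction, 1)
        self.key = nn.Conv2d(channels, channels // reduction, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)  # (b, hw, c/r)
        k = self.key(x).flatten(2)                    # (b, c/r, hw)
        attn = torch.softmax(q @ k, dim=-1)           # (b, hw, hw)
        v = self.value(x).flatten(2)                  # (b, c, hw)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return x + self.gamma * out                   # residual connection

if __name__ == "__main__":
    # Dummy backbone pyramid: three levels, shallow -> deep.
    feats = [torch.randn(1, c, s, s)
             for c, s in [(256, 64), (512, 32), (1024, 16)]]
    combine = MultiScaleCombination([256, 512, 1024])
    attend = SelfAttention2d(256)
    enhanced = [attend(f) for f in combine(feats)]
    print([tuple(f.shape) for f in enhanced])
```

In this reading, the bidirectional fusion mixes deep (semantic) and shallow (detail) features, while the attention map mixes local and global context, which is what the abstract credits for the small-target gains.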
Keyword | Detection of targets with small image sizes ; Feature enhancement ; Multi-scale combination ; Self-attention |
DOI | 10.1016/j.patrec.2022.12.026 |
Indexed By | SCI |
Language | English |
Funding Project | National Key R&D Program of China[2018AAA0102802] ; Natural Science Foundation of China[62036011] ; Natural Science Foundation of China[62192782] ; Natural Science Foundation of China[61721004] ; Natural Science Foundation of China[U2033210] ; Beijing Natural Science Foundation[L223003] ; Major Projects of Guangdong Education Department for Foundation Research and Applied Research[2017KZDXM081] ; Major Projects of Guangdong Education Department for Foundation Research and Applied Research[2018KZDXM066] ; Guangdong Provincial University Innovation Team Project[2020KCXTD045] |
Funding Organization | National Key R&D Program of China ; Natural Science Foundation of China ; Beijing Natural Science Foundation ; Major Projects of Guangdong Education Department for Foundation Research and Applied Research ; Guangdong Provincial University Innovation Team Project |
WOS Research Area | Computer Science |
WOS Subject | Computer Science, Artificial Intelligence |
WOS ID | WOS:000925102000001 |
Publisher | ELSEVIER |
Document Type | Journal article |
Identifier | http://ir.ia.ac.cn/handle/173211/51441 |
Collection | State Key Laboratory of Multimodal Artificial Intelligence Systems |
Affiliation | 1. Nanjing Univ Aeronaut & Astronaut, Coll Aerosp Engn, Nanjing 210016, Peoples R China
2. Nanchang Hangkong Univ, Sch Aeronaut Mfg Engn, Nanchang 330063, Peoples R China
3. Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
4. Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing 100049, Peoples R China
5. ShanghaiTech Univ, Sch Informat Sci & Technol, Shanghai 201210, Peoples R China |
Corresponding Author Affiliation | Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China |
Recommended Citation GB/T 7714 | Deng, Ying, Hu, Xingliang, Li, Bing, et al. Multi-scale self-attention-based feature enhancement for detection of targets with small image sizes[J]. PATTERN RECOGNITION LETTERS, 2023, 166: 46-52. |
APA | Deng, Ying, Hu, Xingliang, Li, Bing, Zhang, Congxuan, & Hu, Weiming. (2023). Multi-scale self-attention-based feature enhancement for detection of targets with small image sizes. PATTERN RECOGNITION LETTERS, 166, 46-52. |
MLA | Deng, Ying, et al. "Multi-scale self-attention-based feature enhancement for detection of targets with small image sizes." PATTERN RECOGNITION LETTERS 166 (2023): 46-52. |
Files in This Item: | There are no files associated with this item. |
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.