Knowledge Commons of Institute of Automation, CAS
G-Head: Gating Head for Multi-task Learning in One-stage Object Detection
He, Jiang 1,2; Qingyi, Gu 1
2022-03
Conference Name | 2022 IEEE International Conference on Multimedia & Expo (ICME)
Conference Date | 2022-07
Conference Venue | Taiwan
Abstract | Object detection is commonly formulated as a multi-task learning problem in deep learning methods. Owing to the divergence between the classification and regression tasks, modern one-stage detectors typically use two parallel branches as the detection head, which can be sub-optimal. In this paper, we propose a new Gating Head (G-Head) to enhance the interaction between the tasks and promote multi-task learning. By introducing Multi-Scale Aggregation (MSA), Multi-Aspect Learning (MAL), and a Gating Selector (GS), our method significantly boosts the performance of existing one-stage frameworks with fewer parameters and lower computational cost. To validate the efficiency, effectiveness, and generalization of G-Head, extensive experiments are conducted on the challenging MS COCO dataset. Without bells and whistles, we achieve a new state-of-the-art 48.7 AP under single-model and single-scale testing. |
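This record does not include the paper's code, and the exact MSA/MAL/GS designs are not described beyond the abstract. The PyTorch sketch below is therefore only an assumption: it shows one plausible way a gating selector could let the classification and regression branches interact instead of running fully in parallel. The class names `GatingSelector` and `GatingHead`, and the sigmoid 1×1-conv gate, are illustrative and not the authors' design.

```python
# Minimal sketch of a cross-task gating head for a one-stage detector.
# Hypothetical design; not the paper's released G-Head implementation.
import torch
import torch.nn as nn

class GatingSelector(nn.Module):
    """Illustrative gate: reweights one task's features using the other's."""
    def __init__(self, channels: int):
        super().__init__()
        # 1x1 conv + sigmoid yields a per-channel, per-location gate in (0, 1).
        self.gate = nn.Sequential(nn.Conv2d(channels, channels, 1), nn.Sigmoid())

    def forward(self, own: torch.Tensor, other: torch.Tensor) -> torch.Tensor:
        # The other task's features decide how much of `own` passes through.
        return own * self.gate(other)

class GatingHead(nn.Module):
    """Toy two-branch head with cross-task gating (assumed structure)."""
    def __init__(self, channels: int = 256, num_classes: int = 80):
        super().__init__()
        self.cls_tower = nn.Conv2d(channels, channels, 3, padding=1)
        self.reg_tower = nn.Conv2d(channels, channels, 3, padding=1)
        self.cls_gate = GatingSelector(channels)
        self.reg_gate = GatingSelector(channels)
        self.cls_out = nn.Conv2d(channels, num_classes, 3, padding=1)
        self.reg_out = nn.Conv2d(channels, 4, 3, padding=1)  # box deltas

    def forward(self, x: torch.Tensor):
        cls_feat = torch.relu(self.cls_tower(x))
        reg_feat = torch.relu(self.reg_tower(x))
        # Each branch is modulated by a gate computed from the other branch,
        # so classification and regression exchange information.
        cls_feat = self.cls_gate(cls_feat, reg_feat)
        reg_feat = self.reg_gate(reg_feat, cls_feat)
        return self.cls_out(cls_feat), self.reg_out(reg_feat)

if __name__ == "__main__":
    head = GatingHead()
    scores, boxes = head(torch.randn(1, 256, 32, 32))
    print(scores.shape, boxes.shape)  # (1, 80, 32, 32) (1, 4, 32, 32)
```

One design point worth noting: because the gate output lies in (0, 1), such a head can only attenuate features, which keeps the extra parameter and compute cost small relative to adding full extra convolution towers; this is consistent with the abstract's claim of fewer parameters, though the actual mechanism may differ.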
Indexed By | EI
Language | English
Document Type | Conference Paper
Identifier | http://ir.ia.ac.cn/handle/173211/48641
Collection | CAS Engineering Laboratory for Industrial Vision Intelligent Equipment: Precision Perception and Control, Institute of Automation, Chinese Academy of Sciences
Corresponding Author | Qingyi, Gu
Affiliations | 1. Institute of Automation, Chinese Academy of Sciences; 2. School of Artificial Intelligence, University of Chinese Academy of Sciences
First Author Affiliation | Institute of Automation, Chinese Academy of Sciences
Corresponding Author Affiliation | Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714) | He, Jiang; Qingyi, Gu. G-Head: Gating Head for Multi-task Learning in One-stage Object Detection[C], 2022.
Files in This Item
File Name/Size | Document Type | Version Type | Open Access Type | License
GHead.pdf (485KB) | Conference Paper | | Open Access | CC BY-NC-SA