CASIA OpenIR
(This search is based on the user's claimed publications)

Browse/Search results: 7 items in total, showing 1-7

C2AM Loss: Chasing a Better Decision Boundary for Long-Tail Object Detection  Conference paper
New Orleans, Louisiana & Online, 2022-6-19
Authors: Wang, Tong; Zhu, Yousong; Chen, Yingying; Zhao, Chaoyang; Yu, Bin; Wang, Jinqiao; Tang, Ming
Adobe PDF (5757 KB)  |  Views/Downloads: 360/60  |  Submitted: 2022/04/01
Large Batch Optimization for Object Detection: Training COCO in 12 minutes  Conference paper
Online, 2020-8-24
Authors: Wang, Tong; Zhu, Yousong; Zhao, Chaoyang; Zeng, Wei; Wang, Yaowei; Wang, Jinqiao; Tang, Ming
Adobe PDF (3706 KB)  |  Views/Downloads: 244/35  |  Submitted: 2022/04/01
Keywords: Object detection; Large batch optimization; Periodical moments decay
DPT: Deformable Patch-based Transformer for Visual Recognition  Conference paper
Chengdu, China, 2021-10-20
Authors: Chen, Zhiyang; Zhu, Yousong; Zhao, Chaoyang; Hu, Guosheng; Zeng, Wei; Wang, Jinqiao; Tang, Ming
Adobe PDF (3799 KB)  |  Views/Downloads: 186/30  |  Submitted: 2022/04/01
Adaptive Class Suppression Loss for Long-Tail Object Detection  Conference paper
Online, 2021-6-19
Authors: Wang, Tong; Zhu, Yousong; Zhao, Chaoyang; Zeng, Wei; Wang, Jinqiao; Tang, Ming
Adobe PDF (2668 KB)  |  Views/Downloads: 249/66  |  Submitted: 2022/04/01
A Novel Data Augmentation Scheme for Pedestrian Detection with Attribute Preserving GAN  Journal article
Neurocomputing, 2020, Volume: 401, Issue: 11, Pages: 123-132
Authors: Liu, Songyan; Guo, Haiyun; Hu, Jian-Guo; Zhao, Xu; Zhao, Chaoyang; Wang, Tong; Zhu, Yousong; Wang, Jinqiao; Tang, Ming
Adobe PDF (2691 KB)  |  Views/Downloads: 378/83  |  Submitted: 2020/06/10
Keywords: Generative Adversarial Networks; Pedestrian detection; Data augmentation
Research on Object Detection Technology Based on Feature Learning  Dissertation
Doctor of Engineering, Institute of Automation, Chinese Academy of Sciences: University of Chinese Academy of Sciences, 2019
Author: Zhu, Yousong (朱优松)
Adobe PDF (8332 KB)  |  Views/Downloads: 533/29  |  Submitted: 2019/06/05
Keywords: Object detection; Feature learning; Convolutional neural network; Deep learning
Mask Guided Knowledge Distillation for Single Shot Detector  Conference paper
Shanghai, China, 2019-7-8
Authors: Zhu, Yousong; Zhao, Chaoyang; Han, Chenxia; Wang, Jinqiao; Lu, Hanqing
Adobe PDF (1512 KB)  |  Views/Downloads: 729/320  |  Submitted: 2019/05/04
Keywords: Object Detection; Knowledge Distillation