CASIA OpenIR

Browse/Search Results: 5 items in total, showing items 1-5

Multi-Correlation Siamese Transformer Network With Dense Connection for 3D Single Object Tracking (Journal Article)
IEEE ROBOTICS AND AUTOMATION LETTERS, 2023, Volume: 8, Issue: 12, Pages: 8066-8073
Authors: Feng, Shihao; Liang, Pengpeng; Gao, Jin; Cheng, Erkang
Adobe PDF (2745 KB) | Views/Downloads: 113/4 | Submitted: 2023/12/21
Keywords: 3D object tracking; Point cloud; Transformer
Self-Prior Guided Pixel Adversarial Networks for Blind Image Inpainting (Journal Article)
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, Volume: 45, Issue: 10, Pages: 12377-12393
Authors: Wang, Juan; Yuan, Chunfeng; Li, Bing; Deng, Ying; Hu, Weiming; Maybank, Stephen
Views/Downloads: 147/0 | Submitted: 2023/11/16
Keywords: Blind image inpainting; semantic-discontinuity detection; layout map prediction; pixel generative adversarial network
Cross-Architecture Knowledge Distillation (Conference Paper)
INTERNATIONAL JOURNAL OF COMPUTER VISION, Macau SAR, China, 2022.12.4-2022.12.8
Authors: Yufan Liu; Jiajiong Cao; Bing Li; Weiming Hu; Jingting Ding; Liang Li
Adobe PDF (1020 KB) | Views/Downloads: 159/46 | Submitted: 2023/04/23
Keywords: Knowledge distillation; Cross architecture; Model compression; Deep learning
EDP: An Efficient Decomposition and Pruning Scheme for Convolutional Neural Network Compression (Journal Article)
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2021, Volume: 32, Issue: 10, Pages: 4499-4513
Authors: Ruan, Xiaofeng; Liu, Yufan; Yuan, Chunfeng; Li, Bing; Hu, Weiming; Li, Yangxi; Maybank, Stephen
Adobe PDF (3625 KB) | Views/Downloads: 339/45 | Submitted: 2021/06/17
Keywords: Data-driven; low-rank decomposition; model compression and acceleration; structured pruning
Graph convolutional network with structure pooling and joint-wise channel attention for action recognition (Journal Article)
PATTERN RECOGNITION, 2020, Volume: 103, Pages: 12
Authors: Chen, Yuxin; Ma, Gaoqun; Yuan, Chunfeng; Li, Bing; Zhang, Hui; Wang, Fangshi; Hu, Weiming
Adobe PDF (2455 KB) | Views/Downloads: 273/1 | Submitted: 2020/06/22
Keywords: Graph convolutional network; Structure graph pooling; Joint-wise channel attention