CASIA OpenIR
(These search results are based on the user's claimed works)

Browse/Search results: 6 items in total, showing 1-6

Personalized graph neural networks with attention mechanism for session-aware recommendation (Journal article)
IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2020, Volume: 34, Issue: 8, Pages: 3946-3957
Authors: Mengqi Zhang; Shu Wu; Meng Gao; Xin Jiang; Ke Xu; Liang Wang
Adobe PDF (1277 KB) | Views/Downloads: 118/45 | Submitted: 2023/07/03
Relational graph neural network for situation recognition (Journal article)
Pattern Recognition, 2020, Issue: 108, Pages: 107544
Authors: Jing Y (荆雅); Wang JB (王君波); Wang W (王威); Wang L (王亮); Tan TN (谭铁牛)
Adobe PDF (3098 KB) | Views/Downloads: 175/39 | Submitted: 2021/06/07
Keywords: Situation recognition; Relationship modeling; Graph neural network; Reinforcement learning
Improving Description-Based Person Re-Identification by Multi-Granularity Image-Text Alignments (Journal article)
IEEE Transactions on Image Processing, 2020, Volume: 29, Issue: 1, Pages: 15
Authors: Niu, Kai; Huang, Yan; Ouyang, Wanli; Wang, Liang
Adobe PDF (5193 KB) | Views/Downloads: 183/65 | Submitted: 2020/10/09
Keywords: Description-based person re-identification; Multi-granularity image-text alignments; Step training strategy
Re-ranking Image-text Matching by Adaptive Metric Fusion (Journal article)
PATTERN RECOGNITION, 2020, Volume: 104, Issue: 1, Pages: 13
Authors: Niu, Kai; Huang, Yan; Wang, Liang
Adobe PDF (2236 KB) | Views/Downloads: 454/103 | Submitted: 2020/06/22
Keywords: Image-text matching; Re-ranking method; Adaptive metric fusion
Long video question answering: A Matching-guided Attention Model (Journal article)
PATTERN RECOGNITION, 2020, Volume: 102, Issue: 1, Pages: 11
Authors: Wang, Weining; Huang, Yan; Wang, Liang
Adobe PDF (1963 KB) | Views/Downloads: 358/69 | Submitted: 2020/06/02
Keywords: Long video QA; Matching-guided attention
Learning visual relationship and context-aware attention for image captioning (Journal article)
Pattern Recognition, 2020, Issue: 98, Pages: 107075
Authors: Wang, Junbo; Wang, Wei; Wang, Liang; Wang, Zhiyong; Feng, Dagan; Tan, Tieniu
Adobe PDF (2059 KB) | Views/Downloads: 519/231 | Submitted: 2020/01/07
Keywords: Image captioning; Relational reasoning; Context-aware attention