CASIA OpenIR
(Results based on the user's claimed works)

Browse/Search results: 7 items total, showing 1-7

Joint Modeling of Document and Label with Clause Interaction Hypergraph for ICD Medical Code Assignment  Conference paper
Padua, Italy, 18-23 July 2022
Authors: Wu HR (吴浩然); Meng LH (孟令辉); Xu S (徐爽); Xu B (徐波)
Adobe PDF(612Kb)  |  Views/Downloads: 104/41  |  Submitted: 2023/06/26
Knowledge Aware Emotion Recognition in Textual Conversations via Multi-Task Incremental Transformer  Conference paper
Barcelona, Spain (Online), 2020-12
Authors: Zhang, Duzhen; Chen, Xiuyi; Xu, Shuang; Xu, Bo
Adobe PDF(1596Kb)  |  Views/Downloads: 307/117  |  Submitted: 2022/06/27
Bridging the Gap between Prior and Posterior Knowledge Selection for Knowledge-Grounded Dialogue Generation  Conference paper
Online, 2020-11
Authors: Chen, Xiuyi; Meng, Fandong; Li, Peng; Chen, Feilong; Xu, Shuang; Xu, Bo; Zhou, Jie
Adobe PDF(389Kb)  |  Views/Downloads: 212/65  |  Submitted: 2022/06/27
Multidimensional Residual Learning Based on Recurrent Neural Networks for Acoustic Modeling  Conference paper
San Francisco, USA, September 8-12
Authors: Zhao, Yuanyuan; Xu, Shuang; Xu, Bo
Views/Downloads: 74/0  |  Submitted: 2020/10/27
Acoustic Modeling  Multidimensional Residual Learning  Long Short-term Memory Block  Row Convolution Layer  
Multilingual Recurrent Neural Networks with Residual Learning for Low-Resource Speech Recognition  Conference paper
Interspeech, Stockholm, 2017
Authors: Shiyu Zhou; Yuanyuan Zhao; Shuang Xu; Bo Xu
Views/Downloads: 91/0  |  Submitted: 2020/10/27
LSTM  Multilingual Speech Recognition  Low-resource  Residual Learning  Shared-hidden-layer
CBLDNN-Based Speaker-Independent Speech Separation via Generative Adversarial Training  Conference paper
Calgary, 2020-4
Authors: Li, Chenxing; Zhu, Lei; Xu, Shuang; Gao, Peng; Xu, Bo
Adobe PDF(791Kb)  |  Views/Downloads: 236/83  |  Submitted: 2020/07/21
Speech-Transformer: A No-Recurrence Sequence-to-Sequence Model for Speech Recognition  Conference paper
Calgary, Canada, 2018-04
Authors: Dong, Linhao; Xu, Shuang; Xu, Bo
Adobe PDF(640Kb)  |  Views/Downloads: 868/493  |  Submitted: 2020/06/13
speech recognition  sequence-to-sequence  attention  transformer