CASIA OpenIR

Browse/Search Results: 6 items, showing 1-6

A Comparison of Modeling Units in Sequence-to-Sequence Speech Recognition with the Transformer on Mandarin Chinese (Conference Paper)
ICONIP, Siem Reap, Cambodia, 2018
Authors: Shiyu Zhou; Linhao Dong; Shuang Xu; Bo Xu
Views/Downloads: 82/0 | Submitted: 2020/10/27
Keywords: ASR; Multi-head Attention; Modeling Units; Sequence-to-sequence; Transformer
Syllable-Based Sequence-to-Sequence Speech Recognition with the Transformer in Mandarin Chinese (Conference Paper)
Interspeech, Hyderabad, India, 2018
Authors: Shiyu Zhou; Linhao Dong; Shuang Xu; Bo Xu
Views/Downloads: 89/0 | Submitted: 2020/10/27
Keywords: ASR; Multi-head Attention; Syllable-Based Acoustic Modeling; Sequence-to-sequence
Self-Attention Based Network for Punctuation Restoration (Conference Paper)
Beijing, China, August 20-24, 2018
Authors: Feng Wang; Wei Chen; Zhen Yang; Bo Xu
Views/Downloads: 25/0 | Submitted: 2020/10/27
Keywords: Punctuation Restoration; Self-attention
Concept Learning through Deep Reinforcement Learning with Memory-Augmented Neural Networks (Journal Article)
NEURAL NETWORKS, 2018, issue 1, pp. 1-27
Authors: Shi, Jing; Xu, Jiaming; Yao, Yiqun; Xu, Bo
Views/Downloads: 46/0 | Submitted: 2020/10/27
Keywords: One-shot Learning; Memory; Attention; Deep Reinforcement
Distant supervision for relation extraction with hierarchical selective attention (Journal Article)
NEURAL NETWORKS, 2018, vol. 108, pp. 240-247
Authors: Zhou, Peng; Xu, Jiaming; Qi, Zhenyu; Bao, Hongyun; Chen, Zhineng; Xu, Bo
Views/Downloads: 128/0 | Submitted: 2020/10/27
Keywords: Relation extraction; Distant supervision; Hierarchical attention; Piecewise convolutional neural networks
Speech-Transformer: A No-Recurrence Sequence-to-Sequence Model for Speech Recognition (Conference Paper)
Calgary, Canada, 2018-04
Authors: Dong, Linhao; Xu, Shuang; Xu, Bo
Adobe PDF (640 KB) | Views/Downloads: 818/483 | Submitted: 2020/06/13
Keywords: speech recognition; sequence-to-sequence; attention; transformer