Gating Recurrent Enhanced Memory Neural Networks on Language Identification
Wang Geng; Yuanyan Zhao; Wenfu Wang; Xinyuan Cai; Bo Xu
Date: 2016-09
Conference Name: InterSpeech 2016
Source Publication: InterSpeech 2016
Conference Date: 2016.9.8-2016.9.12
Conference Place: San Francisco, USA
Abstract: This paper proposes a novel memory neural network structure, namely the gating recurrent enhanced memory network (GREMN), to model long-range dependencies in temporal series for the language identification (LID) task at the acoustic frame level. The proposed GREMN is a stacked gating recurrent neural network (RNN) equipped with a learnable enhanced memory block near the classifier. It aims at capturing the long-span history and certain future contextual information of the sequential input. In addition, two optimization strategies are proposed: a coherent SortaGrad-like training mechanism and a hard sample score acquisition approach. These optimization policies drastically boost this memory-network-based LID system, especially on training materials with large disparity. The experimental results confirm that the proposed GREMN possesses strong sequential modeling and generalization ability: about a 5% relative equal error rate (EER) reduction is obtained compared with similarly sized gating RNNs, and a 38.5% performance improvement is observed compared with a conventional i-Vector based LID system.
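The record contains no code; as a rough illustration of the architecture described in the abstract, the following is a minimal PyTorch sketch, assuming a stack of GRU layers as the gating RNN, an attention-style pooling layer standing in for the learnable enhanced memory block, and a linear language classifier. All class names, dimensions, and the pooling mechanism are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a GREMN-style LID model. The stack of GRU layers, the
# attention-style pooling "memory block", and all sizes are assumptions made
# for illustration; they are not taken from the paper or its released code.
import torch
import torch.nn as nn


class EnhancedMemoryBlock(nn.Module):
    """Hypothetical learnable pooling block: summarizes the frame sequence
    into a fixed-size utterance vector using learned attention weights."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.score = nn.Linear(hidden_dim, 1)  # one attention score per frame

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, hidden_dim)
        weights = torch.softmax(self.score(frames), dim=1)  # (batch, time, 1)
        return (weights * frames).sum(dim=1)                # (batch, hidden_dim)


class GREMNSketch(nn.Module):
    """Stacked gating RNN (GRU) + enhanced memory block + language classifier."""

    def __init__(self, feat_dim=40, hidden_dim=512, num_layers=3, num_langs=10):
        super().__init__()
        self.rnn = nn.GRU(feat_dim, hidden_dim, num_layers=num_layers,
                          batch_first=True)
        self.memory = EnhancedMemoryBlock(hidden_dim)
        self.classifier = nn.Linear(hidden_dim, num_langs)

    def forward(self, acoustic_frames: torch.Tensor) -> torch.Tensor:
        # acoustic_frames: (batch, time, feat_dim), e.g. filterbank features
        frame_states, _ = self.rnn(acoustic_frames)
        utterance_vec = self.memory(frame_states)
        return self.classifier(utterance_vec)  # per-language logits


if __name__ == "__main__":
    model = GREMNSketch()
    dummy = torch.randn(8, 200, 40)  # 8 utterances, 200 frames, 40-dim features
    print(model(dummy).shape)        # torch.Size([8, 10])
```

The SortaGrad-like training curriculum and the hard sample score acquisition approach mentioned in the abstract are training-time strategies and are not reflected in this architectural sketch.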
Keywords: Language Identification; Gating Recurrent Neural Networks; Learnable Enhanced Memory Block; SortaGrad-like Training Approach; Hard Sample Score Acquisition
Document Type: Conference Paper
Identifier: http://ir.ia.ac.cn/handle/173211/12484
Collection: Digital Content Technology and Services Research Center _ Auditory Model and Cognitive Computing
Corresponding Author: Xinyuan Cai
Affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714):
Wang Geng, Yuanyan Zhao, Wenfu Wang, et al. Gating Recurrent Enhanced Memory Neural Networks on Language Identification[C], 2016.
Files in This Item:
Gating Recurrent Enhanced Memory Neural Networks on Language Identification.pdf (529 KB), Conference Paper, Open Access, CC BY-NC-SA
 

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.