Gating Recurrent Enhanced Memory Neural Networks on Language Identification
Wang Geng; Yuanyan Zhao; Wenfu Wang; Xinyuan Cai; Bo Xu
2016-09
Conference: InterSpeech 2016
Proceedings: InterSpeech 2016
Conference dates: 2016.9.8-2016.9.12
Location: San Francisco, USA
Abstract: This paper proposes a novel memory neural network structure,
namely gating recurrent enhanced memory network (GREMN),
to model long-range dependency in temporal series on language
identification (LID) task at the acoustic frame level. The proposed
GREMN is a stacking gating recurrent neural network
(RNN) equipped with a learnable enhanced memory block near
the classifier. It aims at capturing the long-span history and
certain future contextual information of the sequential input. In
addition, two optimization strategies of coherent SortaGrad-like
training mechanism and a hard sample score acquisition approach
are proposed. The proposed optimization policies drastically
boost this memory network based LID system, especially
on training materials with large disparities. The experimental
results confirm that the proposed GREMN possesses strong
sequential modeling and generalization ability: about a 5% relative
equal error rate (EER) reduction is obtained compared with
similarly sized gating RNNs, and a 38.5% performance improvement
is observed over a conventional i-vector based LID system.
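The architecture the abstract describes, a stack of gating RNN layers with a learnable memory block between the top recurrent layer and the classifier, can be sketched minimally in pure Python. This is an illustrative sketch only: the GRU-style gating cells, the dot-product attention used as the "enhanced memory block", and all dimensions and weight initializations here are assumptions for exposition, not the paper's exact design.

```python
# Sketch of the GREMN idea: stacked gating RNN layers, a memory block that
# summarizes the whole hidden-state history, then a softmax language classifier.
# All weights and the attention form are hypothetical illustrations.
import math
import random

random.seed(0)

def rand_matrix(rows, cols):
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]

def matvec(W, x):
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

class GRUCell:
    """One gating recurrent layer (GRU-style update/reset gates)."""
    def __init__(self, in_dim, hid_dim):
        self.hid = hid_dim
        self.Wz = rand_matrix(hid_dim, in_dim + hid_dim)  # update gate
        self.Wr = rand_matrix(hid_dim, in_dim + hid_dim)  # reset gate
        self.Wh = rand_matrix(hid_dim, in_dim + hid_dim)  # candidate state

    def step(self, x, h):
        xh = x + h
        z = [sigmoid(v) for v in matvec(self.Wz, xh)]
        r = [sigmoid(v) for v in matvec(self.Wr, xh)]
        xrh = x + [ri * hi for ri, hi in zip(r, h)]
        h_tilde = [math.tanh(v) for v in matvec(self.Wh, xrh)]
        return [(1 - zi) * hi + zi * hti for zi, hi, hti in zip(z, h, h_tilde)]

class GREMNSketch:
    def __init__(self, in_dim, hid_dim, n_layers, n_langs):
        dims = [in_dim] + [hid_dim] * n_layers
        self.layers = [GRUCell(dims[i], hid_dim) for i in range(n_layers)]
        # Learnable query of the memory block (a hypothetical attention form).
        self.query = [random.uniform(-0.1, 0.1) for _ in range(hid_dim)]
        self.Wout = rand_matrix(n_langs, hid_dim)

    def forward(self, frames):
        hs = [[0.0] * cell.hid for cell in self.layers]
        memory = []  # history of top-layer states (long-span context)
        for x in frames:
            inp = x
            for i, cell in enumerate(self.layers):
                hs[i] = cell.step(inp, hs[i])
                inp = hs[i]
            memory.append(list(hs[-1]))
        # Memory block: attention-weighted summary over the stored history.
        scores = [sum(q * m for q, m in zip(self.query, s)) for s in memory]
        mx = max(scores)
        exp_s = [math.exp(s - mx) for s in scores]
        total = sum(exp_s)
        weights = [e / total for e in exp_s]
        summary = [sum(w * s[j] for w, s in zip(weights, memory))
                   for j in range(len(self.query))]
        # Classifier: softmax over language logits.
        logits = matvec(self.Wout, summary)
        mxl = max(logits)
        exp_l = [math.exp(v - mxl) for v in logits]
        z = sum(exp_l)
        return [e / z for e in exp_l]

model = GREMNSketch(in_dim=4, hid_dim=8, n_layers=2, n_langs=3)
frames = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(10)]
probs = model.forward(frames)
```

The sketch returns a per-utterance posterior over languages; the attention over the full state history stands in for the paper's enhanced memory block, which captures long-span history and some future context of the sequence.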
Keywords: Language Identification; Gating Recurrent Neural Networks; Learnable Enhanced Memory Block; SortaGrad-like Training Approach; Hard Sample Score Acquisition
Document type: Conference paper
Identifier: http://ir.ia.ac.cn/handle/173211/41094
Collection: Laboratory of Cognition and Decision Intelligence for Complex Systems, Auditory Models and Cognitive Computing
Corresponding author: Xinyuan Cai
Recommended citation (GB/T 7714):
Wang Geng,Yuanyan Zhao,Wenfu Wang,et al. Gating Recurrent Enhanced Memory Neural Networks on Language Identification[C],2016.
Files in this item: none.

Unless otherwise noted, all content in this system is protected by copyright, with all rights reserved.