Multilingual Tandem Bottleneck Feature For Language Identification
Wang Geng; Jie Li; Shanshan Zhang; Xinyuan Cai; Bo Xu
2015-09
Conference: Interspeech 2015
Proceedings: Interspeech 2015
Conference Dates: 2015.9.6-2015.9.10
Location: Dresden, Germany
Abstract: The deep bottleneck (BN) feature based i-vector solution has recently become a popular pipeline for language identification (LID). However, issues such as how to extract more effective BN features and how to fully exploit the features extracted from deep neural networks (DNNs) are still not well investigated. In this paper, these issues are tackled empirically as follows. First, two novel types of deep features, phone-discriminant and triphone-discriminant, are extracted. Then, DNNs are trained both separately and jointly on multilingual corpora to produce different BN features. Finally, the deep BN features are combined in tandem fashion to build enhanced deep features. Experimental results show that systems built on top of the tandem deep features obtain, on average, 19% and 42% relative equal error rate reductions on NIST LRE 2007 over counterparts built on traditional deep BN features and on a cepstral-feature-based LID system, respectively.
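The tandem construction the abstract describes, concatenating frame-level bottleneck activations from separately trained phone-discriminant and triphone-discriminant DNNs into one enhanced feature that then feeds the i-vector back-end, can be sketched roughly as follows. This is an illustrative sketch, not the authors' implementation: the layer sizes, the `bottleneck_forward` helper, and the toy random weights are all assumptions.

```python
import numpy as np

def bottleneck_forward(frames, weights, biases):
    """Propagate acoustic frames through DNN layers up to the
    bottleneck layer and return its (linear) activations."""
    h = frames
    for i, (W, b) in enumerate(zip(weights, biases)):
        h = h @ W + b
        if i < len(weights) - 1:          # sigmoid on hidden layers,
            h = 1.0 / (1.0 + np.exp(-h))  # linear bottleneck output
    return h

rng = np.random.default_rng(0)

def toy_dnn(in_dim, hid_dim, bn_dim, n_hidden=2):
    """Random weights standing in for a trained phone- or
    triphone-discriminant DNN (illustration only)."""
    dims = [in_dim] + [hid_dim] * n_hidden + [bn_dim]
    Ws = [rng.standard_normal((a, b)) * 0.1 for a, b in zip(dims, dims[1:])]
    bs = [np.zeros(b) for b in dims[1:]]
    return Ws, bs

frames = rng.standard_normal((100, 39))   # 100 frames of e.g. MFCC input
phone_net = toy_dnn(39, 256, 40)          # phone-discriminant BN network
triphone_net = toy_dnn(39, 256, 40)       # triphone-discriminant BN network

bn_phone = bottleneck_forward(frames, *phone_net)
bn_triphone = bottleneck_forward(frames, *triphone_net)

# Tandem deep feature: frame-wise concatenation of the two BN streams,
# which would then feed the i-vector LID back-end.
tandem = np.concatenate([bn_phone, bn_triphone], axis=1)
print(tandem.shape)  # (100, 80)
```

The key design point is that the two networks are trained against different targets (monophone vs. tied-triphone states), so their bottleneck layers capture complementary discriminative information; the concatenation simply stacks the two streams per frame.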
Keywords: Language Identification; Deep Bottleneck Feature; Tandem Feature; Multi-deep Feature; Multi-training Procedure
Document Type: Conference paper
Identifier: http://ir.ia.ac.cn/handle/173211/12485
Collection: Research Center for Digital Content Technology and Services, Auditory Model and Cognitive Computing
Corresponding Author: Xinyuan Cai
Affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714):
Wang Geng, Jie Li, Shanshan Zhang, et al. Multilingual Tandem Bottleneck Feature For Language Identification[C], 2015.
File: Multilingual Tandem Bottleneck Feature For Language Identification.pdf (529 KB), Adobe PDF, open access, CC BY-NC-SA