ASYNCHRONOUS STOCHASTIC GRADIENT DESCENT FOR DNN TRAINING
Shanshan, Zhang; Ce, Zhang; Zhao, You; Rong, Zheng; Bo, Xu
2013
Conference: 2013 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP)
Proceedings: IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference Date: 2013
Conference Location: Vancouver, Canada
Abstract:
It is well known that state-of-the-art speech recognition systems
using deep neural networks (DNNs) can greatly improve performance
compared with conventional GMM-HMM systems. The corresponding price,
however, is an immense training cost due to the enormous number of
DNN parameters. Unfortunately, the minibatch-based back-propagation
(BP) algorithm used in DNN training is difficult to parallelize
because of its frequent model updates.
In this paper we describe an effective approximation of
minibatch-based BP: asynchronous stochastic gradient descent (ASGD),
which parallelizes computation across multiple GPUs. The approach
manages several GPUs that work asynchronously, calculating gradients
and updating the global model parameters. Experimental results show
a 3.2x speed-up on 4 GPUs over a single GPU, with no loss in
recognition performance.
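The update scheme the abstract describes can be illustrated with a minimal single-process sketch (this is an assumption for illustration, not the paper's implementation): several workers, threads here standing in for GPUs, each compute gradients on their own minibatches against a possibly stale copy of the shared parameters and apply updates to the global model without waiting for one another. The toy quadratic loss and all names (`worker`, `gradient`, `make_batches`) are hypothetical.

```python
# Minimal sketch of asynchronous SGD: workers read the shared
# parameters (possibly stale), compute a gradient on a local
# minibatch, and apply the update to the global model. Only the
# cheap update step is serialized, so workers never wait for each
# other's gradient computation -- the core idea of ASGD.
import threading
import random

DIM = 4
LR = 0.1
params = [0.0] * DIM          # shared "global model parameters"
lock = threading.Lock()       # serializes only the update step

TARGET = [1.0, -2.0, 3.0, 0.5]

def gradient(p, batch):
    # Toy loss: 0.5 * sum_i (p[i] - sample[i])^2, averaged over batch.
    # Gradient w.r.t. p[i] is the mean residual p[i] - sample[i].
    return [sum(p[i] - s[i] for s in batch) / len(batch)
            for i in range(DIM)]

def worker(batches):
    for batch in batches:
        g = gradient(params, batch)   # read possibly-stale params
        with lock:                    # brief critical section
            for i in range(DIM):
                params[i] -= LR * g[i]

def make_batches(n, rng, size=8):
    # Noisy samples centered on TARGET, so the minimizer is TARGET.
    return [[[t + rng.gauss(0, 0.01) for t in TARGET]
             for _ in range(size)] for _ in range(n)]

rng = random.Random(0)
threads = [threading.Thread(target=worker, args=(make_batches(50, rng),))
           for _ in range(4)]         # 4 workers, mirroring 4 GPUs
for t in threads:
    t.start()
for t in threads:
    t.join()
# After training, params has converged near TARGET despite the
# asynchronous, overlapping updates.
```

The asynchrony means each gradient may be computed against parameters that other workers have since updated; the paper's result (no recognition loss at a 3.2x speed-up) is evidence that SGD tolerates this staleness in practice.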
Keywords: Deep Neural Network; Speech Recognition; Asynchronous SGD; GPU Parallelization
Indexed By: EI
Language: English
Document Type: Conference Paper
Identifier: http://ir.ia.ac.cn/handle/173211/11808
Collection: Digital Content Technology and Services Research Center, Auditory Models and Cognitive Computing
Corresponding Author: Shanshan, Zhang
Affiliation: Interactive Digital Media Technology Research Center, Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714):
Shanshan, Zhang; Ce, Zhang; Zhao, You; et al. ASYNCHRONOUS STOCHASTIC GRADIENT DESCENT FOR DNN TRAINING[C], 2013.
Files in This Item:
20121201085547_33344 (260 KB), Conference Paper, Open Access, License: CC BY-NC-SA
File Name: 20121201085547_333442_3735.pdf
Format: Adobe PDF
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.