Improving Deep Neural Networks by Using Sparse Dropout Strategy
Zheng Hao; Mingming Chen; Wenju Liu; Zhanlei Yang; Shan Liang
2014
Conference Name: ChinaSIP
Proceedings: ChinaSIP
Conference Date: 2014
Conference Place: Xi'an, Shaanxi, China
Abstract: Recently, deep neural networks (DNNs) have achieved excellent results on acoustic modeling benchmarks for speech recognition. By randomly discarding network units, a strategy called dropout can improve the performance of DNNs by reducing over-fitting. However, the random dropout strategy treats units indiscriminately, which may lose information about the distribution of unit outputs. In this paper, we improve the dropout strategy by treating units differentially according to their outputs. Only minor changes to an existing neural network system are needed to achieve a significant improvement. Phone recognition experiments on TIMIT show that sparse dropout fine-tuning yields a significant performance improvement.
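The abstract contrasts standard dropout, which drops every unit with the same probability, with an output-aware variant. The paper's exact scheme is not reproduced on this record page, so the following is only an illustrative sketch under one plausible assumption: units with smaller activations are dropped with higher probability, so high-output (more informative) units are more likely to survive. The function names and the probability scaling are hypothetical, not the authors' formulation.

```python
import numpy as np

def standard_dropout(activations, p=0.5, rng=None):
    """Standard dropout: every unit is dropped with the same probability p."""
    if rng is None:
        rng = np.random.default_rng()
    mask = rng.random(activations.shape) >= p
    return activations * mask

def sparse_dropout(activations, p=0.5, rng=None):
    """Hypothetical output-aware dropout sketch (NOT the paper's exact method):
    the per-unit drop probability shrinks as the unit's activation magnitude
    grows, so near-zero units are discarded preferentially."""
    if rng is None:
        rng = np.random.default_rng()
    mag = np.abs(activations)
    rank = mag / (mag.max() + 1e-12)   # in [0, 1]; largest output -> rank 1
    drop_prob = p * (1.0 - rank)       # largest output -> drop probability 0
    mask = rng.random(activations.shape) >= drop_prob
    return activations * mask
```

Under this particular scaling, the unit with the maximum activation is never dropped, while units near zero are dropped at close to the base rate p; the real method in the paper may weight units differently.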
Keywords: Dropout; Sparse Dropout; Deep Neural Networks; Deep Learning
Indexed By: EI
Document Type: Conference Paper
Identifier: http://ir.ia.ac.cn/handle/173211/11776
Collection: National Laboratory of Pattern Recognition_Robot Vision
Corresponding Author: Hao Zheng
Affiliation: National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714):
Zheng Hao, Mingming Chen, Wenju Liu, et al. Improving Deep Neural Networks by Using Sparse Dropout Strategy[C], 2014.
Files in This Item: ChinaSIP-2014-1.pdf (146KB), Conference Paper, Open Access, License: CC BY-NC-SA