SHTM: A Neocortex-inspired Algorithm for One-shot Text Generation
Wang YW(王寓巍)1,2; Ceng Y(曾毅)1,2; Xu B(徐波)1,2
2016-10
Conference Name: 2016 IEEE International Conference on Systems, Man, and Cybernetics
Conference Dates: 2016-10-09 to 2016-10-12
Conference Location: Budapest, Hungary
Abstract
Text generation is a typical natural language processing task and the basis of machine translation and question answering. Deep learning techniques can achieve good performance on this task, provided that a large number of parameters and a large amount of training data are available. However, human beings do not learn this way: people combine previously learned knowledge with new information from only a few samples. This process is called one-shot learning. In this paper, we propose a neocortex-based computational model, the Semantic Hierarchical Temporal Memory model (SHTM), for one-shot text generation. The model is refined from the Hierarchical Temporal Memory (HTM) model. An LSTM model is used for comparison. Results on three public datasets show that SHTM performs much better than LSTM on the measures of mean precision and BLEU score. In addition, we apply the SHTM model to question answering in the fashion of text generation and verify its superiority.
Keywords: One-shot Learning; HTM; Text Generation
Indexed By: EI
Document Type: Conference Paper
Identifier: http://ir.ia.ac.cn/handle/173211/14744
Collection: Research Center for Digital Content Technology and Services, Auditory Models and Cognitive Computing
Affiliations: 1. Institute of Automation, Chinese Academy of Sciences, Beijing, China
2.Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai, China
Recommended Citation (GB/T 7714):
Wang YW,Ceng Y,Xu B. SHTM: A Neocortex-inspired Algorithm for One-shot Text Generation[C],2016.
Files in This Item:
File Name/Size | Document Type | Access | License
1238_smc2016.pdf (1600KB) | Conference Paper | Open Access | CC BY-NC-SA

Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.