CASIA OpenIR > State Key Laboratory of Pattern Recognition > Natural Language Processing
Conditional Generative Adversarial Networks for Commonsense Machine Comprehension
Wang Bingning¹,²; Liu Kang¹; Zhao Jun¹,²
2017-08
Conference: Twenty-Sixth International Joint Conference on Artificial Intelligence (IJCAI)
Pages: 4123-4129
Conference Date: 2017-8-19
Conference Location: Australia
Abstract: The recently proposed Story Cloze Test [Mostafazadeh et al., 2016] is a commonsense machine comprehension task addressing natural language understanding. The dataset contains many story tests that require commonsense inference. Unfortunately, the training data is almost unsupervised: each context document is followed by only one positive sentence that can be inferred from the context, whereas at test time we must choose between two candidate sentences. To tackle this problem, we employ generative adversarial networks (GANs) to generate fake sentences. We propose a Conditional GAN (CGAN) in which the generator is conditioned on the context. Our experiments show the advantage of CGANs in discriminating sentences, achieving state-of-the-art results on the commonsense story reading comprehension task compared with previous feature-engineering and deep-learning methods.
Keywords: Generative Adversarial Networks; Commonsense Reasoning; Machine Comprehension
Indexed By: EI
Document Type: Conference Paper
Identifier: http://ir.ia.ac.cn/handle/173211/20032
Collection: State Key Laboratory of Pattern Recognition, Natural Language Processing
Corresponding Author: Liu Kang
Affiliations:
1. Institute of Automation, Chinese Academy of Sciences
2. University of Chinese Academy of Sciences
Recommended Citation (GB/T 7714):
Wang Bingning,Liu Kang,Zhao Jun. Conditional Generative Adversarial Networks for Commonsense Machine Comprehension[C],2017:4123-4129.
Files in This Item:
Conditional Generative Adversarial Networks for Commonsense Machine Comprehension.pdf (404KB) | Conference Paper | Open Access | License: CC BY-NC-SA

Unless otherwise stated, all content in this system is copyright-protected, with all rights reserved.