Different Contexts Lead to Different Word Embeddings
Hu WP(胡文鹏); Zhang JJ(张家俊); Zheng N(郑楠)
2016
Published in: Proceedings of COLING 2016: Technical Papers
Abstract

Recent work on learning word representations has been applied successfully to many NLP applications, such as sentiment analysis and question answering. However, most of these models assume a single vector per word type, without considering polysemy and homonymy. In this paper, we present an extension to the CBOW model which not only improves the quality of embeddings but also makes embeddings suitable for polysemous words. It differs from most related work in that it learns one semantic center embedding and one context bias per word type, instead of training multiple embeddings per type. Different contexts lead to different biases, where the bias is defined as the weighted average of the embeddings of the local context. Experimental results on word similarity and analogy tasks show that the word representations learned by the proposed method outperform competitive baselines.
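The core idea in the abstract can be illustrated with a small sketch: each word type keeps a single "semantic center" vector, and its representation in a given sentence is that center shifted by a bias computed as a weighted average of the surrounding context words' embeddings. The variable names, the uniform context weighting, and the toy vocabulary below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = {"bank": 0, "river": 1, "money": 2, "deposit": 3, "water": 4}
dim = 8
# One semantic center vector per word type (randomly initialized here;
# in the paper these would be trained with a CBOW-style objective).
center = rng.normal(size=(len(vocab), dim))

def contextual_embedding(word, context, weights=None):
    """Center embedding of `word` plus a context bias: the weighted
    average of the embeddings of the local context words."""
    ids = [vocab[w] for w in context]
    if weights is None:  # assumption: uniform weights over the context window
        weights = np.ones(len(ids)) / len(ids)
    bias = np.average(center[ids], axis=0, weights=weights)
    return center[vocab[word]] + bias

# Different contexts lead to different embeddings for the same word type:
v1 = contextual_embedding("bank", ["river", "water"])
v2 = contextual_embedding("bank", ["money", "deposit"])
print(np.allclose(v1, v2))  # the two senses get distinct vectors
```

Because only one center vector and one bias computation are stored per word type, the model avoids the sense-inventory problem of multi-prototype approaches while still producing context-dependent representations.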


Indexed by: EI
Document type: Conference paper
Identifier: http://ir.ia.ac.cn/handle/173211/14714
Collection: State Key Laboratory of Management and Control for Complex Systems, Complex Systems Research
Affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended citation:
GB/T 7714
Hu WP, Zhang JJ, Zheng N. Different Contexts Lead to Different Word Embeddings[J]. Proceedings of COLING 2016: Technical Papers, 2016.
APA: Hu WP, Zhang JJ, & Zheng N. (2016). Different Contexts Lead to Different Word Embeddings. Proceedings of COLING 2016: Technical Papers.
MLA: Hu WP, et al. "Different Contexts Lead to Different Word Embeddings." Proceedings of COLING 2016: Technical Papers (2016).
Files in this item:
Different Contexts Lead to Different Word Embeddings.pdf (265 KB) | Conference paper | Open access | CC BY-NC-SA
 
