Different Contexts Lead to Different Word Embeddings
Authors | Hu WP (胡文鹏); Zhang JJ (张家俊); Zheng N (郑楠)
Published In | Proceedings of COLING 2016: Technical Papers
Year | 2016
Abstract | Recent work on learning word representations has been applied successfully to many NLP applications, such as sentiment analysis and question answering. However, most of these models assume a single vector per word type, without accounting for polysemy and homonymy. In this paper, we present an extension to the CBOW model that not only improves the quality of embeddings but also makes them suitable for polysemous words. It differs from most related work in that it learns one semantic center embedding and one context bias, instead of training multiple embeddings per word type. A different context leads to a different bias, which is defined as the weighted average of the embeddings of the local context. Experimental results on word similarity and word analogy tasks show that the word representations learned by the proposed method outperform competitive baselines.
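The abstract outlines the core idea: each word type keeps a single semantic center embedding, and its representation in a given context is shifted by a context bias computed as a weighted average of the surrounding words' embeddings. Below is a minimal, hypothetical sketch of that composition in NumPy. The toy vocabulary, dimensionality, random vectors, and uniform averaging weights are illustrative assumptions, not the authors' trained model, which learns these parameters jointly inside CBOW.

```python
import numpy as np

# Illustrative sketch (not the paper's code): one "semantic center"
# vector per word type, plus a context bias that is a weighted average
# of the embeddings of nearby words. All values here are assumptions.
rng = np.random.default_rng(0)
vocab = {"bank": 0, "river": 1, "money": 2, "deposit": 3, "water": 4}
dim = 8
center = rng.normal(size=(len(vocab), dim))  # one center embedding per word type

def context_bias(context_words, weights=None):
    """Weighted average of context word embeddings.

    Uniform weights are an assumption; the paper weights context words,
    but the exact scheme is not given in the abstract. The context words'
    center vectors stand in for whatever context representation is used.
    """
    idx = [vocab[w] for w in context_words]
    vecs = center[idx]
    if weights is None:
        weights = np.full(len(idx), 1.0 / len(idx))
    return weights @ vecs

def contextual_embedding(word, context_words):
    """Center embedding shifted by the context-dependent bias."""
    return center[vocab[word]] + context_bias(context_words)

# The same word type gets different vectors in different contexts:
v_finance = contextual_embedding("bank", ["money", "deposit"])
v_river = contextual_embedding("bank", ["river", "water"])
print(np.linalg.norm(v_finance - v_river) > 0)  # True: the vectors differ
```

The design point the sketch illustrates is the parameter economy: only one learned vector per word type is stored, yet every occurrence receives a context-specific representation, avoiding the need to fix a sense inventory or train multiple embeddings per word.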
Indexed By | EI
Document Type | Conference paper
Identifier | http://ir.ia.ac.cn/handle/173211/40834
Department | The State Key Laboratory of Management and Control for Complex Systems
Recommended Citation (GB/T 7714) | Hu WP, Zhang JJ, Zheng N. Different Contexts Lead to Different Word Embeddings[C]//Proceedings of COLING 2016: Technical Papers, 2016.
APA | Hu WP, Zhang JJ, & Zheng N. (2016). Different Contexts Lead to Different Word Embeddings. Proceedings of COLING 2016: Technical Papers.
MLA | Hu WP, et al. "Different Contexts Lead to Different Word Embeddings". Proceedings of COLING 2016: Technical Papers (2016).