CASIA OpenIR > National Laboratory of Pattern Recognition > Natural Language Processing
吴惠甲; 张家俊; 宗成庆
Source Publication: 软件学报 (Journal of Software)
Abstract: Category tagging is a sub-task of combinatory categorial grammar (CCG) parsing and can improve both the efficiency and the accuracy of a parser. The traditional maximum entropy model requires hand-crafted feature templates, whereas a neural network learns distributed representations of discrete features through its hidden layers and thus extracts the features needed for classification automatically. This paper applies such a model to category tagging: a neural language model is extended with a vectorized part-of-speech representation layer and a category representation layer, and the word, part-of-speech, and category vectors are updated jointly by back-propagation, so that their distributed representations are learned from data. In addition, beam search is used at prediction time to decode tag sequences, capturing the dependencies among tags. Experimental results show that both improvements raise the model's performance, making it about 1% more accurate than the traditional maximum entropy model on the category tagging task.
Other Abstract: As a sub-task of combinatory categorial grammar (CCG) parsing, category tagging can improve parsing efficiency and accuracy. While the traditional maximum entropy model solves this problem by hand-designing meaningful feature templates, a neural network can extract features automatically from distributed representations. This paper proposes a neural category tagging model with two improvements. First, the word embedding layer is extended with a part-of-speech embedding layer and a category embedding layer, so that their distributed representations are learned jointly by the back-propagation algorithm. Second, beam search is used during decoding to capture the dependencies among tags. Together, these two improvements make the proposed model more accurate than the state-of-the-art maximum-entropy tagger (by up to 1%).
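The two improvements described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the vocabulary sizes, embedding dimensions, random weights, and function names below are all toy assumptions. It shows (a) a feed-forward tagger whose input concatenates word, part-of-speech, and previous-category embeddings, and (b) beam-search decoding, which keeps several candidate tag sequences instead of committing greedily, so later predictions can depend on earlier tags.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (assumptions for illustration; the paper's vocabularies are far larger).
N_WORDS, N_POS, N_CATS = 10, 5, 4
D = 8  # embedding dimension

# Three embedding tables; the paper learns all of them jointly via back-propagation.
# Here they are random, since only the decoding logic is being illustrated.
E_word = rng.normal(size=(N_WORDS, D))
E_pos = rng.normal(size=(N_POS, D))
E_cat = rng.normal(size=(N_CATS + 1, D))  # +1 row for a start-of-sentence tag

# One hidden layer followed by a softmax over categories.
W1 = rng.normal(size=(3 * D, 16))
W2 = rng.normal(size=(16, N_CATS))

def log_probs(word, pos, prev_cat):
    """Log-probabilities over categories for one position, given the
    concatenated word / POS / previous-category embeddings."""
    x = np.concatenate([E_word[word], E_pos[pos], E_cat[prev_cat]])
    h = np.tanh(x @ W1)
    z = h @ W2
    return z - np.log(np.exp(z).sum())  # log-softmax

def beam_search(words, pos_tags, beam=3):
    """Decode a category sequence, keeping the `beam` best partial
    hypotheses at each step instead of a single greedy choice."""
    hyps = [([], 0.0)]  # (category sequence so far, cumulative log-prob)
    for w, p in zip(words, pos_tags):
        expanded = []
        for seq, lp in hyps:
            prev = seq[-1] if seq else N_CATS  # start-of-sentence tag
            s = log_probs(w, p, prev)
            for c in range(N_CATS):
                expanded.append((seq + [c], lp + s[c]))
        hyps = sorted(expanded, key=lambda h: -h[1])[:beam]
    return hyps[0][0]

tags = beam_search([1, 2, 3], [0, 1, 2])
print(tags)
```

Because each position's score is conditioned on the previous category, a greedy decoder (beam=1) can lock in an early tag that makes every later choice worse; widening the beam lets the decoder trade a slightly worse early tag for a better overall sequence, which is the dependency information the abstract refers to.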
Keywords: category tagging; distributed representation; neural language model
Document Type: Journal article
Recommended Citation
GB/T 7714: 吴惠甲, 张家俊, 宗成庆. 一种神经范畴标注模型[J]. 软件学报, 2016(27): 2691-2700.
APA: 吴惠甲, 张家俊, & 宗成庆. (2016). 一种神经范畴标注模型. 软件学报(27), 2691-2700.
MLA: 吴惠甲, et al. "一种神经范畴标注模型." 软件学报 27 (2016): 2691-2700.
Files in This Item:
File Name/Size: 一种神经范畴标注模型.pdf (446KB)
DocType: Journal article
Version: Author's accepted manuscript
Access: Open access
License: CC BY-NC-SA

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.