Institutional Repository of Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
Tucker decomposition-based temporal knowledge graph completion
Shao, Pengpeng1; Zhang, Dawei; Yang, Guohua; Tao, Jianhua; Che, Feihu; Liu, Tong
Source Publication | KNOWLEDGE-BASED SYSTEMS
ISSN | 0950-7051 |
Date Issued | 2022-02-28
Volume | 238
Pages | 9
Abstract | Knowledge graphs have been demonstrated to be an effective tool for numerous intelligent applications. However, a large amount of valuable knowledge still exists implicitly in knowledge graphs. To enrich existing knowledge graphs, many algorithms for link prediction and knowledge graph embedding have been designed in recent years to infer new facts. However, most of these studies focus on static knowledge graphs and ignore the temporal information that reflects the validity of knowledge. Developing models for temporal knowledge graph completion is therefore an increasingly important task. In this paper, we build a new tensor decomposition model for temporal knowledge graph completion inspired by the Tucker decomposition of an order-4 tensor. To further improve the basic model's performance, we provide three methods (cosine similarity, contrastive learning, and reconstruction based) for incorporating prior knowledge into the proposed model. Because the core tensor of the proposed model contains a large number of parameters, we present two embedding regularization schemes to avoid over-fitting. By combining these two regularization schemes with the proposed model, our model outperforms baselines by a clear margin on three temporal datasets (ICEWS2014, ICEWS05-15, and GDELT). (c) 2021 Published by Elsevier B.V. |
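The abstract's core idea is to score a temporal fact (subject, relation, object, timestamp) by contracting a shared order-4 core tensor with one embedding vector per mode, in the style of a Tucker decomposition. The following is a minimal sketch of that scoring function, not the paper's exact model; the dimension sizes, random initialization, and function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (hypothetical, not taken from the paper):
n_ent, n_rel, n_time, d = 50, 10, 20, 8

# Embedding matrices and the shared order-4 core tensor W.
E = rng.normal(size=(n_ent, d))    # entity embeddings (subjects and objects)
R = rng.normal(size=(n_rel, d))    # relation embeddings
T = rng.normal(size=(n_time, d))   # timestamp embeddings
W = rng.normal(size=(d, d, d, d))  # core tensor: holds most of the parameters,
                                   # which motivates the paper's regularization

def score(s, r, o, t):
    """Tucker-style score for one quadruple: contract the core tensor
    with the four factor vectors, one along each mode."""
    return np.einsum('ijkl,i,j,k,l->', W, E[s], R[r], E[o], T[t])

def score_all_objects(s, r, t):
    """Score every candidate object at once, as used in link prediction:
    leave the object mode uncontracted over the full entity matrix."""
    return np.einsum('ijkl,i,j,nk,l->n', W, E[s], R[r], E, T[t])
```

Ranking the scores from `score_all_objects` against the true object is what metrics such as MRR and Hits@k are computed from in completion benchmarks like ICEWS2014.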
Keyword | Temporal knowledge graphs; Tucker decomposition; Reconstruction; Contrastive learning |
DOI | 10.1016/j.knosys.2021.107841 |
Indexed By | SCI |
Language | English |
Funding Project | National Key Research & Development Plan of China[2017YFC0820602] ; National Natural Science Foundation of China (NSFC)[61831022] ; National Natural Science Foundation of China (NSFC)[61771472] ; National Natural Science Foundation of China (NSFC)[61773379] ; National Natural Science Foundation of China (NSFC)[61901473] |
Funding Organization | National Key Research & Development Plan of China ; National Natural Science Foundation of China (NSFC) |
WOS Research Area | Computer Science |
WOS Subject | Computer Science, Artificial Intelligence |
WOS ID | WOS:000779180700014 |
Publisher | ELSEVIER |
Sub direction classification | Knowledge Representation and Reasoning |
Planning direction of the national key laboratory | Social Information Perception and Understanding |
Paper associated data | Yes |
Document Type | Journal Article |
Identifier | http://ir.ia.ac.cn/handle/173211/48253 |
Collection | National Laboratory of Pattern Recognition_Intelligent Interaction |
Corresponding Author | Tao, Jianhua |
Affiliation | 1.Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing, Peoples R China 2.Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing, Peoples R China 3.CAS Ctr Excellence Brain Sci & Intelligence Techn, Beijing, Peoples R China |
First Author Affiliation | Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China |
Corresponding Author Affiliation | Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China |
Recommended Citation GB/T 7714 | Shao, Pengpeng,Zhang, Dawei,Yang, Guohua,et al. Tucker decomposition-based temporal knowledge graph completion[J]. KNOWLEDGE-BASED SYSTEMS,2022,238:9. |
APA | Shao, Pengpeng,Zhang, Dawei,Yang, Guohua,Tao, Jianhua,Che, Feihu,&Liu, Tong.(2022).Tucker decomposition-based temporal knowledge graph completion.KNOWLEDGE-BASED SYSTEMS,238,9. |
MLA | Shao, Pengpeng,et al."Tucker decomposition-based temporal knowledge graph completion".KNOWLEDGE-BASED SYSTEMS 238(2022):9. |
Files in This Item:
File Name/Size | DocType | Version | Access | License
Tucker decomposition(611KB) | Journal Article | Author's Accepted Manuscript | Open Access | CC BY-NC-SA
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.