Journal of Frontiers of Computer Science and Technology ›› 2023, Vol. 17 ›› Issue (9): 2174-2183. DOI: 10.3778/j.issn.1673-9418.2207090

• Artificial Intelligence · Pattern Recognition •

Reserved Hierarchy-Based Knowledge Graph Embedding for Link Prediction

QIAN Fulan (钱付兰), WANG Wenxue (王文学), ZHENG Wenjie (郑文杰), CHEN Jie (陈洁), ZHAO Shu (赵姝)

  1. College of Computer Science and Technology, Anhui University, Hefei 230601, China
  2. Key Laboratory of Intelligent Computing & Signal Processing, Ministry of Education, Hefei 230601, China
  3. Information Materials and Intelligent Sensing Laboratory of Anhui Province, Hefei 230601, China
  4. School of Computer Science and Engineering, Nanjing University of Science and Technology, Nanjing 210094, China
  • Online: 2023-09-01    Published: 2023-09-01

Abstract: Knowledge graph embedding (KGE) is an important tool for predicting the missing links of knowledge graphs (KGs). It embeds the entities and relations of a KG into a continuous low-dimensional space while preserving as much of the information implicit in the original data as possible. Recently, several KGE methods have modeled the semantic hierarchies that are common in KGs with a polar coordinate system, improving performance on the link prediction task. However, when modeling relations, these methods use only a simple scaling transformation and focus excessively on the hierarchical difference between entities, which limits the fitting capacity of the model. To address this issue, this paper proposes reserved hierarchy-based knowledge graph embedding (RHKE), which takes the hierarchy of each entity itself into account when modeling relations. Specifically, RHKE introduces a mixed transformation containing a proportion term and a bias term, so that the scaling transformation is dominated by the bias term when the entity hierarchy is low and by the proportion term when it is high. In addition, because the original hierarchy of an entity is lost after the mixed transformation, RHKE adds a hierarchy correction term, which combines the original hierarchies of the head and tail entities in different proportions and attaches the result to the relation as additional information. Experiments on several public datasets show that RHKE outperforms existing semantic hierarchy models on the link prediction task.
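For readers familiar with polar-coordinate KGE models such as HAKE, the LaTeX sketch below illustrates one plausible form of the modulus (hierarchy) part that the abstract describes; the symbols $h_m$, $t_m$, $r_m$, $r_b$, $\alpha$, $\beta$ and the $L_2$ distance are illustrative assumptions rather than the paper's published equations.

% Illustrative sketch only (HAKE-style notation; not the paper's exact formulation).
% h_m, t_m   : modulus (hierarchy) embeddings of the head and tail entities
% r_m, r_b   : proportion term and bias term of the relation in the mixed transformation
% alpha, beta: assumed mixing weights of the hierarchy correction term
\begin{aligned}
  \text{mixed transformation:} \quad & \phi(h_m, r) = h_m \circ r_m + r_b, \\
  \text{hierarchy correction:} \quad & c(h_m, t_m) = \alpha \circ h_m + \beta \circ t_m, \\
  \text{modulus-part distance:} \quad & d_m(h, t) = \lVert \phi(h_m, r) + c(h_m, t_m) - t_m \rVert_2 .
\end{aligned}

Under this reading, the bias term $r_b$ dominates $\phi$ when the entries of $h_m$ are small (low hierarchy), while the proportion term $r_m$ dominates when they are large, and $c$ re-injects the original hierarchies of the head and tail entities that the transformation alone would discard.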

Key words: knowledge graph embedding (KGE), link prediction, semantic hierarchy, polar coordinate system