Raker: A Relation-aware Knowledge Reasoning Model for Inductive Relation Prediction
Jiaqi Wang, Wengen Li, Yulou Shu, Jihong Guan, Yichao Zhang, Shuigeng Zhou
Inductive relation prediction, an important task in knowledge graph completion, aims to predict the relations between entities that are unseen at the training stage. The latest methods use pre-trained language models (PLMs) to encode the paths between the head entity and the tail entity, and achieve state-of-the-art prediction performance. However, these methods cannot handle no-path scenarios well and lack the capability to learn comprehensive relation representations for distinguishing different relations. To tackle these issues, we propose a novel Relation-aware knowledge reasoning model named Raker, which introduces an adaptive reasoning information extraction method to identify relation-aware reasoning neighbors of the entities in the target triple, thereby handling no-path scenarios, and enables the PLM to better distinguish different relations via relation-specific soft prompting. Raker is evaluated on three public datasets and achieves state-of-the-art performance in inductive relation prediction compared with the baseline methods. Notably, the absolute improvement of Raker exceeds 5% on the FB15k-237 dataset in the inductive setting. Moreover, Raker also demonstrates superiority in the transductive, few-shot, and unseen-relations settings. The code of Raker is available at