Frontiers of Data and Computing ›› 2023, Vol. 5 ›› Issue (3): 123-137.

CSTR: 32002.14.jfdc.CN10-1649/TP.2023.03.009

doi: 10.11871/jfdc.issn.2096-742X.2023.03.009

• Technology and Application •

Hierarchical Attention-Based Bidirectional Long Short-Term Memory Networks for Knowledge Graph Completion

ZHANG Xiaofan, SUN Haichun, LI Xin*

  1. Information and Network Security College, People's Public Security University of China, Beijing 100038, China
  • Received:2022-05-17 Online:2023-06-20 Published:2023-06-21

Abstract:

[Objective] Most path-based embedding methods for knowledge graph completion neglect the local and global features hidden in the relation paths between two entities. To exploit both, this paper proposes a bidirectional long short-term memory network with a hierarchical attention mechanism that processes relation paths at both the entity-relation level and the relation-path level. [Methods] The relation paths between two entities are first collected and vectorized, taking into account the entity types and the relations along each path. A BiLSTM with hierarchical attention layers then encodes the paths into a low-dimensional space, capturing the important facts at each level. Finally, a prediction is made from the similarity between the resulting feature vector and the candidate relations. [Results] Link prediction experiments were conducted on several datasets, including NELL-995 and FB15k-237. The MAP score of the HAN-BiLSTM model is 1.8% higher than that of baseline methods such as CNN-BiLSTM, and Hits@1 improves by 1.4%. The model achieves a Hits@3 score of 0.988 on the Kinship dataset. [Conclusions] The experimental results show that the proposed algorithm effectively extracts both the global and local features of relation paths, thereby improving the effect of knowledge graph completion.
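The two-level encoder described in the abstract can be sketched as follows. This is a minimal illustration in PyTorch, not the authors' implementation: all dimensions, the attention form (a learned linear scorer with softmax), and the cosine-similarity scoring rule are assumptions made for the sketch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HANBiLSTM(nn.Module):
    """Sketch of a hierarchical-attention BiLSTM path encoder.

    Level 1 attends over the entity/relation steps within each path;
    level 2 attends over the per-path summaries. Hyperparameters and
    the scoring rule are illustrative, not the paper's exact setup.
    """
    def __init__(self, vocab_size, emb_dim=64, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Entity-relation level: BiLSTM over the steps of one path
        self.step_lstm = nn.LSTM(emb_dim, hidden_dim,
                                 bidirectional=True, batch_first=True)
        self.step_attn = nn.Linear(2 * hidden_dim, 1)
        # Relation-path level: BiLSTM over the per-path summaries
        self.path_lstm = nn.LSTM(2 * hidden_dim, hidden_dim,
                                 bidirectional=True, batch_first=True)
        self.path_attn = nn.Linear(2 * hidden_dim, 1)
        # Project the path-set summary back into the embedding space
        self.rel_proj = nn.Linear(2 * hidden_dim, emb_dim)

    def forward(self, paths):
        # paths: (n_paths, path_len) ids of entities/relations on each path
        h, _ = self.step_lstm(self.embed(paths))      # (n_paths, len, 2H)
        a = F.softmax(self.step_attn(h), dim=1)       # step-level weights
        path_vecs = (a * h).sum(dim=1).unsqueeze(0)   # (1, n_paths, 2H)
        g, _ = self.path_lstm(path_vecs)              # (1, n_paths, 2H)
        b = F.softmax(self.path_attn(g), dim=1)       # path-level weights
        summary = (b * g).sum(dim=1).squeeze(0)       # (2H,)
        return self.rel_proj(summary)                 # feature vector

    def score(self, paths, rel_ids):
        # Similarity between the path feature and candidate relations
        feat = self.forward(paths)
        rels = self.embed(rel_ids)
        return F.cosine_similarity(feat.unsqueeze(0), rels, dim=-1)

model = HANBiLSTM(vocab_size=100)
paths = torch.randint(0, 100, (3, 5))   # 3 toy paths of length 5
scores = model.score(paths, torch.tensor([1, 2, 3]))
```

At prediction time, the candidate relation with the highest similarity score would be returned as the completed link; training would fit the attention and LSTM weights so that observed triples score higher than corrupted ones.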

Key words: knowledge graph completion, path-based reasoning, hierarchical attention networks, bidirectional long short-term memory networks