EAI Endorsed Transactions on Scalable Information Systems 25(4)

Editorial

Leveraging Relation Attention Mechanisms for Enhanced Knowledge Graph Completion with Embedding Translation

BibTeX
@ARTICLE{10.4108/eetsis.9117,
  author={Jiahao Shi and Zhengping Lin and Yuzhong Zhou and Yuliang Yang and Jie Lin},
  title={Leveraging Relation Attention Mechanisms for Enhanced Knowledge Graph Completion with Embedding Translation},
  journal={EAI Endorsed Transactions on Scalable Information Systems},
  volume={12},
  number={4},
  publisher={EAI},
  journal_a={SIS},
  year={2025},
  month={10},
  keywords={Knowledge graph, relation attention mechanism, embedding translation, performance evaluation},
  doi={10.4108/eetsis.9117}
}
Jiahao Shi1, Zhengping Lin1,*, Yuzhong Zhou1, Yuliang Yang1, Jie Lin1
  • 1: Electric Power Research Institute of China Southern Power Grid Company
*Contact email: zhengping_lin@hotmail.com

Abstract

In this paper, we propose a novel knowledge graph completion framework that leverages a relation-specific attention mechanism integrated with an embedding translation strategy to improve the accuracy and contextual understanding of link prediction. Unlike traditional models that rely on fixed transformation spaces, the proposed method dynamically captures fine-grained relational semantics by combining hierarchical candidate categorization, relation-guided entity projection, and asymmetric score functions. Specifically, the model employs K-means clustering and principal component analysis (PCA) to identify semantically consistent entity sets, and integrates attention-weighted multi-attribute embeddings to construct robust relational representations. A margin-based ranking loss with normalized embedding constraints ensures effective optimization, further supported by Xavier initialization and stochastic gradient descent. Extensive experiments on two benchmark datasets, WN18 and FB15K, demonstrate the superiority of the proposed method. On WN18, it achieves the lowest mean rank (MR) of 144, with competitive results in mean reciprocal rank (MRR) (0.902), Hits@1 (89.0%), Hits@3 (90.4%), and Hits@10 (96.3%), closely rivaling state-of-the-art models such as QuatE and ComplEx. On FB15K, it again delivers the best MR of 21, along with strong scores in MRR (0.831), Hits@1 (72.2%), Hits@3 (88.4%), and the highest Hits@10 (92.5%) among all compared methods.
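The abstract is all this page gives about the method, so the following is only a minimal sketch of how the described pieces could fit together in PyTorch: an attention-weighted, multi-attribute relation embedding used inside a translation-style score, trained with a margin-based ranking loss over L2-normalized entity embeddings, Xavier initialization, and plain SGD, plus a rough PCA + K-means helper standing in for the hierarchical candidate categorization. All class, function, and parameter names here (RelationAttentionTransE, n_attrs, candidate_clusters, the cluster and dimension counts) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class RelationAttentionTransE(nn.Module):
    """Translation-based scorer with an attention-weighted relation embedding (sketch)."""

    def __init__(self, n_entities, n_relations, dim=100, n_attrs=4):
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)
        # Several "attribute" embeddings per relation; a learned attention
        # vector mixes them into one relation representation (assumption).
        self.rel_attrs = nn.Parameter(torch.empty(n_relations, n_attrs, dim))
        self.attn = nn.Parameter(torch.empty(n_relations, n_attrs))
        # Xavier initialization, as stated in the abstract.
        nn.init.xavier_uniform_(self.ent.weight)
        nn.init.xavier_uniform_(self.rel_attrs)
        nn.init.xavier_uniform_(self.attn)

    def relation_vec(self, r):
        # Attention-weighted combination of the relation's attribute embeddings.
        w = F.softmax(self.attn[r], dim=-1)                  # (batch, n_attrs)
        return (w.unsqueeze(-1) * self.rel_attrs[r]).sum(1)  # (batch, dim)

    def score(self, h, r, t):
        # Translation-style distance: lower score = more plausible triple.
        # Entity embeddings are L2-normalized (normalized-embedding constraint).
        h_e = F.normalize(self.ent(h), dim=-1)
        t_e = F.normalize(self.ent(t), dim=-1)
        return torch.norm(h_e + self.relation_vec(r) - t_e, p=2, dim=-1)


def margin_ranking_loss(model, pos, neg, margin=1.0):
    # pos, neg: (batch, 3) index tensors of (head, relation, tail), where each
    # negative corrupts the head or the tail of the matching positive triple.
    pos_s = model.score(pos[:, 0], pos[:, 1], pos[:, 2])
    neg_s = model.score(neg[:, 0], neg[:, 1], neg[:, 2])
    return F.relu(margin + pos_s - neg_s).mean()


def candidate_clusters(entity_emb, n_clusters=50, n_components=32):
    # Rough stand-in for the hierarchical candidate categorization: PCA to a
    # lower-dimensional space, then K-means to group semantically consistent
    # entities, so link prediction can rank candidates within a cluster first.
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA
    reduced = PCA(n_components=n_components).fit_transform(entity_emb)
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(reduced)


# Plain SGD, as in the abstract; WN18 has 40,943 entities and 18 relations.
model = RelationAttentionTransE(n_entities=40943, n_relations=18)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
```

In a full training loop one would sample a corrupted head or tail for each observed triple, minimize the margin loss above, and report MR, MRR, and Hits@k by ranking all (or cluster-filtered) candidate entities per test triple.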

Keywords
Knowledge graph, relation attention mechanism, embedding translation, performance evaluation
Received: 2025-04-18
Accepted: 2025-09-19
Published: 2025-10-07
Publisher: EAI
DOI: http://dx.doi.org/10.4108/eetsis.9117

Copyright © 2025 Zhengping Lin et al., licensed to EAI. This is an open access article distributed under the terms of the Creative Commons Attribution license, which permits unlimited use, distribution and reproduction in any medium so long as the original work is properly cited.

Indexed in: EBSCO, ProQuest, DBLP, DOAJ, Portico