Authors: Jingwei Cheng, Fu Zhang, Zhi Yang
DOI: 10.1109/ACCESS.2020.3035636
Keywords:
Abstract: A Knowledge Graph (KG) is a directed graph with nodes as entities and edges as relations. KG representation learning (KGRL) aims to embed entities and relations into continuous low-dimensional vector spaces, so as to simplify manipulation while preserving the inherent structure of the KG. In this paper, we propose an embedding framework, namely MCapsEED (Multi-Scale Capsule-based Embedding Model Incorporating Entity Descriptions). MCapsEED employs a Transformer in combination with a relation attention mechanism to identify the relation-specific part of an entity description and obtain a description-based representation of an entity. The structured and description-based representations are integrated into a synthetic representation. A 3-column matrix, with each column the synthetic representation of one element of a triple, is fed into a Multi-Scale capsule model to produce the final representations of the head entity, tail entity, and relation. Experiments show that MCapsEED achieves better performance than state-of-the-art models on the task of link prediction on four benchmark datasets. Our code can be found at https://github.com/1780041410/McapsEED .
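The 3-column input described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the embedding dimension, the random filters, and the `multi_scale_conv` helper are all assumptions chosen to show how a (d, 3) triple matrix is scanned by filters of several heights, in the spirit of a multi-scale convolution layer.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # illustrative embedding dimension (an assumption, not from the paper)

# Hypothetical synthetic representations of head entity, relation, tail entity
h, r, t = (rng.normal(size=d) for _ in range(3))

# The 3-column matrix: each column is one element of the triple (h, r, t)
M = np.stack([h, r, t], axis=1)  # shape (d, 3)

def multi_scale_conv(M, kernel_heights=(1, 2, 3)):
    """Slide full-width filters of several heights over the matrix,
    producing one feature map per scale -- a simplified stand-in for
    the multi-scale convolution stage described in the abstract."""
    maps = []
    for k in kernel_heights:
        w = rng.normal(size=(k, M.shape[1]))  # random filter, for illustration only
        out = np.array([np.sum(M[i:i + k] * w)
                        for i in range(M.shape[0] - k + 1)])
        maps.append(out)
    return maps

maps = multi_scale_conv(M)
print(M.shape)                   # (8, 3)
print([m.shape for m in maps])   # [(8,), (7,), (6,)]
```

In the actual model the feature maps would then be routed through capsule layers; here they are simply returned, since the abstract does not specify the routing details.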