
Contextual language models for knowledge graph completion

Biswas, Russa; Sofronova, Radina; Alam, Mehwish; Sack, Harald


Knowledge Graphs (KGs) have become the backbone of various machine-learning-based applications over the past decade. However, KGs are often incomplete and inconsistent. Several representation-learning-based approaches have been introduced to complete the missing information in KGs. In addition, Neural Language Models (NLMs) have gained huge momentum in NLP applications, yet exploiting contextual NLMs for the Knowledge Graph Completion (KGC) task is still an open research problem. In this paper, a GPT-2-based KGC model is proposed and evaluated on two benchmark datasets. The initial results obtained from fine-tuning the GPT-2 model for triple classification strengthen the case for using NLMs for KGC. The impact of contextual language models on KGC is also discussed.
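A common prerequisite for fine-tuning a language model such as GPT-2 on triple classification is verbalizing KG triples into text sequences and pairing true triples with corrupted negatives. The sketch below illustrates that preprocessing step only; the function names, label scheme, and corruption strategy are assumptions for illustration, not the authors' exact implementation.

```python
# Illustrative sketch (assumed, not the paper's code): turn KG triples
# into (text, label) pairs suitable for a sequence classifier.
import random


def verbalize(head: str, relation: str, tail: str) -> str:
    """Render a (head, relation, tail) triple as a plain-text sentence."""
    return f"{head} {relation.replace('_', ' ')} {tail}"


def corrupt_tail(triple, entities, rng):
    """Build a negative example by swapping the tail for a random entity."""
    head, relation, tail = triple
    candidates = [e for e in entities if e != tail]
    return (head, relation, rng.choice(candidates))


def build_examples(triples, entities, seed=0):
    """Return (text, label) pairs: 1 for true triples, 0 for corrupted ones."""
    rng = random.Random(seed)
    examples = []
    for t in triples:
        examples.append((verbalize(*t), 1))
        examples.append((verbalize(*corrupt_tail(t, entities, rng)), 0))
    return examples


triples = [("Berlin", "capital_of", "Germany"),
           ("Paris", "capital_of", "France")]
entities = ["Germany", "France", "Spain"]
data = build_examples(triples, entities)
# Each (text, label) pair would then be fed to a language model with a
# classification head (e.g. GPT-2 fine-tuned for sequence classification)
# to score whether the triple is plausible.
```

The design choice of corrupting only the tail entity mirrors a standard negative-sampling scheme in KGC benchmarks; head corruption would be handled analogously.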

Publisher's version
DOI: 10.5445/IR/1000140757
Published on 06.12.2021
Affiliated institution(s) at KIT: Institut für Angewandte Informatik und Formale Beschreibungsverfahren (AIFB)
Publication type: Proceedings paper
Year of publication: 2021
Language: English
Identifier: ISSN: 1613-0073
KITopen-ID: 1000140757
Published in: MLSMKG 2021: Machine Learning with Symbolic Methods and Knowledge Graphs 2021. Ed.: M. Alam
Event: Machine Learning with Symbolic Methods and Knowledge Graphs (MLSMKG 2021), co-located with the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD 2021), online, 17.09.2021
Pages: Art. no. 3
Series: CEUR Workshop Proceedings; 2997
Keywords: GPT-2; Knowledge Graph Embedding; Triple Classification
Indexed in: Scopus