KG-TRICK: Unifying Textual and Relational Knowledge Completion for Multilingual Knowledge Graphs

Multilingual knowledge graphs (KGs) provide high-quality relational and textual information for various NLP applications, but they are often incomplete, especially in non-English languages. Previous research has shown that combining information from KGs in different languages aids either Knowledge Graph Completion (KGC), the task of predicting missing relations between entities, or Knowledge Graph Enhancement (KGE), the task of predicting missing textual information for entities. Although previous efforts have treated KGC and KGE as independent tasks, we hypothesize that they are interdependent and mutually beneficial. To this end, we introduce KG-TRICK, a novel sequence-to-sequence framework that unifies the tasks of textual and relational information completion for multilingual KGs. KG-TRICK demonstrates that: i) it is possible to unify KGC and KGE within a single framework, and ii) combining textual information from multiple languages improves the completeness of a KG. As part of our contributions, we also introduce WikiKGE10++, the largest manually curated benchmark for the completion of textual knowledge in KGs, which covers over 25,000 entities across 10 diverse languages.