Knowledge Context for Entity and Relation Linking

dc.contributor.advisor: Auer, Sören
dc.contributor.author: Onando, Isaiah Mulang'
dc.description.abstract: Knowledge graphs (KGs) are structures that provide a compendious representation of real-world facts about entities and their relationships. The last decade has seen an increase in the number, size, and application of knowledge graphs, owing especially to the easy accessibility of the World Wide Web as a knowledge store. Adding structure to these data means that machines can easily interpret, reason with, and infer meanings across different domains.
Such rich stores of structured data have been proven to boost performance in core Natural Language Processing (NLP) tasks such as relation extraction, question answering, dialog systems, and web search. Furthermore, these vast structured knowledge stores have given rise to new research and application areas, viz. automatic KG construction, KG completion, and KG alignment. Central to these tasks is the need to align entities and their relations in text to their equivalents in referent knowledge bases. However, the difference in how such relations are represented in unstructured text compared to formally structured knowledge bases poses major challenges, namely: the lexical gap, ambiguity, complex and implicit relations, the unpredictability of natural language versus formulaic knowledge bases, and the complex grammar used in text. Numerous research efforts have sought to provide tools and approaches for text-to-KG disambiguation. Notwithstanding, the aforementioned challenges remain obstacles to overcome.
This thesis makes two considerations to address entity and relation linking. We envision tools that harness both the power of deep learning methods and traditional Artificial Intelligence techniques. We also view the KG as a source of information that can be anchored as features to inform machine learning models. In this view, we propose encoding this curated information for the linking models. We first devise an approach called ReMatch to perform end-to-end relation linking. ReMatch represents essential attributes of the relations in short text and models KG relations in a complementary structure to enhance the similarity scoring process. A terminology graph is used to augment these two structures with synonym relations. Next, we perform end-to-end entity linking via an attention-based encoder-decoder neural network that captures signals from an infused background KG. In this context, our approach Arjun is a first attempt to encode entity information from the Wikidata KG as contextual signals in a neural network architecture. Arjun uses two neural encoders, the first of which recognises entity mentions. We create a local KG infused from two open-domain KGs to associate entities with their aliases; this infused KG powers a second encoder network for disambiguation. In a subsequent implementation, we extend the Arjun idea to perform end-to-end entity linking by leveraging the power of state-of-the-art transformers. A fine-tuned transformer model recognises entity mentions in text but allows for a mix-and-match approach to the candidate generation step. We then utilise entity descriptions in a second transformer model for disambiguation. In another direction, we experiment with KG triples to evaluate the impact of KG context on transformer models. We aim to unearth underlying nuances in KG entities and to define appropriate representations of them for the learning models.
This work provides insightful results for the community on the types, encoding, and extent of KG context for NLP. Finally, we employ the intuition gained to enhance a model for the explanation regeneration task in elementary science QA. Our contributions target a broader research agenda by providing efficient approaches that leverage information in KGs, and by propelling efforts to obtain the best of both KGs and NLP.
dc.rights: In Copyright
dc.subject: Aufmerksame Neuronale Netze
dc.subject: Neuronale Netze
dc.subject: Deep Learning
dc.subject: Natürliche Sprachverarbeitung
dc.subject: Natürliches Sprachverständnis
dc.subject: Knowledge Context
dc.subject: Knowledge Graph
dc.subject: Entity Disambiguation
dc.subject: Entity Linking
dc.subject: Relation Linking
dc.subject: Relation Extraction
dc.subject: Attentive Neural Networks
dc.subject: Neural Networks
dc.subject: Natural Language Processing
dc.subject: Natural Language Understanding
dc.subject.ddc: 004 Informatik
dc.title: Knowledge Context for Entity and Relation Linking
dc.type: Dissertation oder Habilitation
dc.publisher.name: Universitäts- und Landesbibliothek Bonn
ulbbnediss.affiliation.name: Rheinische Friedrich-Wilhelms-Universität Bonn
ulbbnediss.institute: Mathematisch-Naturwissenschaftliche Fakultät : Fachgruppe Informatik / Institut für Informatik
ulbbnediss.fakultaet: Mathematisch-Naturwissenschaftliche Fakultät
dc.contributor.coReferee: Lehmann, Jens
