THESIS
2025
1 online resource (xxii, 196 pages) : illustrations (chiefly color)
Abstract
Complex logical query answering on knowledge graphs (KGs) is a pivotal challenge in KG reasoning, requiring advanced methods to navigate multi-hop relations and logical operations over incomplete data. This thesis advances the field through a series of contributions that enhance the efficiency, scope, and applicability of KG reasoning. We introduce Query2Particles, a novel particle-based embedding technique that captures the distributed answer sets of complex queries, improving upon traditional single-vector representations. To optimize query processing, we propose Sequential Query Encoding, which transforms computational graphs into sequential forms while preserving their structural and semantic integrity for more efficient encoding. Addressing the integration of numerical data, we develop the Number Reasoning Network, extending KG reasoning to cover both entity relations and numerical attributes, a critical advancement for real-world knowledge bases. For temporal and event-based reasoning, we present Memory-Enhanced Query Encoding, tailored to eventuality knowledge graphs; it incorporates implicit temporal and logical constraints, enabling precise reasoning over event sequences. In practical applications, we design the Logical Session Graph Transformer to capture inter-session user intentions in recommendation systems, showcasing the real-world impact of our techniques. Additionally, we pioneer RLF-KG, a reinforcement learning approach for abductive reasoning that generates complex logical hypotheses to explain observations in KGs. Evaluated extensively on benchmark datasets, our methods consistently achieve state-of-the-art performance across diverse reasoning tasks. Collectively, these contributions provide robust, versatile solutions for complex query answering and hypothesis generation, significantly advancing both the theoretical and applied dimensions of knowledge graph reasoning.
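To make the core idea behind the particle-based representation more concrete, the sketch below is a minimal, hypothetical simplification rather than the thesis's actual Query2Particles implementation: it assumes a query is encoded as K particle vectors and each candidate entity is scored by its best-matching particle, so a multi-modal answer set can be covered by different particles.

```python
# Illustrative sketch only (assumed simplification, not the thesis's code):
# a query is represented by K "particle" vectors; an entity is scored by its
# maximum similarity over all particles, letting different particles cover
# different clusters of the answer set.
import numpy as np

rng = np.random.default_rng(0)
dim, num_particles, num_entities = 16, 4, 100

query_particles = rng.normal(size=(num_particles, dim))    # K particle embeddings for one query
entity_embeddings = rng.normal(size=(num_entities, dim))   # embeddings of candidate answer entities

def score_entities(particles: np.ndarray, entities: np.ndarray) -> np.ndarray:
    """Score each entity by its best-matching particle (max dot-product similarity)."""
    sims = entities @ particles.T          # shape: (num_entities, K)
    return sims.max(axis=1)                # one score per entity

scores = score_entities(query_particles, entity_embeddings)
top_answers = np.argsort(-scores)[:10]     # rank entities; top-10 predicted answers
print(top_answers)
```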