THESIS
2022
1 online resource (xvi, 108 pages) : illustrations (some color)
Abstract
In coming to understand the world, human minds construct structured knowledge from sparse, noisy, and ambiguous data. Therefore, humanlike machine learning should perform inference over hierarchies of flexibly structured data. Motivated by this view, real-world data are often organized as hierarchies when formulating machine learning problems, with the hierarchical data serving as either the hypotheses or the inference queries. In this thesis, we study learning with hierarchical data. First, we examine the hierarchical data classification problem, where the hierarchical data act as hypotheses. Specifically, we investigate hierarchical text classification and propose a path cost-sensitive learning algorithm that exploits the structural information of the classes.
We then turn to geometric representation learning for hierarchical structures in knowledge graphs, where the hierarchical data are regarded as inference queries. The choice of geometric space for knowledge graph embeddings can significantly affect multi-relational knowledge graph inference. Transitivity, the property that gives rise to hierarchical structure, is modeled more naturally in hyperbolic geometry than by traditional Euclidean embedding models.
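For context, a standard formula rather than a contribution of this thesis: in the Poincaré ball model of real hyperbolic space, the distance between points x and y with ||x||, ||y|| < 1 is

    d(x, y) = \operatorname{arcosh}\!\left( 1 + \frac{2\,\lVert x - y \rVert^2}{\bigl(1 - \lVert x \rVert^2\bigr)\bigl(1 - \lVert y \rVert^2\bigr)} \right)

Distances grow without bound near the boundary of the ball, reflecting the exponential volume growth of hyperbolic space that lets tree-like (transitive) structures embed with low distortion.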
To build a representation learning framework for the various structures in knowledge graphs, we propose to learn embeddings in different geometric spaces and apply manifold alignment to align the shared entities. We also focus on the representation of single-relational hierarchical structures: to improve on hyperbolic embeddings, we propose to learn embeddings of hierarchical data in the complex hyperbolic space, which has a greater representation capacity for capturing a variety of hierarchical structures. Finally, we extend the representation capacity of complex hyperbolic geometry to multi-relational knowledge graph embeddings, proposing the fast Fourier transform as a simple and effective way to apply real hyperbolic geometric transformations and the attention mechanism in the complex hyperbolic space.
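A minimal sketch of the general idea, assuming only what the abstract states, namely that the fast Fourier transform bridges real and complex representations; the specific transformations and attention of the thesis are not reproduced here. Because the FFT is linear and invertible, a weighted combination (e.g., hypothetical attention weights) computed over real embeddings carries over exactly to their complex-valued spectra, and the inverse FFT returns to the real domain without loss.

    # Minimal sketch, not the thesis method: bridging real and complex
    # representations with numpy's FFT.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((3, 8))          # three real embedding vectors
    w = np.array([0.5, 0.3, 0.2])            # hypothetical attention weights

    # The FFT maps each real vector to a complex one; by linearity,
    # a weighted average commutes with the transform.
    Z = np.fft.fft(X, axis=-1)               # complex-domain embeddings
    print(np.allclose(np.fft.fft(w @ X), w @ Z))          # True

    # The inverse FFT returns to the real domain with no loss.
    print(np.allclose(np.fft.ifft(Z, axis=-1).real, X))   # True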