Ontology Embedding using the BERT Model

Supervisor: Jieying Chen (j.chen2@vu.nl)


Ontologies have become indispensable tools for the semantic web, knowledge graph construction, and intelligent systems. Traditionally, ontologies are expressed in symbolic form, which can struggle to capture latent semantics or to support tasks such as similarity computation. Recent advances in neural embeddings, particularly the Bidirectional Encoder Representations from Transformers (BERT) model, have shown significant potential in capturing contextual nuances in textual data. Embedding ontologies using the BERT model could bridge the gap between symbolic representations and continuous vector spaces, offering richer semantic interpretations and enabling a host of downstream applications.
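To make the contrast concrete: once concepts live in a continuous vector space, similarity becomes a simple geometric computation. The sketch below, a toy illustration rather than part of the proposed framework, compares embedding vectors with cosine similarity; the three-dimensional vectors stand in for the 768-dimensional vectors a BERT model would produce.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two dense embedding vectors."""
    dot = sum(x * y for x, y in zip(u, v))
    norm_u = math.sqrt(sum(x * x for x in u))
    norm_v = math.sqrt(sum(x * x for x in v))
    return dot / (norm_u * norm_v)

# Toy 3-dimensional "embeddings" standing in for BERT's 768-dim vectors.
cat = [0.9, 0.1, 0.3]
dog = [0.8, 0.2, 0.4]
car = [0.1, 0.9, 0.7]

# In a well-trained embedding space, related concepts score higher.
print(cosine_similarity(cat, dog) > cosine_similarity(cat, car))  # True
```

No such direct, graded similarity measure falls out of a purely symbolic ontology, which is the gap the project aims to close.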

Objectives:
  1. Survey the current landscape of ontology-embedding methods, assessing the capabilities, strengths, and weaknesses of existing approaches and highlighting the distinctive features BERT brings to textual embedding.
  2. Design a framework to convert traditional ontology structures into a form suitable for embedding using BERT, ensuring preservation of hierarchical and relational information.
  3. Adapt and fine-tune BERT on specific ontology datasets, emphasizing the capture of both explicit and implicit semantic information.
  4. Develop an evaluation protocol that compares the BERT-based embeddings with traditional ontology representations in terms of semantic coherence, representation fidelity, and utility in downstream tasks.
  5. Explore the utility of the BERT-embedded ontologies in tasks like semantic search, ontology alignment, and similarity computations, benchmarking against traditional methods.
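A natural starting point for objective 2 is to verbalize ontology axioms into natural-language sentences that a BERT tokenizer can consume. The sketch below is a minimal, hedged illustration of that idea; the triple format and the templates are assumptions made for this example, not a fixed part of the proposed framework, which would also need to handle richer OWL constructs.

```python
# Minimal sketch: verbalize simple ontology axioms into sentences for BERT.
# The (entity, relation, entity) triple format and the templates below are
# illustrative assumptions, not a prescribed input format.

TEMPLATES = {
    "subclass_of": "{a} is a kind of {b}.",
    "part_of": "{a} is a part of {b}.",
    "disjoint_with": "no {a} is a {b}.",
}

def verbalize(axiom):
    """Convert an (entity, relation, entity) triple into one sentence."""
    a, relation, b = axiom
    # Entity names often use underscores; turn them into readable tokens.
    return TEMPLATES[relation].format(a=a.replace("_", " "),
                                      b=b.replace("_", " "))

def verbalize_ontology(axioms):
    """One sentence per axiom; these sentences can then be tokenized and
    encoded with a BERT model to obtain concept embeddings."""
    return [verbalize(ax) for ax in axioms]

axioms = [
    ("sparrow", "subclass_of", "bird"),
    ("wing", "part_of", "bird"),
]
print(verbalize_ontology(axioms))
```

Feeding such sentences through a pretrained BERT encoder, and fine-tuning it on them as objective 3 proposes, would yield the embeddings evaluated in objectives 4 and 5.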