OntoLearner: A Modular Python Library for Ontology Learning with LLMs


OntoLearner is a modular and extensible framework designed to support ontology learning and reuse. It comprises three core components: Ontologizers, Learning Tasks, and Learner Models, structured to enable reusable and customizable ontology engineering workflows.

🧪 Installation

OntoLearner is available on PyPI and can be installed with pip:

pip install ontolearner

Next, verify the installation:

import ontolearner

print(ontolearner.__version__)

🔗 Essential Resources

| Resource | Info |
| --- | --- |
| 📚 OntoLearner Documentation | Dive into OntoLearner's extensive documentation to explore its modular architecture, including Ontologizers, Learning Tasks, and Learner Models. It provides detailed guides, references, and tutorials to help you get started and make the most of OntoLearner's capabilities. |
| 🤗 Datasets on Hugging Face | Access curated collections of machine-readable ontologies across diverse domains such as agriculture, medicine, social sciences, and more. The OntoLearner benchmarking datasets are optimized for integration into generative AI pipelines, supporting versioning, streaming, and metadata inspection. |

🚀 Quick Tour

Get started with OntoLearner in just a few lines of code. This guide demonstrates how to initialize ontologies, load datasets, and train an LLM-assisted learner for ontology engineering tasks.

Basic Usage - Automatic Download from Hugging Face:

from ontolearner.ontology import Wine

# 1. Initialize an ontologizer from OntoLearner
ontology = Wine()

# 2. Load the ontology automatically from Hugging Face
ontology.load()

# 3. Extract the learning task dataset
data = ontology.extract()
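The extracted data feeds learning tasks such as term typing, where each term is mapped to its ontology types. As a rough illustration of what such task data looks like (the record layout and field names below are hypothetical, not OntoLearner's actual schema):

```python
# Hypothetical term-typing records: each term is paired with its ontology
# types. The field names ("term", "types") are illustrative only.
term_typing_data = [
    {"term": "Chardonnay", "types": ["WhiteWine", "Wine"]},
    {"term": "Merlot", "types": ["RedWine", "Wine"]},
]

# Collect the set of candidate types a learner would predict over.
candidate_types = sorted({t for rec in term_typing_data for t in rec["types"]})
print(candidate_types)  # ['RedWine', 'WhiteWine', 'Wine']
```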

Basic Usage - Manual Download from Hugging Face:

from ontolearner.ontology import Wine

# 1. Initialize an ontologizer from OntoLearner
ontology = Wine()

# 2. Download the ontology from Hugging Face
file_path = ontology.from_huggingface()

LLM-Based Learning Pipeline:
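Before wiring up the full pipeline, it helps to see the idea behind retrieval-augmented generation: for each query term, the retriever finds the most similar training examples, and those are packed into the LLM prompt as context. A minimal, self-contained sketch of that retrieval step, using character-trigram overlap in place of BERT embeddings (purely illustrative, not OntoLearner's implementation):

```python
def overlap_score(a: str, b: str) -> float:
    """Crude similarity: Jaccard overlap of lowercase character trigrams."""
    grams = lambda s: {s[i:i + 3] for i in range(len(s) - 2)}
    ga, gb = grams(a.lower()), grams(b.lower())
    return len(ga & gb) / max(len(ga | gb), 1)

train_terms = ["Chardonnay", "Merlot", "Riesling"]
query = "Chardonay"  # misspelled query term

# Retrieve the most similar training term to ground the prompt context.
best = max(train_terms, key=lambda t: overlap_score(query, t))
print(best)  # Chardonnay
```

OntoLearner's actual pipeline swaps this toy similarity for a BERT-based retriever and a hosted LLM: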

from ontolearner import ontology, utils, learner

# 1. Load the ontology and extract training data
onto = ontology.Wine()
data = onto.extract()

# 2. Split into train and test sets
train_data, test_data = utils.train_test_split(
    data, test_size=0.2, random_state=42
)

# 3. Initialize a Retrieval-Augmented Generation (RAG) learner
retriever = learner.BERTRetrieverLearner()
llm = learner.AutoLearnerLLM()
prompt = learner.StandardizedPrompting(task="term-typing")

rag_learner = learner.AutoRAGLearner(
    learner_retriever=retriever,
    learner_llm=llm,
    prompting=prompt
)

# 4. Load pretrained components
rag_learner.load(
    retriever_id="sentence-transformers/all-MiniLM-L6-v2",
    llm_id="mistralai/Mistral-7B-Instruct-v0.1"
)

# 5. Fit the model to training data
rag_learner.fit(train_data=train_data, task="term-typing")

# 6. Predict on test data
predicted = rag_learner.predict(test_data, task="term-typing")
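Once predictions are in hand, they can be scored against gold labels. Assuming predictions and gold annotations can be flattened into parallel label lists (OntoLearner may ship its own evaluation utilities; this is just a generic sketch):

```python
def exact_match_accuracy(predicted, gold):
    """Fraction of positions where the predicted label equals the gold one."""
    if len(predicted) != len(gold):
        raise ValueError("predicted and gold must be the same length")
    if not gold:
        return 0.0
    return sum(p == g for p, g in zip(predicted, gold)) / len(gold)

# Hypothetical labels for three test terms: two of three match.
score = exact_match_accuracy(["Wine", "RedWine", "Wine"],
                             ["Wine", "WhiteWine", "Wine"])
print(round(score, 3))  # 0.667
```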

⭐ Contribution

We welcome contributions to make OntoLearner even better! Please review our contribution guidelines in CONTRIBUTING.md before getting started. You are also welcome to help with ongoing maintenance by referring to MAINTENANCE.md. Your support is greatly appreciated.

If you encounter any issues or have questions, please submit them in the GitHub issues tracker.

💡 Acknowledgements

If you find this repository helpful or use OntoLearner in your work or research, feel free to cite our publication:

@inproceedings{babaei2023llms4ol,
  title={LLMs4OL: Large language models for ontology learning},
  author={Babaei Giglou, Hamed and D’Souza, Jennifer and Auer, S{\"o}ren},
  booktitle={International Semantic Web Conference},
  pages={408--427},
  year={2023},
  organization={Springer}
}

or:

@software{babaei_giglou_2025_15399783,
  author       = {Babaei Giglou, Hamed and D'Souza, Jennifer and Aioanei, Andrei and Mihindukulasooriya, Nandana and Auer, Sören},
  title        = {OntoLearner: A Modular Python Library for Ontology Learning with LLMs},
  month        = may,
  year         = 2025,
  publisher    = {Zenodo},
  version      = {v1.0.1},
  doi          = {10.5281/zenodo.15399783},
  url          = {https://doi.org/10.5281/zenodo.15399783},
}

This software is archived in Zenodo under DOI 10.5281/zenodo.15399783 and is licensed under the MIT License.
