
A parser for first-order logic (FOL) that uses the NLTK logic module to parse and fuzzify FOL rules.

Project description

Logic Tensor Network Implementation (LTN_Imp)

Overview

This repository contains an implementation of Logic Tensor Networks (LTN) for Symbolic Knowledge Injection (SKI) into machine learning models. It is particularly relevant for neuro-symbolic AI, where structured domain knowledge is integrated into data-driven models to enhance interpretability, performance, and robustness.
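Conceptually, injecting a rule means computing its fuzzy truth degree over the model's outputs and rewarding the model for satisfying it. The following is a minimal, self-contained sketch of that idea (hypothetical code for illustration, not this package's API): the implication is replaced by the Reichenbach fuzzy operator and the universal quantifier is approximated by the mean truth degree.

```python
# Toy sketch of fuzzifying the rule: all x. (High(x) -> Risk(x))
# Hypothetical code for illustration only; not the LTN_Imp API.

def implies(a: float, b: float) -> float:
    """Reichenbach fuzzy implication: I(a, b) = 1 - a + a*b."""
    return 1.0 - a + a * b

def forall(values) -> float:
    """Approximate the universal quantifier by the mean truth degree."""
    values = list(values)
    return sum(values) / len(values)

# Toy truth degrees of High(x) and Risk(x) for three individuals.
high = [0.9, 0.2, 0.7]
risk = [0.8, 0.9, 0.6]

# Overall truth degree of the rule, in [0, 1].
sat = forall(implies(h, r) for h, r in zip(high, risk))
```

During training, such a satisfaction degree can be folded into the loss (e.g. as 1 - sat) so the network is nudged toward predictions that respect the rule.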

This implementation has been used in the research paper:

Neuro-symbolic AI for Supporting Chronic Disease Diagnosis and Monitoring
Matteo Magnini, Giovanni Ciatto, Ahmet Emre Kuru, Christel Sirocchi, and Sara Montagna

For full reproducibility of the results reported in the paper, please refer to the commit:

Commit d363aff892f417d770b86253aa8011c4754e8987

Here, on the main branch, you can find the maintained and refactored codebase. Experiments on the Pima Indians Diabetes dataset are provided in the examples/medical/diabetes/demo.ipynb notebook.

Results summary: model performance under different data perturbations

Each dot represents the model's average performance over 30 seeds for the same experiment configuration.

Features

  • LTN-based Knowledge Injection: Implements logic-based constraints in neural networks.
  • Medical AI Applications: Evaluates SKI on medical datasets, focusing on chronic disease prediction.
  • Performance & Robustness Analysis: Compares LTN with classical ML models in terms of accuracy, recall, and adherence to clinical guidelines.
  • Reproducible Experiments: Code used to generate results in the referenced paper.
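For reference, the performance metrics used in the comparison can all be derived from the binary confusion matrix. A small stdlib-only sketch (independent of this package's code):

```python
import math

def classification_metrics(y_true, y_pred):
    """Compute accuracy, recall, F1, and MCC from binary labels (0/1)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    recall = tp / (tp + fn) if tp + fn else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    # Matthews correlation coefficient; 0.0 when any marginal is empty.
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return {"accuracy": accuracy, "recall": recall, "f1": f1, "mcc": mcc}

m = classification_metrics([1, 1, 0, 0, 1, 0], [1, 0, 0, 1, 1, 0])
```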

Requirements

To set up the environment, install the necessary dependencies using Poetry:

poetry install

Usage

Running Experiments

To reproduce the experiments from the referenced paper, use the provided Jupyter Notebook:

examples/medical/diabetes/demo.ipynb

This notebook walks through the process of training and evaluating the LTN model, comparing it against classical ML baselines while considering:

  • Predictive performance (accuracy, recall, F1-score, Matthews correlation coefficient)
  • Logical adherence to clinical rules
  • Robustness under data perturbation (noise injection, missing data, label flipping)
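To illustrate what the three perturbation protocols look like, here is a toy stdlib-only sketch (not the code used in the paper; function and parameter names are hypothetical):

```python
import random

def perturb(rows, noise_std=0.1, missing_rate=0.1, flip_rate=0.1, seed=0):
    """Apply toy versions of the three perturbations to (features, label) pairs:
    Gaussian noise on features, random missing values (None), and label flips."""
    rng = random.Random(seed)
    out = []
    for features, label in rows:
        # Noise injection: add zero-mean Gaussian noise to every feature.
        noisy = [x + rng.gauss(0.0, noise_std) for x in features]
        # Missing data: drop each feature value with probability missing_rate.
        noisy = [None if rng.random() < missing_rate else x for x in noisy]
        # Label flipping: invert the binary label with probability flip_rate.
        if rng.random() < flip_rate:
            label = 1 - label
        out.append((noisy, label))
    return out

data = [([1.0, 2.0], 1), ([0.5, 1.5], 0)]
perturbed = perturb(data, seed=42)
```

Fixing the seed makes each perturbed configuration reproducible, which is how averaging over many seeds (30 in the paper) yields comparable runs.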

Datasets

The primary dataset used in the experiments is:

  • Pima Indians Diabetes (PID) dataset: A benchmark dataset for diabetes prediction.

Model Architectures

The following models are implemented and compared:

  • Classic ML Models:
    • K-Nearest Neighbors (KNN)
    • Decision Tree (DT)
    • Random Forest (RF)
    • Logistic Regression (LR)
    • Multi-Layer Perceptron (MLP)
  • Logic Tensor Network (LTN): Incorporates symbolic knowledge via logical constraints.

Results Summary

The key findings from the experiments include:

  • LTN improves recall and balanced accuracy, making it more suitable for medical AI applications where false negatives must be minimized.
  • LTN exhibits higher adherence to clinical rules, ensuring predictions align with medical guidelines.
  • LTN demonstrates robustness to data perturbations, maintaining stable performance even with missing or noisy data.

For detailed results, refer to the paper.

Acknowledgments

Special thanks to the authors of the referenced paper and the contributors of the LTN_Imp repository.

License

This repository is released under the Apache 2.0 license.


For questions or bug reports, feel free to open an issue.

Project structure

Overview:

<root directory>
├── ltn_imp/                # main package (should be named after your project)
│   ├── __init__.py         # Python package marker
│   ├── __main__.py         # application entry point
│   ├── fuzzy_operators/    # all of the fuzzy operators
│   └── parsing/            # parser built on the NLTK logic module and supporting files
├── test/                   # test package containing unit tests
│   ├── parsing_tests/      # unit tests for the parser
│   └── learning_tests/     # unit tests for fuzzy operators in optimization
├── .github/                # configuration of GitHub CI
│   └── workflows/          # configuration of GitHub Workflows
│       └── check.yml       # runs tests on multiple OSes and versions of Python
│
├── LICENSE                 # license file (Apache 2.0 by default)
├── pyproject.toml          # declares build dependencies
└── poetry.toml             # Poetry settings

Project details


Download files

Download the file for your platform.

Source Distribution

ekuru_ltn-3.2.0.tar.gz (22.4 kB)


Built Distribution


ekuru_ltn-3.2.0-py3-none-any.whl (25.5 kB)


File details

Details for the file ekuru_ltn-3.2.0.tar.gz.

File metadata

  • Download URL: ekuru_ltn-3.2.0.tar.gz
  • Size: 22.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.1 CPython/3.11.5 Darwin/24.3.0

File hashes

Hashes for ekuru_ltn-3.2.0.tar.gz:

  • SHA256: dc3f2a8bc1a716b927f51ef13b66832458099968514702fc2b8da77f31fb7dfc
  • MD5: dac586ac0c1ad2b4c9186b34364a21bb
  • BLAKE2b-256: 13928fa48ec9fac0c56e014c4d6f355e1966e9579468c4d4cea04b368f8d1057


File details

Details for the file ekuru_ltn-3.2.0-py3-none-any.whl.

File metadata

  • Download URL: ekuru_ltn-3.2.0-py3-none-any.whl
  • Size: 25.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.1 CPython/3.11.5 Darwin/24.3.0

File hashes

Hashes for ekuru_ltn-3.2.0-py3-none-any.whl:

  • SHA256: c887ac80ca87169a8242c10b024c9d89da1d51b3aed2b5570bc6f2f46cc09385
  • MD5: 9fe7c8ca7030ab2cd847badab109abc2
  • BLAKE2b-256: f0ac2e9ca71f354f05449b9d758cee1522f1608220157bc49201c06df8973db2

