A parser for first-order logic (FOL) that uses the NLTK logic module to parse and fuzzify FOL rules.
Logic Tensor Network Implementation (LTN_Imp)
Overview
This repository contains an implementation of Logic Tensor Networks (LTN) for Symbolic Knowledge Injection (SKI) into machine learning models. It is particularly relevant for neuro-symbolic AI, where structured domain knowledge is integrated into data-driven models to enhance interpretability, performance, and robustness.
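The core idea of parse-then-fuzzify can be sketched as follows. This is a minimal illustration, not the package's actual API: the predicate names (`HighGlucose`, `AtRisk`), the truth-degree table, and the choice of product t-norm and Reichenbach implication are all hypothetical stand-ins for the operators implemented in `ltn_imp/fuzzy_operators`.

```python
from nltk.sem.logic import (
    AndExpression, ApplicationExpression, Expression, ImpExpression,
)

# Hypothetical truth degrees for ground atoms (one entry per individual).
ATOMS = {
    "HighGlucose": {"p1": 0.9, "p2": 0.2},
    "AtRisk":      {"p1": 0.8, "p2": 0.7},
}

def fuzzify(expr):
    """Recursively map an NLTK FOL parse tree to a truth degree in [0, 1]."""
    if isinstance(expr, ApplicationExpression):
        pred = str(expr.function)            # e.g. "HighGlucose"
        individual = str(expr.argument)      # e.g. "p1"
        return ATOMS[pred][individual]
    if isinstance(expr, AndExpression):      # conjunction as product t-norm
        return fuzzify(expr.first) * fuzzify(expr.second)
    if isinstance(expr, ImpExpression):      # Reichenbach implication: 1 - a + a*b
        a, b = fuzzify(expr.first), fuzzify(expr.second)
        return 1.0 - a + a * b
    raise NotImplementedError(type(expr))

rule = Expression.fromstring("HighGlucose(p1) -> AtRisk(p1)")
print(fuzzify(rule))   # 1 - 0.9 + 0.9 * 0.8 = 0.82
```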
This implementation has been used in the research paper:
Neuro-symbolic AI for Supporting Chronic Disease Diagnosis and Monitoring
Matteo Magnini, Giovanni Ciatto, Ahmet Emre Kuru, Christel Sirocchi, and Sara Montagna
For full reproducibility of the results reported in the paper, please refer to the commit:
Here, on the main branch, you can find the maintained and refactored codebase.
Experiments on the Pima Indians Diabetes dataset are provided in the examples/medical/diabetes/demo.ipynb notebook.
Results summary of model performance against different data perturbations
Each dot represents the model's performance on one experiment configuration, averaged over 30 random seeds.
Features
- LTN-based Knowledge Injection: Implements logic-based constraints in neural networks.
- Medical AI Applications: Evaluates SKI on medical datasets, focusing on chronic disease prediction.
- Performance & Robustness Analysis: Compares LTN with classical ML models in terms of accuracy, recall, and adherence to clinical guidelines.
- Reproducible Experiments: Code used to generate results in the referenced paper.
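Knowledge injection can be pictured as adding a rule-satisfaction penalty to the ordinary data-driven loss. The sketch below is illustrative only: the `injected_loss` function, the single rule "high glucose implies diabetic", and the Reichenbach fuzzification are assumptions, not the package's actual training loop.

```python
import math

def injected_loss(p, y, high_glucose, rule_weight=0.5):
    """Binary cross-entropy plus a fuzzy-rule penalty.

    p            -- model's predicted probability of being diabetic
    y            -- ground-truth label (0 or 1)
    high_glucose -- truth degree of the rule's antecedent in [0, 1]
    """
    bce = -(y * math.log(p) + (1 - y) * math.log(1 - p))  # data term
    rule_truth = 1.0 - high_glucose + high_glucose * p    # a -> b as 1 - a + a*b
    return bce + rule_weight * (1.0 - rule_truth)         # knowledge term

# Satisfying the rule (high confidence when glucose is high) lowers the loss.
print(injected_loss(p=0.9, y=1, high_glucose=1.0))
```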
Requirements
To set up the environment, install the necessary dependencies using Poetry:
poetry install
Usage
Running Experiments
To reproduce the experiments from the referenced paper, use the provided Jupyter notebook (examples/medical/diabetes/demo.ipynb).
This notebook walks through the process of training and evaluating the LTN model, comparing it against classical ML baselines while considering:
- Predictive performance (Accuracy, Recall, F1-score, MCC)
- Logical adherence to clinical rules
- Robustness under data perturbation (noise injection, missing data, label flipping)
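The predictive-performance metrics above are all standard scikit-learn metrics; a minimal evaluation sketch might look like this (the label vectors are hypothetical placeholders for a model's actual predictions):

```python
from sklearn.metrics import (
    accuracy_score, f1_score, matthews_corrcoef, recall_score,
)

# Hypothetical predictions; in the notebook these come from the trained models.
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]

report = {
    "accuracy": accuracy_score(y_true, y_pred),
    "recall":   recall_score(y_true, y_pred),
    "f1":       f1_score(y_true, y_pred),
    "mcc":      matthews_corrcoef(y_true, y_pred),
}
print(report)
```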
Datasets
The primary dataset used in the experiments is:
- Pima Indians Diabetes (PID) dataset: A benchmark dataset for diabetes prediction.
Model Architectures
The following models are implemented and compared:
- Classic ML Models:
- K-Nearest Neighbors (KNN)
- Decision Tree (DT)
- Random Forest (RF)
- Logistic Regression (LR)
- Multi-Layer Perceptron (MLP)
- Logic Tensor Network (LTN): Incorporates symbolic knowledge via logical constraints.
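The classical baselines are the standard scikit-learn estimators. A minimal comparison loop might look like the sketch below; the synthetic data, hyperparameters, and train/test split are assumptions standing in for the real PID setup in the notebook:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the 8 PID features; the notebook uses the real dataset.
X, y = make_classification(n_samples=300, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

baselines = {
    "KNN": KNeighborsClassifier(),
    "DT":  DecisionTreeClassifier(random_state=0),
    "RF":  RandomForestClassifier(random_state=0),
    "LR":  LogisticRegression(max_iter=1000),
    "MLP": MLPClassifier(max_iter=1000, random_state=0),
}
scores = {name: model.fit(X_tr, y_tr).score(X_te, y_te)
          for name, model in baselines.items()}
print(scores)
```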
Results Summary
The key findings from the experiments include:
- LTN improves recall and balanced accuracy, making it more suitable for medical AI applications where false negatives must be minimized.
- LTN exhibits higher adherence to clinical rules, ensuring predictions align with medical guidelines.
- LTN demonstrates robustness to data perturbations, maintaining stable performance even with missing or noisy data.
For detailed results, refer to the paper.
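The three perturbation types used in the robustness analysis can be sketched with NumPy as follows. The array shapes, 10% rates, and noise scale are illustrative assumptions, not the exact settings from the paper:

```python
import numpy as np

rng = np.random.default_rng(seed=0)   # one of the 30 seeds

X = rng.normal(size=(100, 8))         # stand-in for the 8 PID features
y = rng.integers(0, 2, size=100)

# Noise injection: add Gaussian noise scaled to each feature's std dev.
X_noisy = X + rng.normal(scale=0.1 * X.std(axis=0), size=X.shape)

# Missing data: blank out 10% of the entries at random.
mask = rng.random(X.shape) < 0.10
X_missing = X.copy()
X_missing[mask] = np.nan

# Label flipping: invert 10% of the labels.
flip = rng.random(y.shape) < 0.10
y_flipped = np.where(flip, 1 - y, y)
```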
Acknowledgments
Special thanks to the authors of the referenced paper and the contributors of the LTN_Imp repository.
License
This repository is released under the Apache 2.0 license.
For any issues or questions, feel free to open an issue.
Project structure
Overview:
<root directory>
├── ltn_imp/                 # main package
│   ├── __init__.py          # Python package marker
│   ├── __main__.py          # application entry point
│   ├── fuzzy_operators/     # fuzzy operator implementations
│   └── parsing/             # FOL parser built on NLTK Logic and supporting files
│
├── test/                    # unit tests
│   ├── parsing_tests/       # unit tests for the parser
│   └── learning_tests/      # unit tests for fuzzy operators during optimization
│
├── .github/                 # GitHub CI configuration
│   └── workflows/           # GitHub Workflows configuration
│       └── check.yml        # runs tests on multiple OSes and Python versions
│
├── LICENSE                  # license file (Apache 2.0)
├── pyproject.toml           # project metadata and build dependencies
└── poetry.toml              # Poetry settings
Project details
File details
Details for the file ekuru_ltn-3.1.0.tar.gz.
File metadata
- Download URL: ekuru_ltn-3.1.0.tar.gz
- Size: 22.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.1 CPython/3.11.5 Darwin/24.3.0
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 65fe11c29d2fe08f694d7e9f03a94df30c5ebe27faa8fd4f1d61d449335f7787 |
| MD5 | 0c002c2dfca68002be557722b6caac9d |
| BLAKE2b-256 | 7cd443799c9ef0b02aa1435e04c695ac02870084f3b3f9e5bf486b4769ec4a24 |
File details
Details for the file ekuru_ltn-3.1.0-py3-none-any.whl.
File metadata
- Download URL: ekuru_ltn-3.1.0-py3-none-any.whl
- Size: 25.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.1 CPython/3.11.5 Darwin/24.3.0
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 9711fe04de463f3c96f67c1cda92d65014f44a5fc263460b23fb5fb5c60b528d |
| MD5 | eb59b1e397dacee4edd5e9cf5f1953d7 |
| BLAKE2b-256 | 836660bec07ede75ab30496a0ec05dcc5ff6528aa6e79d0bbd90d6913f6b20b8 |