python-chebifier
An AI ensemble model for predicting chemical classes in the ChEBI ontology.
Installation
# Clone the repository
git clone https://github.com/ChEB-AI/python-chebifier.git
cd python-chebifier
# Install the package
pip install -e .
Some dependencies of chebai-graph cannot be installed automatically. If you want to use Graph Neural Networks, follow
the instructions in the chebai-graph repository.
Usage
Command Line Interface
The package provides a command-line interface (CLI) for making predictions using an ensemble model.
# Get help
python -m chebifier.cli --help
# Make predictions using a configuration file
python -m chebifier.cli predict configs/example_config.yml --smiles "CC(=O)OC1=CC=CC=C1C(=O)O" "C1=CC=C(C=C1)C(=O)O"
# Make predictions using SMILES from a file
python -m chebifier.cli predict configs/example_config.yml --smiles-file smiles.txt
Configuration File
The CLI requires a YAML configuration file that defines the ensemble model. An example can be found in configs/example_config.yml.
The models and other required files are trained and generated with our chebai package. Example models can be found on Kaggle.
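To illustrate the kind of settings a configuration can hold, here is a hypothetical sketch. The key names (models, type, ckpt_path) are assumptions for illustration; only ensemble_type and model_weight are documented below, so consult configs/example_config.yml for the actual format:

```yaml
# Hypothetical ensemble configuration sketch -- key names may differ
# from the real format; see configs/example_config.yml.
ensemble_type: wmv-f1      # weighted majority vote using F1-based trust
                           # (alternative: "mv", plain majority vote, the default)
models:
  my_electra:              # arbitrary model name
    type: electra          # model type, as supported by the chebai package
    ckpt_path: path/to/checkpoint.ckpt
    model_weight: 2        # favor this model independently of any class (default: 1)
```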
Python API
You can also use the package programmatically:
from chebifier.ensemble.base_ensemble import BaseEnsemble
import yaml

# Load configuration from YAML file
with open('configs/example_config.yml', 'r') as f:
    config = yaml.safe_load(f)

# Instantiate ensemble model
ensemble = BaseEnsemble(config)

# Make predictions
smiles_list = ["CC(=O)OC1=CC=CC=C1C(=O)O", "C1=CC=C(C=C1)C(=O)O"]
predictions = ensemble.predict_smiles_list(smiles_list)

# Print results
for smiles, prediction in zip(smiles_list, predictions):
    print(f"SMILES: {smiles}")
    if prediction:
        print(f"Predicted classes: {prediction}")
    else:
        print("No predictions")
The ensemble
Given a sample (i.e., a SMILES string) and models $m_1, m_2, \ldots, m_n$, the ensemble works as follows:
- Get predictions from each model $m_i$ for the sample.
- For each class $c$, aggregate predictions $p_c^{m_i}$ from all models that made a prediction for that class. The aggregation happens separately for all positive predictions (i.e., $p_c^{m_i} \geq 0.5$) and all negative predictions ($p_c^{m_i} < 0.5$). If the aggregated value is larger for the positive predictions than for the negative predictions, the ensemble makes a positive prediction for class $c$:
$$ \text{ensemble}(c) = \begin{cases} 1 & \text{if } \sum_{i:\, p_c^{m_i} \geq 0.5} \text{confidence}_c^{m_i} \cdot \text{model\_weight}_{m_i} \cdot \text{trust}_c^{m_i} > \sum_{i:\, p_c^{m_i} < 0.5} \text{confidence}_c^{m_i} \cdot \text{model\_weight}_{m_i} \cdot \text{trust}_c^{m_i} \\ 0 & \text{otherwise} \end{cases} $$
Here, confidence is the model's (self-reported) confidence in its prediction, calculated as $$ \text{confidence}_c^{m_i} = 2|p_c^{m_i} - 0.5| $$ For example, if a model makes a positive prediction with $p_c^{m_i} = 0.55$, the confidence is $2|0.55 - 0.5| = 0.1$. One could say that the model is not very confident in its prediction and very close to switching to a negative prediction. If another model is very sure about its negative prediction with $p_c^{m_j} = 0.1$, the confidence is $2|0.1 - 0.5| = 0.8$. Therefore, if in doubt, we are more confident in the negative prediction.
Confidence can be disabled by the use_confidence parameter of the predict method (default: True).
The model_weight can be set for each model in the configuration file (default: 1). It is used to favor a particular
model across all classes.
Trust is based on a model's performance on a validation set: after training, we evaluate the machine-learning models
per class on a validation set. If the ensemble_type is set to wmv-f1, the trust is calculated as 1 + the model's F1 score for that class.
If the ensemble_type is set to mv (the default), the trust is set to 1 for all models.
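The weighted vote described above can be sketched as follows. This is a minimal illustration of the aggregation rule, not the package's actual API; the function name and argument layout are assumptions.

```python
# Sketch of the per-class weighted majority vote. probs, model_weights and
# trusts are parallel lists with one entry per model m_i.
def ensemble_decision(probs, model_weights, trusts, use_confidence=True):
    pos, neg = 0.0, 0.0
    for p, w, t in zip(probs, model_weights, trusts):
        # Self-reported confidence: 2|p - 0.5|, i.e. 0 at p = 0.5, 1 at p in {0, 1}.
        confidence = 2 * abs(p - 0.5) if use_confidence else 1.0
        score = confidence * w * t
        if p >= 0.5:
            pos += score   # aggregate positive predictions
        else:
            neg += score   # aggregate negative predictions
    return 1 if pos > neg else 0

# Two weakly positive models (0.55, 0.6) vs. one confidently negative one (0.1):
# confidences are 0.1 + 0.2 = 0.3 for "positive" vs. 0.8 for "negative".
print(ensemble_decision([0.55, 0.6, 0.1], [1, 1, 1], [1, 1, 1]))  # prints 0
```

With use_confidence disabled, the same inputs would give a plain 2-vs-1 majority and a positive prediction, which shows why the confidence term matters for borderline probabilities.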
- After a decision has been made for each class independently, the consistency of the predictions with regard to the ChEBI hierarchy and disjointness axioms is checked. This is done in 3 steps:
- (1) First, the hierarchy is corrected. For each pair of classes $A$ and $B$ where $A$ is a subclass of $B$ (following the is-a relation in ChEBI), we set the ensemble prediction of $B$ to 1 if the prediction of $A$ is 1. Intuitively speaking, if we have determined that a molecule belongs to a specific class (e.g., aromatic primary alcohol), it also belongs to the direct and indirect superclasses (e.g., primary alcohol, aromatic alcohol, alcohol).
- (2) Next, we check for disjointness. This is not specified directly in ChEBI, but in an additional ChEBI module (chebi-disjoints.owl).
We have extracted these disjointness axioms into a CSV file and added some more disjointness axioms ourselves (see
data/disjoint_chebi.csv and data/disjoint_additional.csv). If two classes $A$ and $B$ are disjoint and we predict both, we select one of them randomly and set the other to 0.
- (3) Since the second step might have introduced new inconsistencies into the hierarchy, we repeat the first step, but with a small change. For a pair of classes $A \subseteq B$ with predictions $1$ and $0$, instead of setting $B$ to $1$, we now set $A$ to $0$. This has the advantage that we cannot introduce new disjointness-inconsistencies and don't have to repeat step 2.
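The three repair steps can be sketched as follows. This is an illustrative implementation under assumed data structures (the hierarchy as (subclass, superclass) pairs, disjointness as unordered pairs, predictions as a dict), not the package's internal code:

```python
# Sketch of the consistency repair: (1) propagate positives up the
# hierarchy, (2) resolve disjointness randomly, (3) propagate the
# resulting zeros back down so step 2 need not be repeated.
import random

def repair(pred, subclass_of, disjoint, rng=None):
    rng = rng or random.Random(0)
    # (1) Subclass predicted 1 forces all (direct and indirect)
    #     superclasses to 1; iterate to a fixpoint.
    changed = True
    while changed:
        changed = False
        for a, b in subclass_of:  # a is a subclass of b
            if pred.get(a) == 1 and pred.get(b) != 1:
                pred[b] = 1
                changed = True
    # (2) If two disjoint classes are both predicted, keep one at random.
    for a, b in disjoint:
        if pred.get(a) == 1 and pred.get(b) == 1:
            pred[rng.choice([a, b])] = 0
    # (3) Re-check the hierarchy, now pushing zeros downward: a superclass
    #     set to 0 forces its subclasses to 0. This cannot create new
    #     disjointness violations, since predictions only flip 1 -> 0.
    changed = True
    while changed:
        changed = False
        for a, b in subclass_of:
            if pred.get(b) == 0 and pred.get(a) == 1:
                pred[a] = 0
                changed = True
    return pred

# The aromatic-primary-alcohol example: a positive leaf prediction pulls
# in its superclasses.
print(repair(
    {"aromatic primary alcohol": 1, "primary alcohol": 0, "alcohol": 0},
    [("aromatic primary alcohol", "primary alcohol"), ("primary alcohol", "alcohol")],
    [],
))
```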