
atommap_eval

Evaluate the equivalence of atom-mapped reaction SMILES using graph-based isomorphism.

Overview

atommap_eval is a Python package for comparing two atom-mapped reactions and determining whether they are chemically equivalent, using their graph (networkx) representations and RDKit.

How it works:

  • Optional preprocessing: canonicalization and standardization of reaction SMILES so that all reactions share a consistent format.
  • Construction of reaction graphs with atom-level and bond-level attributes and atom-map indices
  • Graph isomorphism checks using networkx.is_isomorphic()

It allows consistent evaluation of atom-mapping validity (e.g. against a ground-truth atom-mapped reaction) while accounting for the equivalence of symmetry-related atoms (e.g. the three CH3 groups in a t-Bu group are equivalent, so shuffling their atom-map indices should not affect the correctness of the mapping).
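The graph-isomorphism idea above can be sketched with toy graphs (these stand-ins are for illustration only and are not the package's internal representation). Nodes carry an element label, and `networkx.is_isomorphic()` with a categorical node matcher ignores the arbitrary numbering, just as shuffled atom-map indices should not affect the result:

```python
# Minimal sketch: two labeled graphs with different node numberings
# compare as isomorphic when nodes are matched on their element label.
import networkx as nx
from networkx.algorithms.isomorphism import categorical_node_match

def make_graph(atoms, bonds):
    g = nx.Graph()
    for idx, element in atoms:
        g.add_node(idx, element=element)
    g.add_edges_from(bonds)
    return g

# Formate anion C(=O)[O-] drawn twice with different atom numberings
g1 = make_graph([(1, "C"), (2, "O"), (3, "O")], [(1, 2), (1, 3)])
g2 = make_graph([(7, "O"), (8, "C"), (9, "O")], [(8, 7), (8, 9)])

same = nx.is_isomorphic(g1, g2, node_match=categorical_node_match("element", None))
print(same)  # True: the numbering differs but the labeled graphs match
```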

Warning: tautomeric mappings are not considered equivalent, even though a chemist would regard them as such, because template extraction of the underlying reactivity would yield different results. Flags for tautomers will be added in future releases to better handle this specific case.

By default, if an isomorphism check takes more than 10 seconds, it is interrupted and returns None with status "timeout".
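As an illustration of the timeout behavior (this is a sketch of the general pattern, not the package's actual implementation), the check can run in a worker while the caller waits a bounded time for the result:

```python
# Sketch: bound a potentially slow check with a timeout.
# Returns (result, "ok") on success or (None, "timeout") if the
# deadline passes before the worker finishes.
from concurrent.futures import ThreadPoolExecutor, TimeoutError as FuturesTimeout

def check_with_timeout(fn, *args, timeout=10.0):
    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(fn, *args)
        try:
            return future.result(timeout=timeout), "ok"
        except FuturesTimeout:
            return None, "timeout"

result, status = check_with_timeout(lambda a, b: a == b, [1, 2], [1, 2])
print(result, status)  # True ok
```

Note that a thread cannot be forcibly killed, so real cancellation of a long-running check typically requires a separate process.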

Next steps:

  • clean up the preprocessing implementation
  • test the CLI for >1.0.0
  • update all tests for >1.0.0
  • clearly define how evaluation should be interpreted, with examples of edge cases

Installation

Quick install for users (pip)

pip install atommap-eval

For developers

# Clone the repo and install in editable mode
git clone https://github.com/yvsgrndjn/atommap_eval.git
cd atommap_eval
pip install -e ".[dev]"

or, if you want to create a new environment with Conda:

conda create -n atommap_eval python=3.9 -c conda-forge rdkit
conda activate atommap_eval
pip install -e ".[dev]"

Usage

Preprocessing

Preprocessing helps format atom-mapped reactions for a fair evaluation. It is split into two parts:

  • canonicalization + sanitization: sorts reaction SMILES and atom-map indices deterministically and sanitizes the reactions. Returns None if any of these steps fails (associated with flags A, B, C, S).
  • format analysis: raises a specific flag (D) if preprocessing succeeded but the reaction format will lead to a negative evaluation.
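To make the canonicalization step concrete, here is a hedged sketch of one piece of it: renumbering atom-map indices in order of first appearance, so that two reactions differing only in their arbitrary numbering become directly comparable. This is an illustration, not the package's actual preprocessing code:

```python
# Sketch: renumber atom-map indices (":N]" in SMILES) in order of
# first appearance, e.g. "[CH3:7][OH:2]" -> "[CH3:1][OH:2]".
import re

def renumber_atom_maps(smiles: str) -> str:
    mapping = {}

    def repl(m):
        old = m.group(1)
        if old not in mapping:
            mapping[old] = str(len(mapping) + 1)
        return ":" + mapping[old] + "]"

    return re.sub(r":(\d+)\]", repl, smiles)

print(renumber_atom_maps("[CH3:7][OH:2]"))  # [CH3:1][OH:2]
```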

To preprocess data, either use the simple wrapper if it matches your needs:

import atommap_eval.preprocess as preprocess

preprocessed_df = preprocess.preprocess_dataset(df, path_to_save)


If you have few examples, use the following:

# simple case
from atommap_eval.evaluator import are_atom_maps_equivalent

gt = "[C:1](=[O:2])[O-:3].[H+:4]>>[C:1](=[O:2])[OH:3]"
pred = "[H+:4].[C:1](=[O:2])[O-:3]>>[C:1](=[O:2])[OH:3]"
result = are_atom_maps_equivalent(gt, pred)
print(result) # True

However, if you have more reactions to evaluate, use:

from atommap_eval.data_models import ReactionPair
from atommap_eval.pair_evaluation import evaluate_pairs_batched

# `pairs` is either a list of (rxn1, rxn2) tuples or of ReactionPair objects
# from atommap_eval.data_models. For example, if you store reactions in
# `your_df` under columns "ground_truth_rxn" and "predicted_rxn":
pairs = [
    ReactionPair(row.ground_truth_rxn, row.predicted_rxn)
    for _, row in your_df.iterrows()
]

results = evaluate_pairs_batched(pairs)
# results is a list of tuples (result: bool, status: str) where status can be "ok", "timeout", "error:{e}"
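Given results in that shape, a summary such as an equivalence rate over the successfully evaluated pairs is straightforward to compute (the `results` list below is made-up example data):

```python
# Summarize batched results of shape (result: bool | None, status: str).
results = [(True, "ok"), (False, "ok"), (None, "timeout"), (True, "ok")]

n_ok = sum(1 for _, status in results if status == "ok")
n_equiv = sum(1 for res, status in results if status == "ok" and res)
rate = n_equiv / n_ok if n_ok else 0.0
print(f"{n_equiv}/{n_ok} equivalent ({rate:.0%})")  # 2/3 equivalent (67%)
```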

CLI

atommap_eval reactions.csv -f csv -p 4 -o results.csv

Project structure

src/atommap_eval/
├── preprocess.py
├── cli.py
├── data_models.py
├── evaluator.py
├── input_io.py
├── pair_evaluation.py
├── rxn_graph.py
├── rxnmapper_utils.py
tests/

Development

Run tests:

make test

Format code:

make format

Lint:

make lint

Test examples

Unit tests are located under tests/ and cover evaluator logic, CLI execution, and multiprocessing correctness.

License

MIT License © 2025 Yves Grandjean
