
AlphaGrammar: Grammar-based molecular representation learning

Project description

AlphaGrammar

AlphaGrammar is a grammar-based molecular representation learning framework. It learns a hyperedge-replacement grammar (HRG) over molecular graphs via Markov chain Monte Carlo (MCMC) sampling guided by a learned agent.

Installation

pip install -e /path/to/AlphaGrammar_pkg

Or from PyPI:

pip install alphagrammar

Pretrained models

After installation, copy the pretrained model files to the package data directory:

DATA_DIR=$(python -c "import alphagrammar, os; print(os.path.join(os.path.dirname(alphagrammar.__file__), 'data'))")
cp /path/to/AlphaGrammar/ckpts/vocab_epoch5.pkl $DATA_DIR/
cp /path/to/AlphaGrammar/ckpts/best_agent_epoch0_R0.0000.pkl $DATA_DIR/
cp /path/to/AlphaGrammar/GCN/supervised_contextpred.pth $DATA_DIR/
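To confirm the copy step succeeded, a small check like the one below can report which expected files are still missing from the package data directory. This is a sketch, not part of the AlphaGrammar API; the helper name is ours, and the file names are the ones shown above (adjust them if your checkpoints differ):

```python
import os

# Pretrained files the instructions above copy into the data/ directory.
EXPECTED_FILES = [
    "vocab_epoch5.pkl",
    "best_agent_epoch0_R0.0000.pkl",
    "supervised_contextpred.pth",
]

def missing_pretrained_files(package_dir, expected=EXPECTED_FILES):
    """Return the expected pretrained files absent from package_dir/data."""
    data_dir = os.path.join(package_dir, "data")
    return [name for name in expected
            if not os.path.isfile(os.path.join(data_dir, name))]

# Usage: package_dir would be os.path.dirname(alphagrammar.__file__),
# i.e. the same directory the DATA_DIR shell snippet above resolves.
```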

Usage

Command-line interface

# Parse a single SMILES string (rollout mode)
alphagrammar parse "CCO"

# Parse from a file (one SMILES per line)
alphagrammar parse molecules.smi

# Parse with Bolinas parser (10-second timeout per molecule)
alphagrammar parse "CCO" --timeout 10

# Use only top-100 rules from vocab
alphagrammar parse "CCO" --vocab_size 100
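For file-based parsing, the input is plain text with one SMILES string per line. A minimal sketch for writing such a file (the molecule list is illustrative):

```python
# Write a .smi input file: one SMILES string per line, the format
# expected by `alphagrammar parse molecules.smi`.
smiles = ["CCO", "c1ccccc1", "CC(=O)O"]  # illustrative molecules
with open("molecules.smi", "w") as f:
    f.write("\n".join(smiles) + "\n")
```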

Python API

from alphagrammar import MoleculeDataset, _collate_mol_batch, RuleStats
from alphagrammar.grammar_generation import MCMC_sampling
from alphagrammar.agent import Agent
import torch, pickle

# Load pretrained models
with open("data/vocab_epoch5.pkl", "rb") as f:
    rule_stats = pickle.load(f)

agent = Agent(feat_dim=300, hidden_size=256)
agent.load_state_dict(torch.load("data/best_agent_epoch0_R0.0000.pkl", map_location="cpu"))
agent.eval()

# Build input dataset
dataset = MoleculeDataset(["CCO", "c1ccccc1"], GNN_model_path="data/supervised_contextpred.pth")
batch = [dataset[i] for i in range(len(dataset))]
input_graphs_dict = _collate_mol_batch(batch)

# Run MCMC sampling
results, rules_per_mol, sequential_steps = MCMC_sampling(
    "output_dir", agent, input_graphs_dict, MCMC_size=1, debug=True
)

Package structure

src/alphagrammar/
├── __init__.py          # Public API
├── cli.py               # argparse CLI entry point
├── core.py              # Core functions (bolinas_evaluate, MoleculeDataset, RuleStats, ...)
├── grammar_generation.py # MCMC sampling and grammar generation
├── agent.py             # Neural agent (policy network)
├── hrg_td_parser_undirected.py  # Bolinas HRG parser
├── private/             # Internal hypergraph / grammar data structures
├── fuseprop/            # Molecular fragmentation utilities
├── GCN/                 # Graph neural network feature extraction
├── data/                # Pretrained model files (copy here after install)
└── commands/
    └── parse.py         # 'alphagrammar parse' subcommand

Download files

Download the file for your platform.

Source Distribution

alphagrammar-0.1.0.tar.gz (7.4 MB)

Uploaded Source

Built Distribution


alphagrammar-0.1.0-py3-none-any.whl (7.4 MB)

Uploaded Python 3

File details

Details for the file alphagrammar-0.1.0.tar.gz.

File metadata

  • Download URL: alphagrammar-0.1.0.tar.gz
  • Upload date:
  • Size: 7.4 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.8.20

File hashes

Hashes for alphagrammar-0.1.0.tar.gz
Algorithm Hash digest
SHA256 298ae73e983af7ad81b1f85952bcac75718fa06385e498d20dcaf7cb9436fadb
MD5 205a669a11db9bcb95e3aab72d171a29
BLAKE2b-256 a2ed409f01153ec9b11181a2f19ea0170263d7784a97d188e72b3466841434fc

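The SHA256 digests listed for each distribution can be verified locally after downloading. A sketch using only Python's standard library (the file path is hypothetical):

```python
import hashlib

def sha256_hexdigest(path, chunk_size=1 << 20):
    """Compute the SHA256 hex digest of a file, reading in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the result against the SHA256 value listed above:
# sha256_hexdigest("alphagrammar-0.1.0.tar.gz")
```

Alternatively, pip's hash-checking mode (`pip install --require-hashes -r requirements.txt`) verifies digests automatically at install time.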

File details

Details for the file alphagrammar-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: alphagrammar-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 7.4 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.8.20

File hashes

Hashes for alphagrammar-0.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 05254d9c2c70bb9d1e8a79c7e20cd8b99dbdcc8554b633a8e386734f73e3a312
MD5 ef75014aea10d0ba2ce0e0442078334d
BLAKE2b-256 276687a7dfea6eda3fe77220d483d8b1116e92848360ecafe68a913dcf4c8671

