RingSAGE: Cycle-Aware Molecular Regression with Virtual Nodes
RingSAGE is a novel Graph Neural Network (GNN) architecture designed for cycle detection and molecular property prediction. By incorporating virtual nodes into the message-passing framework, RingSAGE captures crucial cyclic information in molecules, improving performance on tasks such as molecular regression and classification.
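To illustrate the virtual-node idea, a ring's atoms can all be wired to one extra node that aggregates and redistributes ring-level messages. The sketch below is a toy illustration only; the helper `add_ring_virtual_node` is invented for this example and is not RingSAGE's actual implementation:

```python
import torch

def add_ring_virtual_node(edge_index: torch.Tensor, num_nodes: int, ring: list) -> torch.Tensor:
    """Append one virtual node connected (bidirectionally) to every atom in `ring`."""
    v = num_nodes  # index of the new virtual node
    ring_t = torch.tensor(ring, dtype=torch.long)
    new_edges = torch.stack([
        torch.cat([ring_t, torch.full_like(ring_t, v)]),  # ring -> virtual
        torch.cat([torch.full_like(ring_t, v), ring_t]),  # virtual -> ring
    ])
    return torch.cat([edge_index, new_edges], dim=1)

# A triangle (3-atom ring) with undirected edges stored as directed pairs:
edge_index = torch.tensor([[0, 1, 1, 2, 2, 0],
                           [1, 0, 2, 1, 0, 2]])
augmented = add_ring_virtual_node(edge_index, num_nodes=3, ring=[0, 1, 2])
print(augmented.shape)  # torch.Size([2, 12]) — 6 original + 6 virtual-node edges
```

Because every ring atom is one hop from the virtual node, cyclic information propagates across the whole ring in a single message-passing step.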
Features
- Automatically detects minimal cycles in molecular graphs using NetworkX.
- Incorporates virtual nodes (via RingSAGE layers) to enrich node representations with ring-level information.
- Provides a consistent interface for training, validation, and testing in PyTorch Geometric.
- Supports multiple GNN backbones, including GCN, GIN, GAT, and (by default) RingSAGE.
- Handles both graph classification and regression tasks.
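For intuition on the first feature, minimal-cycle detection can be sketched directly with NetworkX. The toy graph below, a six-ring fused with a three-ring, is hypothetical and not part of the ringsage API:

```python
import networkx as nx

# Toy molecular skeleton: a 6-ring fused with a 3-ring along edge (0, 1).
G = nx.Graph()
G.add_edges_from([(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0),  # 6-ring
                  (0, 6), (6, 1)])                                  # 3-ring

# minimum_cycle_basis returns a smallest set of independent cycles.
cycles = nx.minimum_cycle_basis(G)
print(sorted(len(c) for c in cycles))  # [3, 6]
```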
Installation
To install RingSAGE, simply run:
```bash
pip install ringsage
```
Quick Start
Below is a minimal example of how you might use RingSAGE within a training script:
```python
import torch
from torch_geometric.datasets import ZINC
from torch_geometric.loader import DataLoader

from ringsage.model import MoleculeRegressor
from ringsage.types import GNN, Task, Optimizer, Scheduler
from ringsage.schemas import ModelConfig
from ringsage.train import train
from ringsage.utils import cycle_collate_fn

# Load the ZINC subset and attach the cycle-aware collate function.
train_dataset = ZINC("", subset=True, split="train")
val_dataset = ZINC("", subset=True, split="val")
test_dataset = ZINC("", subset=True, split="test")

train_loader = DataLoader(train_dataset, batch_size=128, shuffle=True, collate_fn=cycle_collate_fn)
val_loader = DataLoader(val_dataset, batch_size=128, collate_fn=cycle_collate_fn)
test_loader = DataLoader(test_dataset, batch_size=128, collate_fn=cycle_collate_fn)

config = ModelConfig(
    task_type=Task.GRAPH_REGRESSION,
    num_features=train_dataset.num_node_features,
    hidden_channels=128,
    num_classes=1,
    gnn_depth=4,
    gnn_module=GNN.RINGSAGE,
    scheduler=Scheduler.COSINE,
    optimizer=Optimizer.ADAM,
    num_edge_features=train_dataset.num_edge_features,
    num_node_features=train_dataset.num_node_features,
)

model = MoleculeRegressor(config)
print(f"Number of parameters: {sum(p.numel() for p in model.parameters())}")

optimizer = model.configure_optimizers(lr=1e-3)
scheduler = model.configure_schedulers(optimizer, T_max=100)

report = train(
    model,
    train_loader,
    val_loader,
    optimizer,
    scheduler,
    epochs=10,
    device="cuda" if torch.cuda.is_available() else "cpu",
    val_metric="regression",
)
print("Training complete! Final report:", report)
```
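The script above leaves `test_loader` unused. A generic evaluation loop might look like the following sketch, which assumes `model(batch)` returns per-graph predictions and `batch.y` holds the ZINC regression targets; the actual ringsage API may differ:

```python
import torch

@torch.no_grad()
def evaluate_mae(model, loader, device="cpu"):
    """Mean absolute error over a loader. Assumes model(batch) returns a
    [num_graphs, 1] prediction tensor and batch.y holds the targets."""
    model.eval()
    total_err, total_graphs = 0.0, 0
    for batch in loader:
        batch = batch.to(device)
        pred = model(batch).view(-1)
        total_err += (pred - batch.y.view(-1)).abs().sum().item()
        total_graphs += batch.num_graphs
    return total_err / total_graphs

# test_mae = evaluate_mae(model, test_loader)
```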
Contributing
We welcome contributions! To contribute, please clone the repository and open a pull request with any improvements or fixes.
License
This project is licensed under the Apache License, Version 2.0. See the LICENSE file for details.
Citation
If you use RingSAGE in your research, please cite the repository and any relevant papers. We appreciate your support of open-source software.
File details
Details for the file ringsage-0.1.1.tar.gz.
File metadata
- Download URL: ringsage-0.1.1.tar.gz
- Upload date:
- Size: 14.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.2 CPython/3.10.12 Linux/6.9.3-76060903-generic
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 9fc63803e74f9f256019df899d382634aa88d53a6fad7b37a38f7d45959de667 |
| MD5 | 527d176d1218e66104d852514fa985cc |
| BLAKE2b-256 | e76038f53d85ab80bfc0596fdacfea0b0200f4fd7d381f0976b2efc373d37216 |
File details
Details for the file ringsage-0.1.1-py3-none-any.whl.
File metadata
- Download URL: ringsage-0.1.1-py3-none-any.whl
- Upload date:
- Size: 15.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.2 CPython/3.10.12 Linux/6.9.3-76060903-generic
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 990f31a883e9234857ebc3080ee6bfc2847f3bd76435b01c7e096858a2e95555 |
| MD5 | dec16e3217781eb9b542d835897adedf |
| BLAKE2b-256 | 4c244c91f03e91eafc0392489bb7ebf5862001941b6212c81407765ad08eb83c |