# AnyPINN
> **Work in Progress** — This project is under active development and APIs may change. If you run into any issues, please open an issue on GitHub.
A modular Python library for solving differential equations with Physics-Informed Neural Networks.
AnyPINN lets you go from zero to a running PINN experiment in seconds, or gives you full control to define custom physics, constraints, and training loops. You decide how deep to go.
## 🚀 Quick Start
The fastest way to start is the bootstrap CLI. It scaffolds a complete, runnable project interactively. Run it with `uvx` (ships with `uv`):

```bash
uvx anypinn create my-project
```

or with `pipx`:

```bash
pipx run anypinn create my-project
```
```text
? Choose a starting point:
> SIR Epidemic Model
  ...
  Custom ODE
  Blank project
? Select training data source:
> Generate synthetic data
  Load from CSV
? Include Lightning training wrapper? (Y/n)

Creating my-project/
  ✓ pyproject.toml   project metadata & dependencies
  ✓ ode.py           your ODE definition
  ✓ config.py        hyperparameters with sensible defaults
  ✓ train.py         ready-to-run training script
  ✓ data/            data directory

Done! Run: cd my-project && uv sync && uv run train.py
```
All prompts are also available as flags to skip the interactive flow:

```bash
anypinn create my-project \
  --template sir \
  --data synthetic \
  --lightning
```
| Flag | Values | Description |
|---|---|---|
| `--help`, `-h` | — | Show help and exit |
| `--list-templates`, `-l` | — | Print all templates with descriptions and exit |
| `--template`, `-t` | built-in template name, `custom`, or `blank` | Starting template |
| `--data`, `-d` | `synthetic`, `csv` | Training data source |
| `--lightning`, `-L` | — | Include PyTorch Lightning wrapper |
| `--no-lightning`, `-NL` | — | Exclude PyTorch Lightning wrapper |
## 👥 Who Is This For?
AnyPINN is built around progressive complexity. Start simple, go deeper only when you need to.
| User | Goal | How |
|---|---|---|
| Experimenter | Run a known problem, tweak parameters, see results | Pick a built-in template, change config, press start |
| Researcher | Define new physics or custom constraints | Subclass `Constraint` and `Problem`, use the provided training engine |
| Framework builder | Custom training loops, novel architectures | Use `anypinn.core` directly — zero Lightning required |
## 💡 Examples
The `examples/` directory has ready-made, self-contained scripts covering epidemic models, oscillators, predator-prey dynamics, and more — from a minimal ~80-line core-only script to full Lightning stacks. They're a great source of inspiration when defining your own problem.
## 🔬 Defining Your Own Problem
If you want to go beyond the built-in templates, here is the full workflow for defining a custom ODE inverse problem.
### 1. Define the ODE

Implement a function matching the `ODECallable` protocol:

```python
from torch import Tensor

from anypinn.core import ArgsRegistry


def my_ode(x: Tensor, y: Tensor, args: ArgsRegistry) -> Tensor:
    """Return dy/dx given current state y and position x."""
    k = args["k"](x)  # learnable or fixed parameter
    return -k * y     # simple exponential decay
```
### 2. Configure hyperparameters

```python
from dataclasses import dataclass

from anypinn.problems import ODEHyperparameters


@dataclass(frozen=True, kw_only=True)
class MyHyperparameters(ODEHyperparameters):
    pde_weight: float = 1.0
    ic_weight: float = 10.0
    data_weight: float = 5.0
```
### 3. Build the problem

```python
from anypinn.problems import ODEInverseProblem, ODEProperties

props = ODEProperties(ode=my_ode, args={"k": param}, y0=y0)
problem = ODEInverseProblem(
    ode_props=props,
    fields={"u": field},
    params={"k": param},
    hp=hp,
)
```
### 4. Train

```python
import pytorch_lightning as pl
import torch

from anypinn.lightning import PINNModule

# With Lightning (batteries included)
module = PINNModule(problem, hp)
trainer = pl.Trainer(max_epochs=50_000)
trainer.fit(module, datamodule=dm)

# Or with your own training loop (core only, no Lightning)
optimizer = torch.optim.Adam(problem.parameters(), lr=1e-3)
for batch in dataloader:
    optimizer.zero_grad()
    loss = problem.training_loss(batch, log=my_log_fn)
    loss.backward()
    optimizer.step()
```
## 🏗️ Architecture
AnyPINN is split into four layers with a strict dependency direction — outer layers depend on inner ones, never the reverse.
```mermaid
graph TD
    EXP["Your Experiment / Generated Project"]
    EXP --> CAT
    EXP --> LIT

    subgraph CAT["anypinn.catalog"]
        direction LR
        CA1[SIR / SEIR]
        CA2[DampedOscillator]
        CA3[LotkaVolterra]
    end

    subgraph LIT["anypinn.lightning (optional)"]
        direction LR
        L1[PINNModule]
        L2[Callbacks]
        L3[PINNDataModule]
    end

    subgraph PROB["anypinn.problems"]
        direction LR
        P1[ResidualsConstraint]
        P2[ICConstraint]
        P3[DataConstraint]
        P4[ODEInverseProblem]
    end

    subgraph CORE["anypinn.core (standalone · pure PyTorch)"]
        direction LR
        C1[Problem · Constraint]
        C2[Field · Parameter]
        C3[Config · Context]
    end

    CAT -->|depends on| PROB
    CAT -->|depends on| CORE
    LIT -->|depends on| CORE
    PROB -->|depends on| CORE
```
### `anypinn.core` — The Math Layer
Pure PyTorch. Defines what a PINN problem is, with no opinions about training.
- `Problem` — Aggregates constraints, fields, and parameters. Provides `training_loss()` and `predict()`.
- `Constraint` (ABC) — A single loss term. Subclass it to express any physics equation, boundary condition, or data-matching objective.
- `Field` — MLP mapping input coordinates to state variables (e.g., `t → [S, I, R]`).
- `Parameter` — Learnable scalar or function-valued parameter (e.g., `β` in SIR).
- `InferredContext` — Runtime domain bounds and validation references, extracted from data and injected into constraints automatically.
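To make the Field idea concrete, here is a minimal pure-PyTorch sketch (not AnyPINN's actual class; `TinyField` and its layer sizes are hypothetical) of an MLP mapping a time coordinate `t` to the state `[S, I, R]`:

```python
import torch
from torch import nn


class TinyField(nn.Module):
    """Hypothetical stand-in for a Field: MLP from t to [S, I, R]."""

    def __init__(self, hidden: int = 32) -> None:
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 3),  # three outputs: S, I, R
        )

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        return self.net(t)


field = TinyField()
t = torch.linspace(0.0, 1.0, 50).unsqueeze(-1)  # (50, 1) column of times
y = field(t)                                    # (50, 3) predicted states
```

The smooth `Tanh` activations matter for PINNs: the network must be differentiated through autograd, so piecewise-linear activations like ReLU would give degenerate second derivatives.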
### `anypinn.lightning` — The Training Engine (optional)
A thin wrapper plugging a Problem into PyTorch Lightning:
- `PINNModule` — `LightningModule` wrapping any `Problem`. Handles optimizer setup, context injection, and prediction.
- `PINNDataModule` — Abstract data module managing loading, config-driven collocation sampling, and context creation. Collocation strategy is selected via `TrainingDataConfig.collocation_sampler` (`"random"`, `"uniform"`, `"latin_hypercube"`, `"log_uniform_1d"`, or `"adaptive"`).
- Callbacks — SMMA-based early stopping, formatted progress bars, data scaling, prediction writers.
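As a standalone illustration of one of the sampler names above, here is a sketch of 1D latin hypercube sampling (this is generic PyTorch, not AnyPINN's implementation): the domain is split into `n` equal-width bins and one point is drawn per bin, guaranteeing even coverage that plain random sampling lacks.

```python
import torch


def latin_hypercube_1d(n: int, lo: float, hi: float) -> torch.Tensor:
    """Draw one stratified sample per equal-width bin; returns shape (n, 1)."""
    width = (hi - lo) / n
    strata = torch.arange(n, dtype=torch.float32)   # bin indices 0 .. n-1
    pts = lo + (strata + torch.rand(n)) * width     # one point inside each bin
    return pts[torch.randperm(n)].unsqueeze(-1)     # shuffled, column-shaped


pts = latin_hypercube_1d(16, 0.0, 10.0)  # 16 collocation points in [0, 10)
```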
### `anypinn.problems` — ODE Building Blocks
Ready-made constraints for ODE inverse problems:
- `ResidualsConstraint` — `‖dy/dt − f(t, y)‖²` via autograd
- `ICConstraint` — `‖y(t₀) − y₀‖²`
- `DataConstraint` — `‖prediction − observed data‖²`
- `ODEInverseProblem` — Composes all three with configurable weights
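The residual term can be sketched in plain PyTorch (this mirrors the idea behind `ResidualsConstraint`, not its actual code): differentiate the prediction with autograd and penalize `‖dy/dt − f(t, y)‖²` at collocation points. Here the "prediction" is the exact solution of `dy/dt = −k·y`, so the loss comes out essentially zero.

```python
import torch

k = 2.0
t = torch.linspace(0.0, 1.0, 100, requires_grad=True).unsqueeze(-1)
y = torch.exp(-k * t)  # stand-in prediction: exact solution y(t) = e^{-kt}

# dy/dt via autograd; create_graph=True would let the loss itself be trained on
(dydt,) = torch.autograd.grad(
    y, t, grad_outputs=torch.ones_like(y), create_graph=True
)
residual = dydt - (-k * y)       # f(t, y) = -k*y for exponential decay
pde_loss = residual.pow(2).mean()  # ≈ 0 for the exact solution
```

With a real `Field` network in place of the closed-form `y`, this same quantity is what gradient descent drives toward zero.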
### `anypinn.catalog` — Problem-Specific Building Blocks
Drop-in ODE functions and DataModules for specific systems. See `anypinn/catalog/` for the full list.
## 🛠️ Tooling
| Tool | Purpose |
|---|---|
| uv | Dependency management |
| just | Task automation |
| Ruff | Linting and formatting |
| pytest | Testing |
| ty | Type checking |
All common tasks (test, lint, format, type-check, docs) are available via `just`.
**devenv users:** devenv redirects `uv sync` to install into `.devenv/state/venv` instead of the standard `.venv`, so ty cannot auto-discover it. Create a gitignored `ty.toml` at the project root with:

```toml
[environment]
python-version = "3.13"
python = "./.devenv/state/venv"
root = ["./src"]
```

(`ty.toml` takes full precedence over `pyproject.toml`, so all three settings are required.)
## 🤝 Contributing
See CONTRIBUTING.md for setup instructions, code style guidelines, and the pull request workflow.