AnyPINN

Work in Progress — This project is under active development and APIs may change. If you run into any issues, please open an issue on GitHub.

A modular Python library for solving differential equations with Physics-Informed Neural Networks.

AnyPINN lets you go from zero to a running PINN experiment in seconds, or gives you full control to define custom physics, constraints, and training loops. You decide how deep to go.

🚀 Quick Start

The fastest way to start is the bootstrap CLI. It scaffolds a complete, runnable project interactively. Run it with uvx (which ships with uv):

uvx anypinn create my-project

or with pipx:

pipx run anypinn create my-project

? Choose a starting point:
  > SIR Epidemic Model
    ...
    Custom ODE
    Blank project

? Select training data source:
  > Generate synthetic data
    Load from CSV

? Include Lightning training wrapper? (Y/n)

Creating my-project/
  ✓  pyproject.toml   project metadata & dependencies
  ✓  ode.py           your ODE definition
  ✓  config.py        hyperparameters with sensible defaults
  ✓  train.py         ready-to-run training script
  ✓  data/            data directory

  Done! Run:  cd my-project && uv sync && uv run train.py

All prompts are also available as flags to skip the interactive flow:

anypinn create my-project \
  --template sir \
  --data synthetic \
  --lightning
| Flag | Values | Description |
| --- | --- | --- |
| --help, -h | | Show help and exit |
| --list-templates, -l | | Print all templates with descriptions and exit |
| --template, -t | built-in template name, custom, or blank | Starting template |
| --data, -d | synthetic, csv | Training data source |
| --lightning, -L | | Include PyTorch Lightning wrapper |
| --no-lightning, -NL | | Exclude PyTorch Lightning wrapper |

👥 Who Is This For?

AnyPINN is built around progressive complexity. Start simple, go deeper only when you need to.

| User | Goal | How |
| --- | --- | --- |
| Experimenter | Run a known problem, tweak parameters, see results | Pick a built-in template, change config, press start |
| Researcher | Define new physics or custom constraints | Subclass Constraint and Problem, use the provided training engine |
| Framework builder | Custom training loops, novel architectures | Use anypinn.core directly; zero Lightning required |

💡 Examples

The examples/ directory has ready-made, self-contained scripts covering epidemic models, oscillators, predator-prey dynamics, and more — from a minimal ~80-line core-only script to full Lightning stacks. They're a great source of inspiration when defining your own problem.

🔬 Defining Your Own Problem

If you want to go beyond the built-in templates, here is the full workflow for defining a custom ODE inverse problem.

1: Define the ODE

Implement a function matching the ODECallable protocol:

from torch import Tensor
from anypinn.core import ArgsRegistry

def my_ode(x: Tensor, y: Tensor, args: ArgsRegistry) -> Tensor:
    """Return dy/dx given current state y and position x."""
    k = args["k"](x)        # learnable or fixed parameter
    return -k * y           # simple exponential decay
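For this right-hand side with a fixed scalar k, the exact solution is y(t) = y₀·e^(−kt), which gives a handy sanity check for any trained model. A minimal pure-Python sketch (the forward-Euler stepper below is illustrative only, not part of AnyPINN):

```python
import math

def decay_rhs(t, y, k):
    # dy/dt = -k * y, the same dynamics as my_ode with a fixed scalar k
    return -k * y

def euler_solve(y0, k, t_end, steps):
    """Integrate dy/dt = -k*y with forward Euler."""
    dt = t_end / steps
    t, y = 0.0, y0
    for _ in range(steps):
        y += dt * decay_rhs(t, y, k)
        t += dt
    return y

y0, k, t_end = 1.0, 0.5, 2.0
numeric = euler_solve(y0, k, t_end, steps=10_000)
exact = y0 * math.exp(-k * t_end)  # closed-form solution y(t) = y0 * exp(-k*t)
assert abs(numeric - exact) < 1e-3
```

Comparing a PINN's prediction against a reference like this is a cheap way to catch a mis-specified ODE before a long training run.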

2: Configure hyperparameters

from dataclasses import dataclass
from anypinn.problems import ODEHyperparameters

@dataclass(frozen=True, kw_only=True)
class MyHyperparameters(ODEHyperparameters):
    pde_weight: float = 1.0
    ic_weight: float = 10.0
    data_weight: float = 5.0

3: Build the problem

from anypinn.problems import ODEInverseProblem, ODEProperties

# field, param, y0, and hp come from the steps above: a Field MLP for u,
# the learnable parameter k, the initial condition, and a
# MyHyperparameters instance.
props = ODEProperties(ode=my_ode, args={"k": param}, y0=y0)
problem = ODEInverseProblem(
    ode_props=props,
    fields={"u": field},
    params={"k": param},
    hp=hp,
)

4: Train

import torch
import pytorch_lightning as pl
from anypinn.lightning import PINNModule

# With Lightning (batteries included); dm is a PINNDataModule instance
module = PINNModule(problem, hp)
trainer = pl.Trainer(max_epochs=50_000)
trainer.fit(module, datamodule=dm)

# Or with your own training loop (core only, no Lightning)
optimizer = torch.optim.Adam(problem.parameters(), lr=1e-3)
for batch in dataloader:
    optimizer.zero_grad()
    loss = problem.training_loss(batch, log=my_log_fn)
    loss.backward()
    optimizer.step()

🏗️ Architecture

AnyPINN is split into four layers with a strict dependency direction — outer layers depend on inner ones, never the reverse.

graph TD
    EXP["Your Experiment / Generated Project"]

    EXP --> CAT
    EXP --> LIT

    subgraph CAT["anypinn.catalog"]
        direction LR
        CA1[SIR / SEIR]
        CA2[DampedOscillator]
        CA3[LotkaVolterra]
    end

    subgraph LIT["anypinn.lightning (optional)"]
        direction LR
        L1[PINNModule]
        L2[Callbacks]
        L3[PINNDataModule]
    end

    subgraph PROB["anypinn.problems"]
        direction LR
        P1[ResidualsConstraint]
        P2[ICConstraint]
        P3[DataConstraint]
        P4[ODEInverseProblem]
    end

    subgraph CORE["anypinn.core (standalone · pure PyTorch)"]
        direction LR
        C1[Problem · Constraint]
        C2[Field · Parameter]
        C3[Config · Context]
    end

    CAT -->|depends on| PROB
    CAT -->|depends on| CORE
    LIT -->|depends on| CORE
    PROB -->|depends on| CORE

anypinn.core — The Math Layer

Pure PyTorch. Defines what a PINN problem is, with no opinions about training.

  • Problem — Aggregates constraints, fields, and parameters. Provides training_loss() and predict().
  • Constraint (ABC) — A single loss term. Subclass it to express any physics equation, boundary condition, or data-matching objective.
  • Field — MLP mapping input coordinates to state variables (e.g., t → [S, I, R]).
  • Parameter — Learnable scalar or function-valued parameter (e.g., β in SIR).
  • InferredContext — Runtime domain bounds and validation references, extracted from data and injected into constraints automatically.
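The division of labor above can be sketched in a few lines of plain Python. This is a toy mirror of the Problem/Constraint relationship, not the real API: the actual classes are torch Modules with richer signatures, and MSEData here is a made-up stand-in for a data-matching constraint:

```python
from abc import ABC, abstractmethod

class Constraint(ABC):
    """One weighted loss term, mirroring the role of anypinn.core.Constraint."""
    def __init__(self, weight: float):
        self.weight = weight

    @abstractmethod
    def loss(self, batch) -> float: ...

class MSEData(Constraint):
    """Data-matching term: mean squared error against observations."""
    def __init__(self, weight: float, model):
        super().__init__(weight)
        self.model = model

    def loss(self, batch):
        return sum((self.model(x) - y) ** 2 for x, y in batch) / len(batch)

class Problem:
    """Aggregates constraints; the training loss is their weighted sum."""
    def __init__(self, constraints):
        self.constraints = constraints

    def training_loss(self, batch):
        return sum(c.weight * c.loss(batch) for c in self.constraints)

model = lambda x: 2 * x           # stand-in for a Field MLP
batch = [(1.0, 2.0), (2.0, 4.0)]  # fit perfectly by the model
problem = Problem([MSEData(weight=5.0, model=model)])
assert problem.training_loss(batch) == 0.0
```

New physics or data objectives then only require a new Constraint subclass; the Problem-level aggregation never changes.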

anypinn.lightning — The Training Engine (optional)

A thin wrapper plugging a Problem into PyTorch Lightning:

  • PINNModule — LightningModule wrapping any Problem. Handles optimizer setup, context injection, and prediction.
  • PINNDataModule — Abstract data module managing loading, config-driven collocation sampling, and context creation. Collocation strategy is selected via TrainingDataConfig.collocation_sampler ("random", "uniform", "latin_hypercube", "log_uniform_1d", or "adaptive").
  • Callbacks — SMMA-based early stopping, formatted progress bars, data scaling, prediction writers.
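For intuition about one of those named strategies, a 1D Latin-hypercube sampler can be written in a few lines. This is a generic sketch of the technique, not AnyPINN's implementation:

```python
import random

def latin_hypercube_1d(n, lo, hi, seed=0):
    """Draw n collocation points in [lo, hi): one uniform sample per
    equal-width stratum, shuffled so ordering carries no structure."""
    rng = random.Random(seed)
    width = (hi - lo) / n
    pts = [lo + (i + rng.random()) * width for i in range(n)]
    rng.shuffle(pts)
    return pts

pts = latin_hypercube_1d(10, 0.0, 1.0)
# every stratum [i/10, (i+1)/10) contains exactly one point
assert sorted(int(p * 10) for p in pts) == list(range(10))
```

Compared with plain random sampling, the stratification guarantees coverage of the whole domain even for small point counts, which matters when the residual loss is only evaluated at the collocation points.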

anypinn.problems — ODE Building Blocks

Ready-made constraints for ODE inverse problems:

  • ResidualsConstraint — ‖dy/dt − f(t, y)‖² via autograd
  • ICConstraint — ‖y(t₀) − y₀‖²
  • DataConstraint — ‖prediction − observed data‖²
  • ODEInverseProblem — Composes all three with configurable weights
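As a plain-number picture of what ResidualsConstraint measures, here is the same loss term with the autograd derivative replaced by a central finite difference (an illustrative stand-in, not how AnyPINN computes it):

```python
import math

def residual_sq(y, f, t, h=1e-5):
    """‖dy/dt − f(t, y(t))‖² at one collocation point, with dy/dt
    approximated by a central finite difference."""
    dydt = (y(t + h) - y(t - h)) / (2 * h)
    return (dydt - f(t, y(t))) ** 2

k = 0.5
exact = lambda t: math.exp(-k * t)  # solves dy/dt = -k*y with y(0) = 1
rhs = lambda t, y: -k * y
# the exact solution drives the residual to ~0 at every collocation point
assert all(residual_sq(exact, rhs, t) < 1e-10 for t in (0.1, 0.5, 1.0))
```

A network that satisfies the physics makes this term vanish everywhere; during training it is averaged over the sampled collocation points and weighted against the IC and data terms.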

anypinn.catalog — Problem-Specific Building Blocks

Drop-in ODE functions and DataModules for specific systems. See anypinn/catalog/ for the full list.

🛠️ Tooling

| Tool | Purpose |
| --- | --- |
| uv | Dependency management |
| just | Task automation |
| Ruff | Linting and formatting |
| pytest | Testing |
| ty | Type checking |

All common tasks (test, lint, format, type-check, docs) are available via just.

devenv users: devenv redirects uv sync installs to .devenv/state/venv instead of the standard .venv, so ty cannot auto-discover it. Create a gitignored ty.toml at the project root with:

[environment]
python-version = "3.13"
python = "./.devenv/state/venv"
root = ["./src"]

(ty.toml takes full precedence over pyproject.toml, so all three settings are required.)

🤝 Contributing

See CONTRIBUTING.md for setup instructions, code style guidelines, and the pull request workflow.

Download files

Download the file for your platform.

Source Distribution

anypinn-0.12.0.tar.gz (14.6 MB)

Uploaded Source

Built Distribution


anypinn-0.12.0-py3-none-any.whl (90.8 kB)

Uploaded Python 3

File details

Details for the file anypinn-0.12.0.tar.gz.

File metadata

  • Download URL: anypinn-0.12.0.tar.gz
  • Upload date:
  • Size: 14.6 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for anypinn-0.12.0.tar.gz:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | cd8c6e455ddd042a68babcb84f4d4cf0656229723e644bd3b84b09bc8017deaf |
| MD5 | 3b76931750e5299ff60cfc9ec3674a3b |
| BLAKE2b-256 | c74e5d6a2bc91f445464853459455112a7867a7c3d5fdd87a635e7162128d4a4 |


Provenance

The following attestation bundles were made for anypinn-0.12.0.tar.gz:

Publisher: release.yaml on giacomoguidotto/anypinn

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file anypinn-0.12.0-py3-none-any.whl.

File metadata

  • Download URL: anypinn-0.12.0-py3-none-any.whl
  • Upload date:
  • Size: 90.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for anypinn-0.12.0-py3-none-any.whl:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | cb33248fa760a885562ed28bf7b916c8b85d721934c37108eabdb351e968c502 |
| MD5 | d97089742c6c6d635c3f6b676ad77e50 |
| BLAKE2b-256 | b3f1af93132515df271835a1434e04387d6832d9b5b0acce21253d431bff6171 |


Provenance

The following attestation bundles were made for anypinn-0.12.0-py3-none-any.whl:

Publisher: release.yaml on giacomoguidotto/anypinn

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
