
Signed Common Meadow arithmetic for stable machine learning


ZeroProofML

Machine learning that handles division by zero gracefully


Built on Signed Common Meadow (SCM) semantics for totalized arithmetic

What is ZeroProofML?

ZeroProofML is a PyTorch library that enables neural networks to learn functions with singularities—like 1/x near zero or gravitational potentials—without numerical instabilities or undefined behavior. Instead of treating division by zero as an error, we use Signed Common Meadow (SCM) semantics to provide a mathematically rigorous foundation for arithmetic operations that remain well-defined everywhere.

Key capabilities

  • Totalized arithmetic: Singular operations map to an absorptive bottom element ⊥, tracked by explicit masks (and optionally represented as NaN payloads at strict decode time)
  • Smooth training: Optional projective mode with ⟨N,D⟩ tuples and "ghost gradients" keeps optimization stable near singularities
  • Strict inference: Decode projective outputs with configurable thresholds (τ_infer, τ_train) and obtain bottom_mask / gap_mask for rejection or safe fallbacks
  • Orientation tracking: Weak-sign protocol provides direction/orientation information near singularities
  • PyTorch integration: Drop-in rational layers, SCM-aware losses, and JIT-compatible operations
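To make the ⊥-with-mask convention concrete, here is a minimal, library-independent sketch of totalized division over Python lists. The real zeroproof.scm ops work on tensors; the function name scm_div and the (values, mask) return layout are illustrative assumptions, not the library's API.

```python
import math

def scm_div(numerators, denominators):
    """Totalized elementwise division: x/0 maps to the absorptive
    bottom element ⊥, encoded as a NaN payload plus an explicit
    boolean mask. The mask, not the NaN, is authoritative."""
    values, bottom_mask = [], []
    for n, d in zip(numerators, denominators):
        if d == 0.0 or math.isnan(n) or math.isnan(d):
            values.append(math.nan)   # NaN payload stands in for ⊥
            bottom_mask.append(True)  # ⊥ is absorptive: it propagates
        else:
            values.append(n / d)
            bottom_mask.append(False)
    return values, bottom_mask

vals, mask = scm_div([1.0, 2.0, 3.0], [2.0, 0.0, 1.5])
```

Because ⊥ is tracked by a mask rather than inferred from NaN payloads, downstream code can reject or substitute fallbacks for singular outputs without sniffing float bit patterns.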

When to use ZeroProofML

ZeroProofML excels in domains where singularities arise naturally:

  • Physics: Gravitational/electrostatic potentials (1/r), collision dynamics, quantum mechanics
  • Robotics: Inverse kinematics with joint limits, collision avoidance fields
  • Computer vision: Homogeneous coordinates, projective geometry, structure-from-motion
  • Scientific computing: Learning differential equations with singular solutions, rational approximations

Quick start

# install with a backend (PyPI)
pip install "zeroproofml[torch]"

# optional plotting/logging utilities
pip install "zeroproofml[viz]"

# from a repo checkout (development)
# pip install -e ".[dev,torch]"

import torch
from torch.utils.data import DataLoader, TensorDataset

from zeroproof.inference import InferenceConfig, SCMInferenceWrapper
from zeroproof.layers.projective_rational import ProjectiveRRModelConfig, RRProjectiveRationalModel
from zeroproof.losses.implicit import implicit_loss
from zeroproof.training import SCMTrainer, TrainingConfig

# Toy example: learn a 1D rational function (targets may include inf/NaN for singular labels).
x = torch.linspace(-1.0, 1.0, 2048).unsqueeze(-1)
y = 1.0 / (x + 0.1)
train_loader = DataLoader(TensorDataset(x, y), batch_size=256, shuffle=True)

model = RRProjectiveRationalModel(
    ProjectiveRRModelConfig(input_dim=1, output_dim=1, numerator_degree=3, denominator_degree=2)
)
wrapped = SCMInferenceWrapper(model, config=InferenceConfig(tau_infer=1e-6, tau_train=1e-4))

def loss_fn(outputs, lifted_targets):
    P, Q = outputs
    Y_n, Y_d = lifted_targets
    return implicit_loss(P, Q, Y_n, Y_d)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
trainer = SCMTrainer(
    model=model,
    optimizer=optimizer,
    loss_fn=loss_fn,
    train_loader=train_loader,
    config=TrainingConfig(max_epochs=20),
)
trainer.fit()

# Strict inference (decoded uses NaN for ⊥ payloads; masks are authoritative).
wrapped.eval()
decoded, bottom_mask, gap_mask = wrapped(x)

See examples/ and docs/ (start at docs/index.md) for end-to-end demos and recommended patterns.
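The implicit_loss used in the quick start avoids dividing by the learned denominator. A scalar sketch of the cross-multiplication idea is below; the (p² + q²) normalization is an assumption for illustration, not necessarily what zeroproof.losses.implicit computes.

```python
def implicit_loss_sketch(p, q, y_n, y_d, eps=1e-12):
    """Scalar sketch of a cross-multiplied ("implicit") loss:
    drive p/q toward y_n/y_d without ever dividing by q.
    The (p^2 + q^2) normalization is an illustrative assumption;
    it rules out the trivial solution p = q = 0 and removes the
    common scale of the projective tuple."""
    resid = p * y_d - q * y_n   # zero iff p/q == y_n/y_d projectively
    return resid ** 2 / max(p * p + q * q, eps)

# A singular label (y_d = 0 encodes a pole) still yields a finite loss:
loss_at_pole = implicit_loss_sketch(2.0, 1.0, 1.0, 0.0)
```

Since the target enters only through the pair (y_n, y_d), labels like inf can be lifted to tuples such as ⟨1, 0⟩ and trained on directly, which is why the quick start notes that targets may include inf/NaN.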

How it works

ZeroProofML implements a two-phase training paradigm:

  1. Training mode (smooth/projective): Networks learn rational functions as projective tuples ⟨N,D⟩ with detached normalization, allowing smooth gradients even when denominators approach zero
  2. Inference mode (strict SCM): Outputs are decoded with configurable thresholds; small denominators trigger ⊥ and are surfaced via bottom_mask / gap_mask (decoded payloads use NaN for ⊥)

This approach combines the stability of smooth optimization with the rigor of totalized arithmetic, enabling reliable deployment in safety-critical domains.
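The strict decode step can be illustrated without the library. In the sketch below, decode_strict and the gap-mask convention (a denominator between τ_infer and τ_train marks a numerically fragile region) are assumptions for illustration, not ZeroProofML's actual API:

```python
import math

def decode_strict(n, d, tau_infer=1e-6, tau_train=1e-4):
    """Decode a projective tuple <N, D> under strict SCM semantics.
    |D| <= tau_infer              -> bottom: NaN payload, bottom_mask set
    tau_infer < |D| <= tau_train  -> fragile region: gap_mask set (assumed rule)
    otherwise                     -> ordinary quotient N / D"""
    if abs(d) <= tau_infer:
        return math.nan, True, False   # ⊥: masks, not NaN, are authoritative
    gap = abs(d) <= tau_train          # assumed gap-region convention
    return n / d, False, gap

val, bottom, gap = decode_strict(1.0, 0.5)   # well-conditioned quotient
val2, bottom2, _ = decode_strict(1.0, 0.0)   # singular: bottom_mask set
```

During training, by contrast, no division is performed at all: losses act on the tuple ⟨N, D⟩ itself, so gradients stay finite even as D approaches zero.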

Architecture

zeroproof/
├── scm/          # SCM values + totalized ops (⊥ semantics)
├── autodiff/     # Gradient policies + projective helpers
├── layers/       # PyTorch SCM layers (rational heads, normalization)
├── losses/       # SCM-aware losses (fit/margin/sign/rejection)
├── training/     # Trainer loop + target lifting utilities
├── inference/    # Export & deployment helpers
├── metrics/      # Pole & singularity metrics
└── utils/        # IEEE↔SCM bridge, visualization

Docs hub: docs/index.md | Getting started: docs/00_getting_started.md | Theory: docs/theory/00_overview.md | API: docs/08_api_reference.md

Development

Install development dependencies and run tests:

pip install -e ".[dev,torch]"
pytest -m "scm and not benchmark" tests -v

Run linting and type checking:

ruff check zeroproof tests
mypy --strict zeroproof

Run benchmarks:

python benchmarks/run_benchmarks.py --output benchmark_results --suite all

Acknowledgements

The theoretical foundation of this library builds on the pioneering work of Jan A. Bergstra and John V. Tucker on meadows and common meadows—algebraic structures that totalize the field of rational numbers by making division a total operation. Their formalization of division by zero as an absorptive element provides the mathematical rigor underlying our approach.

License

MIT License. See LICENSE for details.

