Signed Common Meadow arithmetic for stable machine learning
ZeroProofML
Machine learning that handles division by zero gracefully
Built on Signed Common Meadow (SCM) semantics for totalized arithmetic
What is ZeroProofML?
ZeroProofML is a PyTorch library that enables neural networks to learn functions with singularities—like 1/x near zero or gravitational potentials—without numerical instabilities or undefined behavior. Instead of treating division by zero as an error, we use Signed Common Meadow (SCM) semantics to provide a mathematically rigorous foundation for arithmetic operations that remain well-defined everywhere.
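To make this concrete, here is a minimal scalar sketch of totalized division under SCM-style semantics. This is illustrative plain Python, not the library's SCM type or API; the `total_div` name and the `(payload, is_bottom)` pairing are assumptions for the example.

```python
BOTTOM = float("nan")  # ⊥ payload; the explicit flag is the real carrier of ⊥

def total_div(n, d):
    """Totalized division: n / 0 yields (⊥-payload, True) instead of raising."""
    if d == 0.0:
        return BOTTOM, True   # (value payload, bottom flag)
    return n / d, False

print(total_div(1.0, 2.0))   # (0.5, False)
print(total_div(1.0, 0.0))   # (nan, True) — ⊥, not an exception
```

The point is that every input has a well-defined result: singular cases are first-class values carrying an explicit mask, rather than exceptions or silent IEEE infinities.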
Key capabilities
- Totalized arithmetic: Singular operations map to an absorptive bottom element ⊥, tracked by explicit masks (and optionally represented as NaN payloads at strict decode time)
- Smooth training: Optional projective mode with ⟨N, D⟩ tuples and "ghost gradients" keeps optimization stable near singularities
- Strict inference: Decode projective outputs with configurable thresholds (τ_infer, τ_train) and obtain bottom_mask/gap_mask for rejection or safe fallbacks
- Orientation tracking: Weak-sign protocol provides direction/orientation information near singularities
- PyTorch integration: Drop-in rational layers, SCM-aware losses, and JIT-compatible operations
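The absorptive behaviour of ⊥ can be sketched with plain Python `(payload, is_bottom)` pairs. The function names below are illustrative, not the library API; the key property is that ⊥ propagates through every operation, including multiplication by zero:

```python
def scm_add(a, b):
    # Each value is (payload, is_bottom); ⊥ absorbs every operation.
    (av, ab), (bv, bb) = a, b
    if ab or bb:
        return (float("nan"), True)
    return (av + bv, False)

def scm_mul(a, b):
    (av, ab), (bv, bb) = a, b
    if ab or bb:
        return (float("nan"), True)  # even 0 · ⊥ = ⊥, unlike IEEE 0 · inf
    return (av * bv, False)

bot = (float("nan"), True)
print(scm_mul((0.0, False), bot))  # (nan, True): 0 · ⊥ = ⊥, not 0
```

In the library itself this bookkeeping is carried by tensor masks rather than per-scalar tuples, but the algebraic law is the same.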
When to use ZeroProofML
ZeroProofML excels in domains where singularities arise naturally:
- Physics: Gravitational/electrostatic potentials (1/r), collision dynamics, quantum mechanics
- Robotics: Inverse kinematics with joint limits, collision avoidance fields
- Computer vision: Homogeneous coordinates, projective geometry, structure-from-motion
- Scientific computing: Learning differential equations with singular solutions, rational approximations
Quick start
```bash
# install with a backend
pip install -e ".[torch]"
```

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

from zeroproof.inference import InferenceConfig, SCMInferenceWrapper
from zeroproof.layers.projective_rational import ProjectiveRRModelConfig, RRProjectiveRationalModel
from zeroproof.losses.implicit import implicit_loss
from zeroproof.training import SCMTrainer, TrainingConfig

# Toy example: learn a 1D rational function (targets may include inf/NaN for singular labels).
x = torch.linspace(-1.0, 1.0, 2048).unsqueeze(-1)
y = 1.0 / (x + 0.1)
train_loader = DataLoader(TensorDataset(x, y), batch_size=256, shuffle=True)

model = RRProjectiveRationalModel(
    ProjectiveRRModelConfig(input_dim=1, output_dim=1, numerator_degree=3, denominator_degree=2)
)
wrapped = SCMInferenceWrapper(model, config=InferenceConfig(tau_infer=1e-6, tau_train=1e-4))

def loss_fn(outputs, lifted_targets):
    P, Q = outputs
    Y_n, Y_d = lifted_targets
    return implicit_loss(P, Q, Y_n, Y_d)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
trainer = SCMTrainer(
    model=model,
    optimizer=optimizer,
    loss_fn=loss_fn,
    train_loader=train_loader,
    config=TrainingConfig(max_epochs=20),
)
trainer.fit()

# Strict inference (decoded uses NaN for ⊥ payloads; masks are authoritative).
wrapped.eval()
decoded, bottom_mask, gap_mask = wrapped(x)
```
See examples/ and documentation_full.md for end-to-end demos and recommended patterns.
How it works
ZeroProofML implements a two-phase training paradigm:
- Training mode (smooth/projective): Networks learn rational functions as projective tuples ⟨N, D⟩ with detached normalization, allowing smooth gradients even when denominators approach zero
- Inference mode (strict SCM): Outputs are decoded with configurable thresholds; small denominators trigger ⊥ and are surfaced via bottom_mask/gap_mask (decoded payloads use NaN for ⊥)
This approach combines the stability of smooth optimization with the rigor of totalized arithmetic, enabling reliable deployment in safety-critical domains.
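As an illustration of the strict decode step, here is a scalar sketch. The exact interaction of τ_infer and τ_train is an assumption in this example (gap_mask flagging the band between the two thresholds); consult documentation_full.md for the library's actual decode rules:

```python
def strict_decode(N, D, tau_infer=1e-6, tau_train=1e-4):
    """Decode a projective pair ⟨N, D⟩ under strict SCM semantics (sketch).

    |D| <  tau_infer              -> ⊥ (NaN payload, bottom flag set)
    tau_infer <= |D| < tau_train  -> decodable, but flagged as in the gap band
    """
    bottom = abs(D) < tau_infer
    gap = (not bottom) and abs(D) < tau_train
    value = float("nan") if bottom else N / D
    return value, bottom, gap

print(strict_decode(1.0, 0.5))    # (2.0, False, False): well-conditioned
print(strict_decode(1.0, 1e-5))   # decodable, but inside the gap band
print(strict_decode(1.0, 1e-9))   # (nan, True, False): treated as ⊥
```

A caller can then reject gap-band outputs, fall back to a safe default on ⊥, or tighten tau_infer for more conservative deployment.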
Architecture
zeroproof/
├── scm/ # SCM values + totalized ops (⊥ semantics)
├── autodiff/ # Gradient policies + projective helpers
├── layers/ # PyTorch SCM layers (rational heads, normalization)
├── losses/ # SCM-aware losses (fit/margin/sign/rejection)
├── training/ # Trainer loop + target lifting utilities
├── inference/ # Export & deployment helpers
├── metrics/ # Pole & singularity metrics
└── utils/ # IEEE↔SCM bridge, visualization
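As a rough illustration of what an IEEE↔SCM bridge involves, the sketch below maps non-finite IEEE values to ⊥ on encode and back to NaN on decode. This is a simplification (a signed meadow may give ±∞ more structure than plain ⊥), and the function names are illustrative rather than the utils API:

```python
import math

def ieee_to_scm(x):
    # Non-finite IEEE values (nan, ±inf) get no finite SCM payload; mark as ⊥.
    if math.isfinite(x):
        return x, False
    return float("nan"), True

def scm_to_ieee(payload, is_bottom):
    # ⊥ round-trips to IEEE NaN; the mask remains the authoritative signal.
    return float("nan") if is_bottom else payload

print(ieee_to_scm(2.5))           # (2.5, False)
print(ieee_to_scm(float("inf")))  # (nan, True)
```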
Docs: docs/00_getting_started.md | Conceptual background: concept.tex | Full reference: documentation_full.md | Migration guide: MIGRATION.md
Development
Install development dependencies and run tests:

```bash
pip install -e ".[dev,torch]"
pytest -m "scm and not benchmark" tests -v
```

Run linting and type checking:

```bash
ruff check zeroproof tests
mypy --strict zeroproof
```

Run benchmarks:

```bash
python benchmarks/run_benchmarks.py --output benchmark_results --suite all
```
Acknowledgements
The theoretical foundation of this library builds on the pioneering work of Jan A. Bergstra and John V. Tucker on meadows and common meadows—algebraic structures that totalize the field of rational numbers by making division a total operation. Their formalization of division by zero as an absorptive element provides the mathematical rigor underlying our approach.
We extend these foundations to the neural network setting with weak sign tracking, projective training modes, and integration with modern deep learning frameworks.
Citation
If you use ZeroProofML in your research, please cite:
```bibtex
@software{zeroproofml2025,
  title   = {ZeroProofML: Signed Common Meadows for Stable Machine Learning},
  author  = {{ZeroProof Team}},
  year    = {2025},
  url     = {https://github.com/domezsolt/zeroproofml},
  version = {0.4.0}
}
```
License
MIT License. See LICENSE for details.