eml-cost-torch
PyTorch profiler plugin: per-layer Pfaffian profile of a torch.nn.Module.
Status
Pre-release. Patent pending (covered by Monogate Research patents #11 and #12). Not for redistribution.
Quick start
```python
import torch.nn as nn
from eml_cost_torch import profile

model = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 32),
    nn.GELU(),
    nn.Linear(32, 10),
)

p = profile(model)
print(p.total_layers)               # 5
print(p.total_pfaffian_depth)       # 0: all r=0 (ReLU=0, Linear=0; GELU is non-EML)
print(p.transcendental_layer_count) # 0
print(p.non_eml_layer_count)        # 1: GELU uses erf
print(p.estimated_pfaffian_width)   # 0: no softmax/attention

for layer in p.layers:
    print(f"  {layer.name:8s} {layer.activation:30s} r={layer.pfaffian_r}")
```
What it does
Walks the module graph statically (does NOT execute the model), classifies each layer against an internal registry of activation/operator types, and returns a structured profile.
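A stand-in sketch of what such a static walk looks like, in pure Python with no torch dependency. The registry keys, r values, and helper names here are illustrative assumptions for demonstration only, not the library's internals:

```python
# Illustrative static classification walk: look up each layer's class name
# in a registry, never run a forward pass. REGISTRY contents are a guess.

REGISTRY = {
    "Linear":  {"r": 0, "non_eml": False},
    "ReLU":    {"r": 0, "non_eml": False},
    "Sigmoid": {"r": 1, "non_eml": False},
    "GELU":    {"r": 0, "non_eml": True},  # erf-based, outside the EML class
}

def walk(named_layers):
    """Classify (name, class_name) pairs without executing anything."""
    profile = []
    for name, cls_name in named_layers:
        entry = REGISTRY.get(cls_name, {"r": 0, "non_eml": False})
        profile.append({"name": name, "cls_name": cls_name, **entry})
    return profile

layers = walk([("0", "Linear"), ("1", "ReLU"), ("2", "GELU")])
print(sum(l["r"] for l in layers))        # total Pfaffian depth: 0
print(sum(l["non_eml"] for l in layers))  # non-EML layer count: 1
```

The input pairs mimic what `model.named_modules()` would yield; the real plugin walks the actual module graph.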
Registry covers ~50 standard torch.nn modules:

- Linear / Conv / Norm: r=0 (polynomial)
- ReLU family / Hard sigmoid / Hard swish: r=0
- Sigmoid, Tanh, Softplus, ELU, SiLU/Swish: r=1
- Mish: r=3
- GELU: flagged is_pfaffian_not_eml=True (uses erf, outside EML class)
- Softmax / MultiheadAttention: contributes to estimated_pfaffian_width
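On the width side, one plausible counting rule is to tally the modules that open a parallel chain, i.e. the softmax/attention entries from the list above. The exact rule inside eml_cost_torch is not public; this is a hedged sketch:

```python
# Illustrative width estimate: count parallel-chain contributors
# (softmax and attention, per the registry list). The real counting
# rule in eml_cost_torch may differ.

WIDTH_CONTRIBUTORS = {"Softmax", "MultiheadAttention"}

def estimated_width(cls_names):
    return sum(1 for c in cls_names if c in WIDTH_CONTRIBUTORS)

print(estimated_width(["Linear", "ReLU", "Linear"]))               # 0
print(estimated_width(["MultiheadAttention", "Softmax", "GELU"]))  # 2
```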
Why this matters for architecture search
Per the research validating patent #11, NN training cost correlates with Pfaffian width. This profile gives you a static signal to use as a search heuristic before any training run.
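As a usage sketch, candidates could be ranked by width first and depth second before committing compute to training. The candidate names, profile numbers, and tuple ordering below are made-up for illustration, not values or a policy from the library:

```python
# Rank hypothetical candidate architectures by (width, depth),
# cheapest-to-train first under the width-correlation heuristic.

candidates = {
    "mlp":         {"width": 0,  "depth": 2},
    "cnn":         {"width": 0,  "depth": 4},
    "transformer": {"width": 12, "depth": 6},
}

ranked = sorted(candidates,
                key=lambda k: (candidates[k]["width"], candidates[k]["depth"]))
print(ranked)  # ['mlp', 'cnn', 'transformer']
```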
Library API
```python
from eml_cost_torch import profile, ModelProfile, LayerProfile

p: ModelProfile = profile(model)

# Aggregate fields
p.layers                      # list[LayerProfile]
p.total_layers                # int
p.total_pfaffian_depth        # sum of r over all layers
p.total_eml_depth             # sum of depth
p.transcendental_layer_count  # count of layers with r >= 1
p.non_eml_layer_count         # count of layers using non-EML primitives (e.g., GELU)
p.estimated_pfaffian_width    # parallel-chain count (softmax + attention)
p.total_params                # parameter count

# Per-layer fields
layer = p.layers[0]
layer.name                    # named_modules path
layer.cls_name                # Python class name
layer.activation              # friendly description
layer.pfaffian_r              # chain order
layer.eml_depth               # routing depth
layer.is_pfaffian_not_eml     # True for GELU and similar
layer.n_params                # parameters at this layer
```
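The aggregate fields read as simple folds over the per-layer fields. A stand-in sketch of those relationships, with the fold definitions taken from the field comments above (the dataclass and per-layer values here are illustrative, not the library's classes):

```python
from dataclasses import dataclass

@dataclass
class FakeLayerProfile:  # stand-in mirroring the documented per-layer fields
    name: str
    pfaffian_r: int
    is_pfaffian_not_eml: bool
    n_params: int

def aggregate(layers):
    """Recompute the documented aggregate fields from per-layer ones."""
    return {
        "total_layers": len(layers),
        "total_pfaffian_depth": sum(l.pfaffian_r for l in layers),
        "transcendental_layer_count": sum(l.pfaffian_r >= 1 for l in layers),
        "non_eml_layer_count": sum(l.is_pfaffian_not_eml for l in layers),
        "total_params": sum(l.n_params for l in layers),
    }

layers = [
    FakeLayerProfile("0", 0, False, 8256),  # Linear(128, 64): 128*64 + 64
    FakeLayerProfile("1", 0, False, 0),     # ReLU
    FakeLayerProfile("2", 0, True, 0),      # GELU (erf-based)
]
print(aggregate(layers))
```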
License
PROPRIETARY-PRE-RELEASE. See LICENSE.
File details
Details for the file eml_cost_torch-0.1.0a0.tar.gz (source distribution).

File metadata
- Download URL: eml_cost_torch-0.1.0a0.tar.gz
- Size: 7.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.3

File hashes

| Algorithm   | Hash digest |
|-------------|-------------|
| SHA256      | 4f7cf101bfc111b27aec88ee4ab65226e0161dea16cc42fbd05e6c99f2705fd9 |
| MD5         | 4e9a9bc155c68ef05b287ab599c1c3e4 |
| BLAKE2b-256 | 2aba4f9f1114ff968b9d665c75e2daafea2aafd5a36c21a23165e2cfe4bd2ee8 |
File details
Details for the file eml_cost_torch-0.1.0a0-py3-none-any.whl (built distribution).

File metadata
- Download URL: eml_cost_torch-0.1.0a0-py3-none-any.whl
- Size: 6.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.3

File hashes

| Algorithm   | Hash digest |
|-------------|-------------|
| SHA256      | c735bc53974a5715b2b459832abdc1166445558743faef4a143b921dc5e2ac35 |
| MD5         | aefaafeaea52ec68198b58e27953de7a |
| BLAKE2b-256 | 349332744132b5b9c0be7b1e9ac1330b520e21400d2b71a7b907beb1f7245cc7 |