Multi-runtime GPU + remote inference as a supervised actor system on the atomr actor runtime.

atomr-infer

A native Rust multi-runtime inference layer built as a supervised actor topology on top of atomr. atomr-infer gives you a single mental model — one Deployment value object, one routing CRDT, one supervision tree — that scales from a single OpenAI-key script to a heterogeneous cluster blending owned GPU hardware with managed APIs. The same actor_ref.tell(msg) lands a request on an H100 two racks away or in another company's data center.

use atomr_infer::prelude::*;

// Same value object describes a vLLM-on-4×H100 replica or a Gemini
// Vertex deployment. The `runtime` field is the only thing that
// changes — and it's auto-inferred from the model name when omitted.
let dep = Deployment {
    name: "gpt-4o-mini".into(),
    model: "gpt-4o-mini".into(),
    runtime: None,
    runtime_config: None,
    gpus: None,
    replicas: 1,
    serving: Serving::default(),
    budget: None,
    idempotent: true,
};
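The comment above says the runtime is auto-inferred from the model name when omitted. One way to picture that rule is a prefix match — the sketch below is illustrative only, and `RuntimeKind` / `infer_runtime` are hypothetical names, not the crate's actual table:

```rust
// Hypothetical sketch of runtime auto-inference as a prefix match on the
// model name. The variants and prefixes are illustrative.
#[derive(Debug, PartialEq)]
enum RuntimeKind { OpenAi, Anthropic, Gemini, Vllm }

fn infer_runtime(model: &str) -> Option<RuntimeKind> {
    if model.starts_with("gpt-") {
        Some(RuntimeKind::OpenAi)
    } else if model.starts_with("claude-") {
        Some(RuntimeKind::Anthropic)
    } else if model.starts_with("gemini-") {
        Some(RuntimeKind::Gemini)
    } else if model.contains('/') {
        Some(RuntimeKind::Vllm) // e.g. a Hugging Face repo id for a local runtime
    } else {
        None
    }
}
```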

Why multi-runtime inference, in Rust, now

Production AI rarely runs only on owned hardware. Frontier models, burst capacity, and compliance edge cases all push work onto managed APIs. Bolting a provider SDK onto a separate retry / rate-limit / observability stack from your local GPU pool fragments the system — and the cracks are exactly where 3 a.m. pages come from.

Heterogeneous workloads are the norm, not the exception. vLLM on a DGX node, a Candle CPU model in a sidecar, an OpenAI call for the long tail of hard prompts, an Anthropic fallback when OpenAI rate-limits — that's one application, but today it's three SDKs, three retry policies, three observability stacks. atomr-infer treats every runtime as just-another-ModelRunner. The gateway, request actor, and routing CRDT don't know — and don't care — whether a request lands on a local GPU or a remote API.
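The "just-another-ModelRunner" idea is ordinary trait dispatch. The actual trait in atomr-infer-core is async and richer; this minimal sketch (with a hypothetical `EchoRunner` and `dispatch`) only shows that callers never branch on the backend:

```rust
// Illustrative shape of a runtime-agnostic runner trait — not the real
// atomr-infer-core signature, which is async and carries typed errors.
trait ModelRunner {
    fn name(&self) -> &str;
    fn infer(&self, prompt: &str) -> Result<String, String>;
}

struct EchoRunner; // stand-in for a local GPU or remote API backend

impl ModelRunner for EchoRunner {
    fn name(&self) -> &str { "echo" }
    fn infer(&self, prompt: &str) -> Result<String, String> {
        Ok(prompt.to_uppercase())
    }
}

fn dispatch(runner: &dyn ModelRunner, prompt: &str) -> Result<String, String> {
    runner.infer(prompt) // the caller never inspects the concrete backend
}
```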

Cost, latency, and reliability are coupled. A pipeline that classifies cheaply on a local model and escalates to GPT-4o for hard cases is also the pipeline that needs to fall back to Anthropic when OpenAI is saturated and shed traffic when the hourly budget hits. Threading those concerns by hand produces brittle glue. atomr-infer encodes them as composable actors — InferenceCascade, RateLimiterActor (CRDT-backed), CircuitBreakerActor, Budget { on_exceeded: Reject } — under one supervision tree with one trace and one backpressure story.
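The cascade-plus-budget pattern those actors encode can be sketched as a pure routing decision — `Tier` and `route` are hypothetical names, with tiers ordered cheapest-first:

```rust
// Illustrative sketch of cascade-with-budget routing, not atomr-infer's API.
struct Tier {
    name: &'static str,
    cost_per_call: f64,
}

/// Pick the first tier that is both affordable under the hourly budget
/// and currently healthy (circuit closed, not rate-limited). `None`
/// models the `Budget { on_exceeded: Reject }` / all-tiers-down case.
fn route<'a>(
    tiers: &'a [Tier],
    spent: f64,
    hourly_budget: f64,
    healthy: &dyn Fn(&str) -> bool,
) -> Option<&'a str> {
    tiers
        .iter()
        .find(|t| spent + t.cost_per_call <= hourly_budget && healthy(t.name))
        .map(|t| t.name)
}
```

With tiers `[local, openai, anthropic]` and OpenAI's circuit open, routing falls through to Anthropic; as spend approaches the budget, expensive tiers drop out of consideration first.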

Granular efficiency. Rust gives us deterministic resource use, zero-cost abstractions, and ownership-as-concurrency-safety. Per-actor footprint stays small; per-message cost stays low. The remote-network tier is HTTP/2 + SSE + connection pooling with structured retry; the local-GPU tier rides on top of atomr-accel's two-tier device supervision. A cargo build --features remote-only produces a binary with zero cudarc, zero atomr-accel, zero candle, zero pyo3 in the dependency graph — the layered crate split makes the invariant load-bearing, not aspirational.
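"Structured retry" for remote calls typically means capped exponential backoff with jitter; a minimal sketch, where the function name and constants are illustrative rather than atomr-infer's defaults:

```rust
// Capped exponential backoff with caller-supplied "full jitter".
fn backoff_ms(attempt: u32, base_ms: u64, cap_ms: u64, jitter: f64) -> u64 {
    // Exponential growth, saturating and capped so late attempts neither
    // overflow nor wait unboundedly long.
    let exp = base_ms.saturating_mul(1u64 << attempt.min(16)).min(cap_ms);
    // jitter in (0.0, 1.0] comes from the caller's RNG; passing it in
    // keeps the function deterministic and testable.
    (exp as f64 * jitter) as u64
}
```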

What's in the box

Crate What it does
atomr-infer Umbrella facade re-exporting the public surface, feature-flag-driven
atomr-infer-core Deployment value object, ModelRunner trait, typed InferenceError, batch primitives
atomr-infer-runtime Gateway, request actor, dp-coordinator, engine-core, two-tier worker, placement, deployment manager, metrics
atomr-infer-remote-core Distributed rate limiter (CRDT), circuit breaker, retry/backoff, SSE parser, session lifecycle
atomr-infer-runtime-{openai,anthropic,gemini,litellm} Per-provider ModelRunner against api.openai.com, api.anthropic.com, Vertex AI / AI Studio, and the LiteLLM proxy
atomr-infer-runtime-{vllm,tensorrt,ort,mistralrs} Per-backend ModelRunner for local Rust-native and FFI runtimes; feature-gated so absent system libs don't break the workspace
atomr-infer-pipeline atomr-streams integration plus DynamicBatchingServer / InferenceCascade / ModelReplicaPool / FairShareScheduler / ModelHotSwapServer / SpeculativeDecoder blueprints
atomr-infer-testkit MockRunner + wiremock-backed provider mocks (inject_429, inject_5xx, …)
atomr-infer-cli atomr-infer serve --config <toml>
atomr-infer-py-bindings PyO3 bindings for Cluster / Deployment
atomr-infer-python-bridge PythonGpuBridge + python-pinned dispatcher for vLLM-style runners

Plus a Python facade — pip install atomr-infer — that exposes the same Cluster.connect(...).deploy(deployment) shape from Python.
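The circuit breaker listed under atomr-infer-remote-core follows the classic three-state pattern. A minimal transition function sketches what the actor tracks per remote endpoint — state names and threshold handling are illustrative, and the real actor also runs the Open → HalfOpen recovery timer that a pure function can't show:

```rust
// Illustrative three-state circuit breaker transition, not the actor's API.
#[derive(Debug, PartialEq, Clone, Copy)]
enum Breaker {
    Closed { failures: u32 },
    Open,
    HalfOpen,
}

fn on_result(state: Breaker, ok: bool, threshold: u32) -> Breaker {
    match (state, ok) {
        (Breaker::Closed { .. }, true) => Breaker::Closed { failures: 0 },
        (Breaker::Closed { failures }, false) if failures + 1 >= threshold => Breaker::Open,
        (Breaker::Closed { failures }, false) => Breaker::Closed { failures: failures + 1 },
        (Breaker::HalfOpen, true) => Breaker::Closed { failures: 0 }, // probe succeeded
        (Breaker::HalfOpen, false) => Breaker::Open,                  // probe failed
        (Breaker::Open, _) => Breaker::Open, // requests are shed while open
    }
}
```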

Quick start (Rust)

The umbrella crate is published on crates.io as atomr-infer:

[dependencies]
atomr-infer = { version = "0.3", features = ["openai", "anthropic", "pipeline"] }

Or pull in subsystem crates directly — atomr-infer-core, atomr-infer-runtime, atomr-infer-remote-core, and the four atomr-infer-runtime-{openai,anthropic,gemini,litellm} providers are all on crates.io.

use atomr_infer::prelude::*;

# async fn run() -> Result<(), Box<dyn std::error::Error>> {
let cluster = Cluster::create("inference", Config::empty()).await?;
cluster.deploy(Deployment {
    name: "gpt-4o-mini".into(),
    model: "gpt-4o-mini".into(),
    replicas: 1,
    ..Default::default()
}).await?;
cluster.serve("0.0.0.0:8080").await?;
# Ok(()) }

# OpenAI-compatible gateway over real (or mocked) providers.
cargo run -p atomr-infer-cli --features all-remote -- serve --config demo.toml

# End-to-end demo (happy path / 429 retry / circuit-open) without
# spending a cent — wiremock under the hood.
cargo run --bin remote_only_demo

# Pure-remote binary, zero GPU deps in the graph.
cargo build -p atomr-infer --no-default-features --features remote-only

Quick start (Python)

python -m venv .venv && source .venv/bin/activate
pip install atomr-infer

from atomr_infer import Cluster, Deployment

cluster = Cluster.connect("inproc://app")
cluster.deploy(Deployment(name="gpt-4o-mini", model="gpt-4o-mini", replicas=1))

The 0.3 surface is intentionally narrow — Deployment value objects and Cluster.connect(...).deploy(...). Decorators and direct ActorRef escape hatches land as the underlying Rust surface stabilises.

Building from source

# Rust
cargo build --workspace
cargo test --workspace
cargo build -p atomr-infer --no-default-features --features remote-only  # zero-GPU build

# Python bindings (requires maturin + a Python dev toolchain)
maturin develop --release
pytest python/tests -v

Crate-layer picker

The workspace splits into layers so a remote-only egress server pulls no GPU dependencies whatsoever, while a heterogeneous cluster pulls exactly the runtimes it serves. Three preset shapes:

Preset What you get What you skip
remote-only OpenAI + Anthropic + Gemini + LiteLLM + pipeline + rate-limiting / circuit-breaker / cost tracking All GPU code
default-prod vLLM + TensorRT + ORT + OpenAI + Anthropic + pipeline Other GPU runtimes; LiteLLM; Gemini
all-runtimes Everything Nothing

Detailed feature matrix: docs/feature-matrix.md.

Layout

crates/                       Rust workspace
  atomr-infer-core/           foundation: traits, types, no actor / GPU / HTTP deps
  atomr-infer-runtime/        gateway, request, dp-coordinator, two-tier worker
  atomr-infer-remote-core/    rate limiter (CRDT), circuit breaker, retry, SSE
  atomr-infer-runtime-*/      per-provider / per-backend ModelRunner
  atomr-infer-pipeline/       atomr-streams + batching/cascade/replica blueprints
  atomr-infer-testkit/        MockRunner + wiremock-backed provider mocks
  atomr-infer-cli/            `atomr-infer serve --config <toml>`
  atomr-infer-py-bindings/    PyO3 bindings
  atomr-infer/                rollup
ai-skills/                    Claude / Cursor / Codex / Gemini SKILL.md bundle
docs/                         Architecture (RFC v4), feature matrix, deployment guide
examples/remote_only_demo/    end-to-end happy-path / 429 / circuit-open demo
xtask/                        Cargo xtask (audit, bump, verify, release-checklist)

AI-assisted development

If you're using Claude Code, Cursor, or another AI coding assistant on a project that depends on atomr-infer, install our ai-skills bundle — seven skills covering quickstart, choosing a runtime, wiring remote providers, composing pipelines, deployment, typed-error troubleshooting, and extending with a new backend.

/plugin marketplace add rustakka/atomr-infer
/plugin install atomr-infer-ai-skills@atomr-infer

Each SKILL.md is a thin router into the canonical docs. Other harnesses (Cursor, Codex CLI, Gemini CLI, Aider, etc.) have install instructions in ai-skills/README.md.

Companion bundles for the broader stack:

  • atomr ai-skills — actor design, supervision, persistence, clustering, Python bindings.
  • atomr-accel ai-skills — DeviceActor, kernel selection, two-tier GPU supervision, backend choice.

License

Apache-2.0. See LICENSE.
