rakka-inference
Multi-runtime GPU + remote inference as a supervised actor system on the rakka actor runtime.
One supervised actor topology for every place a model can run. Local GPU runtimes (vLLM, TensorRT, ONNX Runtime, Candle, cudarc, mistral.rs) and managed APIs (OpenAI, Anthropic, Gemini, LiteLLM) sit under the same routing CRDT, the same supervision tree, the same backpressure story. A request doesn't know — and doesn't need to — whether it landed on an H100 two racks away or in another company's data center.
```toml
[dependencies]
inference = { version = "0.2", features = ["openai", "anthropic", "candle", "pipeline"] }
```
```rust
use inference::prelude::*;

// Same value object describes a vLLM-on-4×H100 replica or a Gemini Vertex
// deployment. The `runtime` field is the only thing that changes —
// and it's auto-inferred from the model name when omitted.
let dep = Deployment {
    name: "gpt-4o-mini".into(),
    model: "gpt-4o-mini".into(),
    runtime: None,
    runtime_config: None,
    gpus: None,
    replicas: 1,
    serving: Serving::default(),
    budget: None,
    idempotent: true,
};
```
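The auto-inference is just prefix matching on the model name. A minimal sketch of the idea (illustrative only; the crate's real mapping table is richer, and the `gemini-*` rule here is an assumption):

```rust
// Sketch of model-name → runtime inference. `gpt-*` → openai and
// `claude-*` → anthropic come from the docs; `gemini-*` is an assumed rule.
fn infer_runtime(model: &str) -> Option<&'static str> {
    if model.starts_with("gpt-") {
        Some("openai")
    } else if model.starts_with("claude-") {
        Some("anthropic")
    } else if model.starts_with("gemini-") {
        Some("gemini")
    } else {
        None // caller must set `runtime` explicitly
    }
}
```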
Built on rakka for actor supervision, clustering, and
CRDTs, and on rakka-accel for two-tier GPU
supervision. Cost, latency, and reliability stop being three pipelines
and become one.
Why
Production AI rarely runs only on owned hardware. Frontier models, burst capacity, and compliance edge cases all push work onto managed APIs. Bolting providers onto a separate retry / rate-limit / observability stack from your local GPU pool fragments the system — and the cracks are exactly where 3 a.m. pages come from.
| You'd otherwise hand-roll | rakka-inference gives you |
|---|---|
| One routing layer for local pools, another for the API SDK | Single routing CRDT — gpt-4o and llama-3.1-70b resolve through the same path |
| Per-process token buckets that 429 on cluster scale-out | RateLimiterActor over rakka_distributed_data::GCounter — one bucket, all nodes |
| Hand-written retry / breaker / backoff per provider | CircuitBreakerActor + jittered retry + content-filter triage, one strategy |
| Sticky CUDA-context recovery glued to async tasks | rakka_accel::error::device_supervisor_strategy() adopted unchanged |
| Cascade graphs duct-taped from threadpools and channels | InferenceCascade / DynamicBatchingServer / ModelReplicaPool actors |
| Credential rotation that drops in-flight traffic | RemoteSessionActor::rebuild drains old, routes new — zero dropped requests |
| A no-GPU egress server that still pulls cudarc transitively | --features remote-only ⇒ cudarc, rakka-accel, candle not in the graph |
| Cost guardrails as Slack alerts after the bill arrives | Budget { max_spend_per_hour_usd, on_exceeded: Reject } enforced at the actor |
Every concern that's normally a separate library or a separate incident is folded into one supervised graph with typed messages.
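For intuition, the "jittered retry" in the table is the standard exponential-backoff-with-jitter shape. A self-contained sketch, not the crate's API; the jitter factor is injected so the function stays deterministic:

```rust
use std::time::Duration;

// Exponential backoff with jitter: delay_n = min(base * 2^n, cap) * jitter,
// where `jitter` is a random factor in [0, 1] supplied by the caller.
// Illustrative sketch of the strategy named above, not rakka-inference's API.
fn backoff_delay(attempt: u32, base_ms: u64, cap_ms: u64, jitter: f64) -> Duration {
    let exp = base_ms.saturating_mul(1u64 << attempt.min(16)).min(cap_ms);
    Duration::from_millis((exp as f64 * jitter) as u64)
}
```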
30-second tour
```sh
# Stand up an OpenAI-compatible gateway over real (or mocked) providers.
cargo run -p inference-cli --features all-remote -- serve --config demo.toml

# End-to-end demo (happy path / 429 retry / circuit-open) without
# spending a cent — wiremock under the hood.
cargo run --bin remote_only_demo

# Pure-remote binary, zero GPU deps in the graph.
cargo build -p inference --no-default-features --features remote-only
```
Architecture
The full design lives in
docs/rustakka-inference-architecture-v4.md
(1,459 lines, RFC v4). Short version:
```text
[HTTP clients]
      │
      ▼
ApiGatewayActor                         runtime-agnostic
      │ spawns one per request          (inference-runtime)
      ▼
RequestActor
      │ ask(routing target)
      ▼
DpCoordinatorActor                      cluster-singleton
      │ tell(AddRequest)
      ▼
┌───────────────┴───────────────┐
▼                               ▼
EngineCoreActor (LOCAL)         RemoteEngineCoreActor (REMOTE)
┌──────────────────────┐        ┌────────────────────────────┐
│ scheduler/batcher    │        │ request queue (priority)   │
│ kv_cache_mgr (LLM)   │        │ rate-limit-aware dispatch  │
│ ModelExecutorActor   │        │ ┌─────────────────────────┐│
│  ├─ WorkerActor      │        │ │ WorkerPool              ││
│  │   └─ ContextActor │        │ │  ├─ RemoteWorkerActor   ││
│  │       ├─ ModelRunner       │ │  └─ RemoteWorkerActor   ││
│  │       └─ rakka_accel::*    │ └─────────────────────────┘│
│  └─ ...              │        │ uses:                      │
└──────────────────────┘        │ RateLimiterActor (CRDT)    │
                                │ CircuitBreakerActor        │
                                │ RemoteSessionActor         │
                                └────────────────────────────┘
```
The local-GPU tier rides on top of rakka-accel's
substrate: DeviceActor, ContextActor, GpuRef<T>, GpuDispatcher,
PerActorAllocator, PlacementActor, BlasActor/CudnnActor/etc.
We don't reinvent two-tier supervision; we adopt
rakka_accel::error::device_supervisor_strategy() and add the
inference-specific Box<dyn ModelRunner> slot on top.
The remote-network tier is HTTP/2 + SSE + connection pooling, with
distributed rate limiting via rakka_distributed_data::GCounter and
circuit breaking + retry/backoff inside
inference-remote-core.
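The breaker in inference-remote-core follows the usual three-state pattern: closed while calls succeed, open after consecutive failures, half-open to probe after a cooldown. A self-contained sketch of that state machine (illustrative; `CircuitBreakerActor`'s real message protocol differs):

```rust
// Minimal circuit-breaker state machine, in the spirit of CircuitBreakerActor.
// Closed → Open after `threshold` consecutive failures; Open → HalfOpen once
// `cooldown_ms` elapses; one HalfOpen success closes it again.
#[derive(Debug, PartialEq)]
enum State {
    Closed,
    Open { opened_at_ms: u64 },
    HalfOpen,
}

struct Breaker {
    state: State,
    consecutive_failures: u32,
    threshold: u32,
    cooldown_ms: u64,
}

impl Breaker {
    fn new(threshold: u32, cooldown_ms: u64) -> Self {
        Breaker { state: State::Closed, consecutive_failures: 0, threshold, cooldown_ms }
    }

    /// Gate a call at `now_ms`; returns whether the request may proceed.
    fn allow(&mut self, now_ms: u64) -> bool {
        if let State::Open { opened_at_ms } = self.state {
            if now_ms - opened_at_ms >= self.cooldown_ms {
                self.state = State::HalfOpen; // let one probe through
                return true;
            }
            return false;
        }
        true
    }

    fn on_success(&mut self) {
        self.consecutive_failures = 0;
        self.state = State::Closed;
    }

    fn on_failure(&mut self, now_ms: u64) {
        self.consecutive_failures += 1;
        if self.consecutive_failures >= self.threshold || self.state == State::HalfOpen {
            self.state = State::Open { opened_at_ms: now_ms };
        }
    }
}
```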
Crate layout — pick what you need
The workspace is 18 crates plus xtask and the demo. Each layer is
optional via Cargo features so you only compile what you use. Three
recommended preset shapes:
| Preset | What you get | What you skip |
|---|---|---|
| remote-only | OpenAI + Anthropic + Gemini + LiteLLM + pipeline + rate-limiting / circuit-breaker / cost tracking | All GPU code (cudarc, rakka-accel, candle, pyo3) |
| default-prod | vLLM + TensorRT + ORT + OpenAI + Anthropic + pipeline | Other GPU runtimes; LiteLLM; Gemini |
| all-runtimes | Everything | — |
Detailed feature matrix:
docs/feature-matrix.md.
```text
inference                        ← rollup; one dep, feature-flag-driven
│
├── inference-core               ← traits, types, no actor / GPU / HTTP deps
│
├── inference-runtime            ← gateway, request, dp-coordinator,
│   [feature: local-gpu → rakka-accel]        engine-core, worker (two-tier),
│                                             placement, deployment-mgr, metrics
│
├── inference-remote-core        ← rate limiter (GCounter CRDT),
│                                  circuit breaker, retry/backoff,
│                                  SSE parser, session lifecycle
│
├── inference-runtime-{openai, anthropic, gemini,   ← per-provider ModelRunner + cost table
│                      litellm}
│
├── inference-runtime-{vllm, tensorrt, ort, candle, ← per-backend ModelRunner; feature-gated
│                      cudarc, mistralrs}             so absent system libs don't break the
│                                                     workspace build
│
├── inference-python-bridge      ← PythonGpuBridge + python-pinned dispatcher
│   [feature: python → pyo3]       (will lift to rakka-accel F4 — see TODO)
│
├── inference-pipeline           ← rakka-streams + re-export of
│   [feature: cuda-patterns → rakka-accel-patterns]   DynamicBatchingServer / InferenceCascade /
│                                                     ModelReplicaPool / FairShareScheduler /
│                                                     ModelHotSwapServer / SpeculativeDecoder
│
├── inference-testkit            ← MockRunner + wiremock-backed provider
│                                  mocks (inject_429, inject_5xx, ...)
│
├── inference-cli                ← `rakka serve --config <toml>`
│
└── inference-py-bindings        ← PyO3 bindings for Cluster / Deployment
    [feature: python]
```
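The core policy behind `DynamicBatchingServer` in inference-pipeline (flush a pending batch on size or age, whichever comes first) can be sketched in a few lines. Illustrative only; the real actor wraps this policy in mailbox plumbing:

```rust
// Dynamic-batching policy sketch: flush when the batch reaches `max_batch`,
// or when a timer tick finds the oldest request older than `max_wait_ms`.
struct Batcher {
    pending: Vec<(u64, String)>, // (enqueue time in ms, request id)
    max_batch: usize,
    max_wait_ms: u64,
}

impl Batcher {
    fn new(max_batch: usize, max_wait_ms: u64) -> Self {
        Batcher { pending: Vec::new(), max_batch, max_wait_ms }
    }

    /// Enqueue a request; returns a full batch if one is ready to run.
    fn push(&mut self, now_ms: u64, id: &str) -> Option<Vec<String>> {
        self.pending.push((now_ms, id.to_string()));
        if self.pending.len() >= self.max_batch { self.flush() } else { None }
    }

    /// Timer tick: flush if the oldest request has waited too long.
    fn tick(&mut self, now_ms: u64) -> Option<Vec<String>> {
        let due = matches!(self.pending.first(),
                           Some(&(t, _)) if now_ms - t >= self.max_wait_ms);
        if due { self.flush() } else { None }
    }

    fn flush(&mut self) -> Option<Vec<String>> {
        if self.pending.is_empty() {
            return None;
        }
        Some(self.pending.drain(..).map(|(_, id)| id).collect())
    }
}
```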
How to add only the runtimes you need
```toml
# Just OpenAI + Anthropic, no GPU code, no Python:
inference = { workspace = true, features = ["openai", "anthropic", "pipeline"] }

# Local Candle + remote OpenAI fallback:
inference = { workspace = true, features = ["candle", "openai", "pipeline"] }
# (Pulls rakka-accel + cudarc + candle-* automatically via the `candle` feature.)

# Everything, including the testkit:
inference = { workspace = true, features = ["all-runtimes", "testkit"] }
```
The rollup's job is exactly this: make Cargo.toml declare intent
and let the feature graph compute deps.
What you don't have to think about
- Two-tier GPU supervision. `local-gpu` wires `WorkerActor` / `ContextActor` to `rakka_accel::error::device_supervisor_strategy()`. Sticky-error CUDA contexts get `Restart`; OOM gets `Resume`; unrecoverable failures `Stop`. No panic-string parsing in your code.
- Distributed rate limits. `RateLimiterActor` shares its token-spent log across cluster nodes through `rakka_distributed_data::GCounter`. Two members calling OpenAI on the same API key collectively respect the bucket — no surprise 429 storms on scale-out.
- Typed circuit-breaker propagation. When the breaker opens, the caller sees `InferenceError::CircuitOpen { provider, opened_at_unix_ms, retry_at_unix_ms }`. Fall back, surface a 429, or queue — without knowing whether the bottleneck was GPU memory or a remote outage.
- Pipelines from blueprints, not threadpools. Enable `cuda-patterns` and the `inference::cuda_patterns::{DynamicBatchingServer, InferenceCascade, ModelReplicaPool, FairShareScheduler, ModelHotSwapServer, SpeculativeDecoder, MoeRouter}` blueprints are one import away. Plug a closure into `ModelRunner::execute` and you've composed §9 of the architecture doc.
- Compile-time dependency budgets. `cargo build -p inference --features remote-only` produces a binary with zero `cudarc`, zero `rakka-accel`, zero `candle`, zero `pyo3` in the graph. Layered crates make the invariant load-bearing, not aspirational.
- Hot credential rotation. `RemoteSessionActor::rebuild` drains in-flight requests on the old credential and routes new ones on the rotated value. Zero dropped traffic.
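The shared bucket works because a grow-only counter merges cleanly across nodes: each node increments only its own slot, and merge takes the per-node maximum, so replicas converge on the cluster-wide total. A minimal sketch of the GCounter idea (the real `rakka_distributed_data::GCounter` API may differ):

```rust
use std::collections::HashMap;

// Minimal grow-only counter (GCounter), illustrative only.
// Each node increments its own slot; merge takes the per-node max, so
// concurrent replicas converge and `value()` is the cluster-wide total.
#[derive(Clone, Default)]
struct GCounter {
    counts: HashMap<String, u64>,
}

impl GCounter {
    fn increment(&mut self, node: &str, n: u64) {
        *self.counts.entry(node.to_string()).or_insert(0) += n;
    }

    fn value(&self) -> u64 {
        self.counts.values().sum()
    }

    fn merge(&mut self, other: &GCounter) {
        for (node, &n) in &other.counts {
            let slot = self.counts.entry(node.clone()).or_insert(0);
            *slot = (*slot).max(n);
        }
    }
}
```

Two nodes each record tokens spent against the same API key; after a gossip round (mutual `merge`), both see the combined total and can compare it against the bucket limit.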
Developer experience
Six layers, from surface to depth
- `Deployment` value object. Most users never go deeper. `runtime` is auto-inferred from the model name when omitted (`gpt-*` → openai, `claude-*` → anthropic, …).
- Per-runtime configs. `OpenAiConfig`, `AnthropicConfig`, `GeminiConfig` (Vertex + AI Studio), `LiteLlmConfig`, `CandleConfig`, `VllmConfig`, etc. for explicit overrides.
- `<config>.toml` project files. `rakka serve --config foo.toml` reads the §11.3 schema and applies every `[[deployment]]`.
- Python decorators. `@inference_actor` for orchestration actors that compose deployments without touching a GPU directly. Skeleton in `inference-py-bindings`.
- Escape hatches. `cluster.deployment("gpt-4o").rate_limiter()`, `.circuit_breaker()`, `.workers()` — direct `ActorRef`s for incident response (`force_open`, `rebuild_session`, etc.).
- Raw rakka actors. When you need it, you have the full actor system underneath. Unprivileged.
Footgun-resistant by design
- Secrets are typed. `inference_core::SecretString` (re-export of `secrecy::SecretString`) — won't `Debug`, won't `Display`, never appears in logs.
- Rate-limit validation at deploy time. Catches a deployment claiming `rpm = 100_000` against a free-tier API key with a typed error before the first user request hits.
- Network egress checked at deploy time. The placement actor pings the provider from each chosen node before flipping the deployment to `Serving`.
- Hot-swappable credentials. Updating the secret source triggers `RemoteSessionActor::rebuild` on the next pulse; in-flight requests drain on the old credential, new ones use the rotated value. Zero dropped traffic.
- Cost guardrails. `Budget { max_spend_per_hour_usd, on_exceeded: Reject }` on a `Deployment` makes runaway provider spend physically impossible.
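The guardrail semantics are simple enough to sketch: track spend in the current hour window and reject any request whose cost would push past the cap (illustrative; not the crate's enforcement code):

```rust
// Sketch of the Budget { max_spend_per_hour_usd, on_exceeded: Reject }
// semantics: a per-deployment hourly spend window enforced before dispatch.
struct HourlyBudget {
    max_spend_usd: f64,
    window_start_ms: u64,
    spent_usd: f64,
}

impl HourlyBudget {
    fn new(max_spend_usd: f64) -> Self {
        HourlyBudget { max_spend_usd, window_start_ms: 0, spent_usd: 0.0 }
    }

    /// Charge `cost_usd` at `now_ms`; `false` means the request is rejected.
    fn try_charge(&mut self, now_ms: u64, cost_usd: f64) -> bool {
        if now_ms - self.window_start_ms >= 3_600_000 {
            self.window_start_ms = now_ms; // roll to a fresh hour window
            self.spent_usd = 0.0;
        }
        if self.spent_usd + cost_usd > self.max_spend_usd {
            return false; // on_exceeded: Reject
        }
        self.spent_usd += cost_usd;
        true
    }
}
```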
Verification
Every PR runs:
```sh
cargo build --workspace
cargo build -p inference --features remote-only          # zero GPU deps
cargo build -p inference --features cuda,cuda-patterns   # local + patterns
cargo build -p inference --features all-runtimes
cargo test --workspace
cargo run --bin remote_only_demo
```
The demo asserts the §13 Phase-1 + Phase-2c exit criteria end-to-end
against a wiremock-driven OpenAI mock: happy-path streaming, 429
retry-after, and circuit-breaker open after consecutive 5xx.
Status
| Layer | Status |
|---|---|
| Foundation (inference-core) | ✅ stable surface; serde round-trips for every RuntimeConfig variant |
| Runtime-agnostic actors | ✅ gateway, request, dp-coordinator, engine-core, worker, placement, manager, metrics |
| Remote infrastructure | ✅ rate limiter (CRDT), strict variant (singleton), circuit breaker, retry, SSE, session |
| OpenAI / Anthropic / Gemini / LiteLLM | ✅ ModelRunner + wire types + error classification + pricing tables |
| Local Rust-native runtimes | 🟡 trait satisfied; forward-pass bodies are stubs pinned to the doc's §13 Phase 2b roadmap |
| vLLM / TensorRT FFI | 🟡 stubs that compile against the trait; full bodies on §13 Phase 2a/2b |
| Pipeline (rakka-streams + cuda-patterns) | ✅ re-export shim + reference hybrid graph |
| CLI (rakka serve) | ✅ TOML config → ActorSystem → gateway; cost-report/rotate-credentials are stubs |
| Python bindings | 🟡 PyO3 skeleton (Cluster, Deployment); decorator surface deferred |
AI-assisted development
If you're using Claude Code, Cursor, or another AI coding assistant on
a project that depends on rakka-inference, install our
ai-skills bundle — seven skills covering quickstart,
choosing a runtime, wiring remote providers, composing pipelines,
deployment, typed-error troubleshooting, and extending with a new
backend.
```text
/plugin marketplace add rustakka/rakka-inference
/plugin install rakka-inference-ai-skills@rakka-inference
```
Each SKILL.md is a thin router into the canonical docs (this README,
the per-crate READMEs, the architecture RFC) so the skills stay in
sync with the code instead of restating API surfaces that belong in
rustdoc. Other harnesses (Cursor, Codex CLI, Gemini CLI, Aider, etc.)
have install instructions in ai-skills/README.md.
Companion bundles for the broader stack:
- rakka ai-skills — actor design, supervision, persistence, clustering, Python bindings.
- rakka-accel ai-skills — DeviceActor, kernel selection, two-tier GPU supervision, backend choice.
Install all three when you're building a service that uses rakka primitives, rakka-accel GPU acceleration, and rakka-inference runtimes.
Release management
Releases are fully automated. Land a feat: / fix: commit on main
and the version-bump workflow tags vX.Y.Z; the release workflow
fires on the tag, runs cargo xtask verify, builds binaries for five
platforms, generates release notes from git log, and publishes the
allowlisted crates to crates.io in dependency order with idempotent
retry.
| Task | How |
|---|---|
| Bump + tag based on Conventional Commits | Auto on push to main via .github/workflows/version-bump.yml. |
| Force a specific version | Release-As: x.y.z in commit footer. |
| Run the full release pipeline manually | Actions → Release → Run workflow. |
| Dry-run before tagging | Actions → Release → Run workflow → dry_run: true. |
| Inspect publishable vs gated crates | cargo xtask release-checklist. |
| Audit anti-pattern regressions | cargo xtask audit / cargo xtask audit --check. |
| Run the same checks CI runs | cargo xtask verify. |
Full operator runbook: RELEASING.md.
Contributor guide: CONTRIBUTING.md.
License
Apache-2.0. See LICENSE once it lands; the workspace
inherits the rakka project license.
Project details
Release history Release notifications | RSS feed
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distributions
Filter files by name, interpreter, ABI, and platform.
If you're not sure about the file name format, learn more about wheel file names.
Copy a direct link to the current filters
File details
Details for the file rakka_inference-0.2.1.tar.gz.
File metadata
- Download URL: rakka_inference-0.2.1.tar.gz
- Upload date:
- Size: 74.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest | |
|---|---|---|
| SHA256 |
986c84b415be11668a74532b5b16d4b2be59b1ebd62a38a760d2e408090a7530
|
|
| MD5 |
e04608a861510689390f038f56c7f514
|
|
| BLAKE2b-256 |
71076af37472b7dbc4e08e031222c57ff5ab1e9bb8c6c83d70aedd5074ecfebe
|
Provenance
The following attestation bundles were made for rakka_inference-0.2.1.tar.gz:
Publisher:
release.yml on rustakka/rakka-inference
-
Statement:
-
Statement type:
https://in-toto.io/Statement/v1 -
Predicate type:
https://docs.pypi.org/attestations/publish/v1 -
Subject name:
rakka_inference-0.2.1.tar.gz -
Subject digest:
986c84b415be11668a74532b5b16d4b2be59b1ebd62a38a760d2e408090a7530 - Sigstore transparency entry: 1436188830
- Sigstore integration time:
-
Permalink:
rustakka/rakka-inference@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Branch / Tag:
refs/tags/v0.2.3 - Owner: https://github.com/rustakka
-
Access:
public
-
Token Issuer:
https://token.actions.githubusercontent.com -
Runner Environment:
github-hosted -
Publication workflow:
release.yml@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Trigger Event:
push
-
Statement type:
File details
Details for the file rakka_inference-0.2.1-cp313-cp313-win_amd64.whl.
File metadata
- Download URL: rakka_inference-0.2.1-cp313-cp313-win_amd64.whl
- Upload date:
- Size: 137.2 kB
- Tags: CPython 3.13, Windows x86-64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest | |
|---|---|---|
| SHA256 |
089b951d80f2130287d25cd91c2ba816c034f43c3cf1a12d6227e2a8b0a8e682
|
|
| MD5 |
81004257e4743572976fae37532955ba
|
|
| BLAKE2b-256 |
5036e5b738dc8861a2cd3d3e617ea8b6c63018e3315719084685d8a215ca1ccb
|
Provenance
The following attestation bundles were made for rakka_inference-0.2.1-cp313-cp313-win_amd64.whl:
Publisher:
release.yml on rustakka/rakka-inference
-
Statement:
-
Statement type:
https://in-toto.io/Statement/v1 -
Predicate type:
https://docs.pypi.org/attestations/publish/v1 -
Subject name:
rakka_inference-0.2.1-cp313-cp313-win_amd64.whl -
Subject digest:
089b951d80f2130287d25cd91c2ba816c034f43c3cf1a12d6227e2a8b0a8e682 - Sigstore transparency entry: 1436188842
- Sigstore integration time:
-
Permalink:
rustakka/rakka-inference@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Branch / Tag:
refs/tags/v0.2.3 - Owner: https://github.com/rustakka
-
Access:
public
-
Token Issuer:
https://token.actions.githubusercontent.com -
Runner Environment:
github-hosted -
Publication workflow:
release.yml@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Trigger Event:
push
-
Statement type:
File details
Details for the file rakka_inference-0.2.1-cp313-cp313-musllinux_1_2_x86_64.whl.
File metadata
- Download URL: rakka_inference-0.2.1-cp313-cp313-musllinux_1_2_x86_64.whl
- Upload date:
- Size: 444.9 kB
- Tags: CPython 3.13, musllinux: musl 1.2+ x86-64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest | |
|---|---|---|
| SHA256 |
54ec833f1a10af8e2cc3d52a51f34f23d41c900207bd3343076de53d99b28967
|
|
| MD5 |
5fb91c8ea41c2807dbe9906d46db05f8
|
|
| BLAKE2b-256 |
feadc7641a80c5d0215c95a2f14bb3a58a702a178386c1935af731f2f06850c2
|
Provenance
The following attestation bundles were made for rakka_inference-0.2.1-cp313-cp313-musllinux_1_2_x86_64.whl:
Publisher:
release.yml on rustakka/rakka-inference
-
Statement:
-
Statement type:
https://in-toto.io/Statement/v1 -
Predicate type:
https://docs.pypi.org/attestations/publish/v1 -
Subject name:
rakka_inference-0.2.1-cp313-cp313-musllinux_1_2_x86_64.whl -
Subject digest:
54ec833f1a10af8e2cc3d52a51f34f23d41c900207bd3343076de53d99b28967 - Sigstore transparency entry: 1436188879
- Sigstore integration time:
-
Permalink:
rustakka/rakka-inference@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Branch / Tag:
refs/tags/v0.2.3 - Owner: https://github.com/rustakka
-
Access:
public
-
Token Issuer:
https://token.actions.githubusercontent.com -
Runner Environment:
github-hosted -
Publication workflow:
release.yml@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Trigger Event:
push
-
Statement type:
File details
Details for the file rakka_inference-0.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.
File metadata
- Download URL: rakka_inference-0.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- Upload date:
- Size: 231.0 kB
- Tags: CPython 3.13, manylinux: glibc 2.17+ x86-64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest | |
|---|---|---|
| SHA256 |
7363a821cc64554be6387de7ba29e036bb798b7fd7a4035c684a775fa2112497
|
|
| MD5 |
7d259bdba22a163f68f043eb8943594c
|
|
| BLAKE2b-256 |
d1f8f4b760e7199da53a4067513b8f28aeb2bf3bfbfd009c736bfde71665256f
|
Provenance
The following attestation bundles were made for rakka_inference-0.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:
Publisher:
release.yml on rustakka/rakka-inference
-
Statement:
-
Statement type:
https://in-toto.io/Statement/v1 -
Predicate type:
https://docs.pypi.org/attestations/publish/v1 -
Subject name:
rakka_inference-0.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl -
Subject digest:
7363a821cc64554be6387de7ba29e036bb798b7fd7a4035c684a775fa2112497 - Sigstore transparency entry: 1436188878
- Sigstore integration time:
-
Permalink:
rustakka/rakka-inference@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Branch / Tag:
refs/tags/v0.2.3 - Owner: https://github.com/rustakka
-
Access:
public
-
Token Issuer:
https://token.actions.githubusercontent.com -
Runner Environment:
github-hosted -
Publication workflow:
release.yml@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Trigger Event:
push
-
Statement type:
File details
Details for the file rakka_inference-0.2.1-cp313-cp313-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl.
File metadata
- Download URL: rakka_inference-0.2.1-cp313-cp313-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl
- Upload date:
- Size: 401.9 kB
- Tags: CPython 3.13, macOS 10.12+ universal2 (ARM64, x86-64), macOS 10.12+ x86-64, macOS 11.0+ ARM64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest | |
|---|---|---|
| SHA256 |
155cb374d4e7c0d8657c4a1f39438ab0ac9350ada4f14d1da32ae63609136efb
|
|
| MD5 |
0ce2c6a6e33bc094f01f9d2a3652e9cb
|
|
| BLAKE2b-256 |
97d8933af6452cadd5440808c7ad2a3daa24df18272da247c1fa8a91aa8dbf46
|
Provenance
The following attestation bundles were made for rakka_inference-0.2.1-cp313-cp313-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl:
Publisher:
release.yml on rustakka/rakka-inference
-
Statement:
-
Statement type:
https://in-toto.io/Statement/v1 -
Predicate type:
https://docs.pypi.org/attestations/publish/v1 -
Subject name:
rakka_inference-0.2.1-cp313-cp313-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl -
Subject digest:
155cb374d4e7c0d8657c4a1f39438ab0ac9350ada4f14d1da32ae63609136efb - Sigstore transparency entry: 1436188847
- Sigstore integration time:
-
Permalink:
rustakka/rakka-inference@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Branch / Tag:
refs/tags/v0.2.3 - Owner: https://github.com/rustakka
-
Access:
public
-
Token Issuer:
https://token.actions.githubusercontent.com -
Runner Environment:
github-hosted -
Publication workflow:
release.yml@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Trigger Event:
push
-
Statement type:
File details
Details for the file rakka_inference-0.2.1-cp312-cp312-win_amd64.whl.
File metadata
- Download URL: rakka_inference-0.2.1-cp312-cp312-win_amd64.whl
- Upload date:
- Size: 137.2 kB
- Tags: CPython 3.12, Windows x86-64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest | |
|---|---|---|
| SHA256 |
14ac21810d11dd44e7465a5b22dc79252b2e9cd2ab04f7d1777392ed271e2ea5
|
|
| MD5 |
1c9015bf2715cf88d6f7ba06c4910aa3
|
|
| BLAKE2b-256 |
35fd3c0ea3b668f57593a48f7f36f6f7f374d081f12e4d491638adb43b08c6c7
|
Provenance
The following attestation bundles were made for rakka_inference-0.2.1-cp312-cp312-win_amd64.whl:
Publisher:
release.yml on rustakka/rakka-inference
-
Statement:
-
Statement type:
https://in-toto.io/Statement/v1 -
Predicate type:
https://docs.pypi.org/attestations/publish/v1 -
Subject name:
rakka_inference-0.2.1-cp312-cp312-win_amd64.whl -
Subject digest:
14ac21810d11dd44e7465a5b22dc79252b2e9cd2ab04f7d1777392ed271e2ea5 - Sigstore transparency entry: 1436188860
- Sigstore integration time:
-
Permalink:
rustakka/rakka-inference@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Branch / Tag:
refs/tags/v0.2.3 - Owner: https://github.com/rustakka
-
Access:
public
-
Token Issuer:
https://token.actions.githubusercontent.com -
Runner Environment:
github-hosted -
Publication workflow:
release.yml@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Trigger Event:
push
-
Statement type:
File details
Details for the file rakka_inference-0.2.1-cp312-cp312-musllinux_1_2_x86_64.whl.
File metadata
- Download URL: rakka_inference-0.2.1-cp312-cp312-musllinux_1_2_x86_64.whl
- Upload date:
- Size: 444.9 kB
- Tags: CPython 3.12, musllinux: musl 1.2+ x86-64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest | |
|---|---|---|
| SHA256 |
96f81c4dbc0f5082a0ff2203678e788ee624e2add3969a3716be4315e6ddb36e
|
|
| MD5 |
b0b7029ec8b84ecb7d0c5b3c0a706dcc
|
|
| BLAKE2b-256 |
3dcbfa2d6dcd542a494ad0a5d1156070bcc0d31dc67591426dd1a3d6dfc25b57
|
Provenance
The following attestation bundles were made for rakka_inference-0.2.1-cp312-cp312-musllinux_1_2_x86_64.whl:
Publisher:
release.yml on rustakka/rakka-inference
-
Statement:
-
Statement type:
https://in-toto.io/Statement/v1 -
Predicate type:
https://docs.pypi.org/attestations/publish/v1 -
Subject name:
rakka_inference-0.2.1-cp312-cp312-musllinux_1_2_x86_64.whl -
Subject digest:
96f81c4dbc0f5082a0ff2203678e788ee624e2add3969a3716be4315e6ddb36e - Sigstore transparency entry: 1436188855
- Sigstore integration time:
-
Permalink:
rustakka/rakka-inference@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Branch / Tag:
refs/tags/v0.2.3 - Owner: https://github.com/rustakka
-
Access:
public
-
Token Issuer:
https://token.actions.githubusercontent.com -
Runner Environment:
github-hosted -
Publication workflow:
release.yml@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Trigger Event:
push
-
Statement type:
File details
Details for the file rakka_inference-0.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.
File metadata
- Download URL: rakka_inference-0.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- Upload date:
- Size: 231.1 kB
- Tags: CPython 3.12, manylinux: glibc 2.17+ x86-64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest | |
|---|---|---|
| SHA256 |
843c52f2559ba0280d98a8ee8d8f7eeeb3333550d822509fe0e9715307aad0e4
|
|
| MD5 |
863c325d76d073dcba087633655364db
|
|
| BLAKE2b-256 |
0f842e5cf56423e9436f1dfbb94d912cdf83bb21fcb3d49f63ad9dc88b9df4eb
|
Provenance
The following attestation bundles were made for rakka_inference-0.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:
Publisher:
release.yml on rustakka/rakka-inference
-
Statement:
-
Statement type:
https://in-toto.io/Statement/v1 -
Predicate type:
https://docs.pypi.org/attestations/publish/v1 -
Subject name:
rakka_inference-0.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl -
Subject digest:
843c52f2559ba0280d98a8ee8d8f7eeeb3333550d822509fe0e9715307aad0e4 - Sigstore transparency entry: 1436188844
- Sigstore integration time:
-
Permalink:
rustakka/rakka-inference@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Branch / Tag:
refs/tags/v0.2.3 - Owner: https://github.com/rustakka
-
Access:
public
-
Token Issuer:
https://token.actions.githubusercontent.com -
Runner Environment:
github-hosted -
Publication workflow:
release.yml@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Trigger Event:
push
-
Statement type:
File details
Details for the file rakka_inference-0.2.1-cp312-cp312-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl.
File metadata
- Download URL: rakka_inference-0.2.1-cp312-cp312-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl
- Upload date:
- Size: 402.0 kB
- Tags: CPython 3.12, macOS 10.12+ universal2 (ARM64, x86-64), macOS 10.12+ x86-64, macOS 11.0+ ARM64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest | |
|---|---|---|
| SHA256 |
b0fdfb8a8199b6bb6c5cfbadc87cc188240e7927ad27b7fd90a461c95d259859
|
|
| MD5 |
9cd26690b2b3db68c8c401cb3d57d5cd
|
|
| BLAKE2b-256 |
4bbb9a5c0862a9017ab6a9e99e82d2c7b8f4eb497759b8bbd2cfac0c864f1007
|
Provenance
The following attestation bundles were made for rakka_inference-0.2.1-cp312-cp312-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl:
Publisher:
release.yml on rustakka/rakka-inference
-
Statement:
-
Statement type:
https://in-toto.io/Statement/v1 -
Predicate type:
https://docs.pypi.org/attestations/publish/v1 -
Subject name:
rakka_inference-0.2.1-cp312-cp312-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl -
Subject digest:
b0fdfb8a8199b6bb6c5cfbadc87cc188240e7927ad27b7fd90a461c95d259859 - Sigstore transparency entry: 1436188836
- Sigstore integration time:
-
Permalink:
rustakka/rakka-inference@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Branch / Tag:
refs/tags/v0.2.3 - Owner: https://github.com/rustakka
-
Access:
public
-
Token Issuer:
https://token.actions.githubusercontent.com -
Runner Environment:
github-hosted -
Publication workflow:
release.yml@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Trigger Event:
push
-
Statement type:
File details
Details for the file rakka_inference-0.2.1-cp311-cp311-win_amd64.whl.
File metadata
- Download URL: rakka_inference-0.2.1-cp311-cp311-win_amd64.whl
- Upload date:
- Size: 136.8 kB
- Tags: CPython 3.11, Windows x86-64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest | |
|---|---|---|
| SHA256 |
fda0972e0e29f0b401ce25db41330b5f8e16c0f0bb225a88e3ff06204fe66501
|
|
| MD5 |
e69278f81e3dd82b3ce02a1aa8fdc8c2
|
|
| BLAKE2b-256 |
1550588459a29654961c8e0415ed2649ddaf82e8c984cb1d29ed88c343692ad4
|
Provenance
The following attestation bundles were made for rakka_inference-0.2.1-cp311-cp311-win_amd64.whl:
Publisher:
release.yml on rustakka/rakka-inference
-
Statement:
-
Statement type:
https://in-toto.io/Statement/v1 -
Predicate type:
https://docs.pypi.org/attestations/publish/v1 -
Subject name:
rakka_inference-0.2.1-cp311-cp311-win_amd64.whl -
Subject digest:
fda0972e0e29f0b401ce25db41330b5f8e16c0f0bb225a88e3ff06204fe66501 - Sigstore transparency entry: 1436188850
- Sigstore integration time:
-
Permalink:
rustakka/rakka-inference@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Branch / Tag:
refs/tags/v0.2.3 - Owner: https://github.com/rustakka
-
Access:
public
-
Token Issuer:
https://token.actions.githubusercontent.com -
Runner Environment:
github-hosted -
Publication workflow:
release.yml@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Trigger Event:
push
-
Statement type:
File details
Details for the file rakka_inference-0.2.1-cp311-cp311-musllinux_1_2_x86_64.whl.
File metadata
- Download URL: rakka_inference-0.2.1-cp311-cp311-musllinux_1_2_x86_64.whl
- Upload date:
- Size: 444.5 kB
- Tags: CPython 3.11, musllinux: musl 1.2+ x86-64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest | |
|---|---|---|
| SHA256 |
8fc1765d65781a76332c568f1e44a15551981dc5ada824bacb2b47c9499cb3e5
|
|
| MD5 |
20fe724a8a058243ef018fc939879c8d
|
|
| BLAKE2b-256 |
791fe5eda6f6222957e8a488c04643267eda0c9795e81a08e3e737dd638b9d4a
|
Provenance
The following attestation bundles were made for rakka_inference-0.2.1-cp311-cp311-musllinux_1_2_x86_64.whl:
Publisher:
release.yml on rustakka/rakka-inference
-
Statement:
-
Statement type:
https://in-toto.io/Statement/v1 -
Predicate type:
https://docs.pypi.org/attestations/publish/v1 -
Subject name:
rakka_inference-0.2.1-cp311-cp311-musllinux_1_2_x86_64.whl -
Subject digest:
8fc1765d65781a76332c568f1e44a15551981dc5ada824bacb2b47c9499cb3e5 - Sigstore transparency entry: 1436188873
- Sigstore integration time:
-
Permalink:
rustakka/rakka-inference@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Branch / Tag:
refs/tags/v0.2.3 - Owner: https://github.com/rustakka
-
Access:
public
-
Token Issuer:
https://token.actions.githubusercontent.com -
Runner Environment:
github-hosted -
Publication workflow:
release.yml@b10d162253b95f5c8e3f7a0834b02f5b91c805f3 -
Trigger Event:
push
-
Statement type:
File details
Details for the file rakka_inference-0.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.
File metadata
- Download URL: rakka_inference-0.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- Size: 230.5 kB
- Tags: CPython 3.11, manylinux: glibc 2.17+ x86-64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | e550de38498dfa3b0be2a8b64746233cf728629329a59590b497d75f09318dda |
| MD5 | 24b3cc0260f2420ac63424d4d0207232 |
| BLAKE2b-256 | ec56d607ce08cdd46190cb1d5689dfdb3fceca2ac8351fba9ced021c4bf0f661 |
Provenance
The following attestation bundles were made for rakka_inference-0.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:
Publisher: release.yml on rustakka/rakka-inference
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: rakka_inference-0.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- Subject digest: e550de38498dfa3b0be2a8b64746233cf728629329a59590b497d75f09318dda
- Sigstore transparency entry: 1436188866
- Permalink: rustakka/rakka-inference@b10d162253b95f5c8e3f7a0834b02f5b91c805f3
- Branch / Tag: refs/tags/v0.2.3
- Owner: https://github.com/rustakka
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@b10d162253b95f5c8e3f7a0834b02f5b91c805f3
- Trigger Event: push
File details
Details for the file rakka_inference-0.2.1-cp311-cp311-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl.
File metadata
- Download URL: rakka_inference-0.2.1-cp311-cp311-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl
- Size: 401.2 kB
- Tags: CPython 3.11, macOS 10.12+ universal2 (ARM64, x86-64), macOS 10.12+ x86-64, macOS 11.0+ ARM64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | c6847444dcd595646cb1370c2e174029ac93e15822b5e162b481539727b0238d |
| MD5 | f4a6517baa2fe8a8c512c4a55fadee5f |
| BLAKE2b-256 | 9f906fc9cec2571bf8cbae8865c1cb1834674b710643d51a53747f97bfc83e79 |
Provenance
The following attestation bundles were made for rakka_inference-0.2.1-cp311-cp311-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl:
Publisher: release.yml on rustakka/rakka-inference
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: rakka_inference-0.2.1-cp311-cp311-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl
- Subject digest: c6847444dcd595646cb1370c2e174029ac93e15822b5e162b481539727b0238d
- Sigstore transparency entry: 1436188832
- Permalink: rustakka/rakka-inference@b10d162253b95f5c8e3f7a0834b02f5b91c805f3
- Branch / Tag: refs/tags/v0.2.3
- Owner: https://github.com/rustakka
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@b10d162253b95f5c8e3f7a0834b02f5b91c805f3
- Trigger Event: push
File details
Details for the file rakka_inference-0.2.1-cp310-cp310-win_amd64.whl.
File metadata
- Download URL: rakka_inference-0.2.1-cp310-cp310-win_amd64.whl
- Size: 136.9 kB
- Tags: CPython 3.10, Windows x86-64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 07a7f136902ed4d8322ebcf5640590b3fa79955ea5296eee7cdbcfb11d9f689e |
| MD5 | 8f12a4e77c56d08e565493592b3f9774 |
| BLAKE2b-256 | 214c65ce679dd7217d533a3138fe534a34ceed74403e0fa91145929b4fa10bc7 |
Provenance
The following attestation bundles were made for rakka_inference-0.2.1-cp310-cp310-win_amd64.whl:
Publisher: release.yml on rustakka/rakka-inference
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: rakka_inference-0.2.1-cp310-cp310-win_amd64.whl
- Subject digest: 07a7f136902ed4d8322ebcf5640590b3fa79955ea5296eee7cdbcfb11d9f689e
- Sigstore transparency entry: 1436188872
- Permalink: rustakka/rakka-inference@b10d162253b95f5c8e3f7a0834b02f5b91c805f3
- Branch / Tag: refs/tags/v0.2.3
- Owner: https://github.com/rustakka
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@b10d162253b95f5c8e3f7a0834b02f5b91c805f3
- Trigger Event: push
File details
Details for the file rakka_inference-0.2.1-cp310-cp310-musllinux_1_2_x86_64.whl.
File metadata
- Download URL: rakka_inference-0.2.1-cp310-cp310-musllinux_1_2_x86_64.whl
- Size: 444.5 kB
- Tags: CPython 3.10, musllinux: musl 1.2+ x86-64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | db6087c4843dfde44656c3cd051ae607bb88b87da312d4ffd022049c20556726 |
| MD5 | 70d649749d10145132371be442f89fc2 |
| BLAKE2b-256 | 9ba371bc0f4f1c011f2decd4f5b425930ac7b494d609daa429d6ffe554628133 |
Provenance
The following attestation bundles were made for rakka_inference-0.2.1-cp310-cp310-musllinux_1_2_x86_64.whl:
Publisher: release.yml on rustakka/rakka-inference
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: rakka_inference-0.2.1-cp310-cp310-musllinux_1_2_x86_64.whl
- Subject digest: db6087c4843dfde44656c3cd051ae607bb88b87da312d4ffd022049c20556726
- Sigstore transparency entry: 1436188840
- Permalink: rustakka/rakka-inference@b10d162253b95f5c8e3f7a0834b02f5b91c805f3
- Branch / Tag: refs/tags/v0.2.3
- Owner: https://github.com/rustakka
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@b10d162253b95f5c8e3f7a0834b02f5b91c805f3
- Trigger Event: push
File details
Details for the file rakka_inference-0.2.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.
File metadata
- Download URL: rakka_inference-0.2.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- Size: 230.6 kB
- Tags: CPython 3.10, manylinux: glibc 2.17+ x86-64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 5738c72ddc4e61045d9288a6cc87c07844d55ae639c602d44a3a8b1f52c46b10 |
| MD5 | b9aa96365fddc6fba72d53e1cc53196d |
| BLAKE2b-256 | fcc5a0c042bba523c97443c73ce737927cb8ccf3d5e73bc7a79d38573307704d |
Provenance
The following attestation bundles were made for rakka_inference-0.2.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl:
Publisher: release.yml on rustakka/rakka-inference
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: rakka_inference-0.2.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- Subject digest: 5738c72ddc4e61045d9288a6cc87c07844d55ae639c602d44a3a8b1f52c46b10
- Sigstore transparency entry: 1436188864
- Permalink: rustakka/rakka-inference@b10d162253b95f5c8e3f7a0834b02f5b91c805f3
- Branch / Tag: refs/tags/v0.2.3
- Owner: https://github.com/rustakka
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@b10d162253b95f5c8e3f7a0834b02f5b91c805f3
- Trigger Event: push
File details
Details for the file rakka_inference-0.2.1-cp310-cp310-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl.
File metadata
- Download URL: rakka_inference-0.2.1-cp310-cp310-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl
- Size: 401.4 kB
- Tags: CPython 3.10, macOS 10.12+ universal2 (ARM64, x86-64), macOS 10.12+ x86-64, macOS 11.0+ ARM64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | e78a6085a39a9b421f799d447dd5b2cb5cf912661764cab1bc595990ea7cefbb |
| MD5 | cf62292c90c236c6a5b4334057f5beda |
| BLAKE2b-256 | d94e0d0e5adf4f90bc86d1ecf34bd40eb93ef21530d928a52d005c354171c28a |
Provenance
The following attestation bundles were made for rakka_inference-0.2.1-cp310-cp310-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl:
Publisher: release.yml on rustakka/rakka-inference
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: rakka_inference-0.2.1-cp310-cp310-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl
- Subject digest: e78a6085a39a9b421f799d447dd5b2cb5cf912661764cab1bc595990ea7cefbb
- Sigstore transparency entry: 1436188834
- Permalink: rustakka/rakka-inference@b10d162253b95f5c8e3f7a0834b02f5b91c805f3
- Branch / Tag: refs/tags/v0.2.3
- Owner: https://github.com/rustakka
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@b10d162253b95f5c8e3f7a0834b02f5b91c805f3
- Trigger Event: push