zccache
A high-performance local compiler cache daemon: a blazing fast compiler cache for C/C++ and Rust.
Inspired by sccache, but optimized for local-first use with aggressive file metadata caching and filesystem watching.
Quick Install
curl -LsSf https://github.com/zackees/zccache/releases/latest/download/install.sh | sh
powershell -ExecutionPolicy Bypass -c "irm https://github.com/zackees/zccache/releases/latest/download/install.ps1 | iex"
Verify:
zccache --version
Performance
50 files per benchmark, median of 5 trials. Run it yourself: ./perf
Cache Hit (warm cache)
| Benchmark | Bare Compiler | sccache | zccache | vs sccache | vs bare |
|---|---|---|---|---|---|
| C++ single-file | 11.705s | 1.576s | 0.050s | 32x | 236x |
| C++ multi-file | 11.553s | 11.530s | 0.017s | 695x | 696x |
| C++ response-file (single) | 12.540s | 1.558s | 0.047s | 33x | 267x |
| C++ response-file (multi) | 12.049s | 12.434s | 0.019s | 669x | 648x |
| Rust build | 6.592s | 8.604s | 0.045s | 193x | 148x |
| Rust check | 3.716s | 5.922s | 0.049s | 121x | 76x |
Cache Miss (cold compile)
| Benchmark | Bare Compiler | sccache | zccache | vs sccache | vs bare |
|---|---|---|---|---|---|
| C++ single-file | 12.641s | 20.632s | 13.430s | 1.5x | 0.9x |
| C++ multi-file | 11.358s | 11.759s | 12.867s | 0.9x | 0.9x |
| C++ response-file (single) | 12.063s | 20.607s | 14.087s | 1.5x | 0.9x |
| C++ response-file (multi) | 13.030s | 25.303s | 13.975s | 1.8x | 0.9x |
| Rust build | 7.119s | 10.023s | 8.507s | 1.2x | 0.8x |
| Rust check | 4.289s | 7.056s | 5.060s | 1.4x | 0.8x |
Benchmark details
- Single-file = 50 sequential `clang++ -c unit.cpp` invocations
- Multi-file = one `clang++ -c *.cpp` invocation (sccache cannot cache these — its "warm" time is a full recompile)
- Response-file = args passed via nested `.rsp` files: 200 `-D` defines + 50 `-I` paths + 30 warning flags (~283 expanded args)
- Rust build = `--emit=dep-info,metadata,link` (cargo build)
- Rust check = `--emit=dep-info,metadata` (cargo check)
- Cold = first compile (empty cache). Warm = median of 5 subsequent runs.
- sccache gets cache hits, but each hit still costs ~170 ms of subprocess overhead; zccache serves hits in ~1 ms via in-process IPC.
Why is zccache so much faster on warm hits?
The difference comes from architecture, not better caching:
| | sccache | zccache |
|---|---|---|
| IPC model | Subprocess per invocation (fork + exec + connect) | Persistent daemon, single IPC message per compile |
| Cache lookup | Client hashes inputs, sends to server, server checks disk | Daemon has inputs in memory (file watcher + metadata cache) |
| On hit | Server reads artifact from disk, sends back via IPC | Daemon hardlinks cached file to output path (1 syscall) |
| Multi-file | Compiles every file (no multi-file cache support) | Parallel per-file cache lookups, only misses go to the compiler |
| Per-hit cost | ~170ms (process spawn + hash + disk I/O + IPC) | ~1ms (in-memory lookup + hardlink) |
Architecture enhancements that make the difference:
- Filesystem watcher — a background `notify` watcher tracks file changes in real time, so the daemon already knows whether inputs are dirty before you even invoke a compile. No redundant stat/hash work on a hit.
- In-memory metadata cache — file sizes, mtimes, and content hashes live in a lock-free `DashMap`. Cache key computation is a memory lookup, not disk I/O.
- Single-roundtrip IPC — each compile is one length-prefixed bincode message over a Unix socket (or named pipe on Windows). No subprocess spawning, no repeated handshakes.
- Hardlink delivery — cache hits are served by hardlinking the cached artifact to the output path: a single syscall instead of reading and writing the file contents.
- Multi-file fast path — when a build system passes N source files in one invocation, zccache checks all N against the cache in parallel, serves hits immediately, and batches only the misses into a single compiler process.
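The hardlink delivery step above can be sketched in a few lines. This is an illustrative Python sketch of the idea, not zccache's actual (Rust) implementation:

```python
import os

def serve_hit(cached_artifact: str, output_path: str) -> None:
    # Serve a cache hit by hard-linking the cached artifact to the
    # requested output path: one link() syscall, no data copy.
    if os.path.exists(output_path):
        os.remove(output_path)  # drop any stale object file first
    os.link(cached_artifact, output_path)
```

Because both paths now reference the same inode, the "copy" is constant-time regardless of artifact size (which is why warm hits stay at ~1 ms).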
Broader tool coverage — zccache supports modes that other compiler caches don't:
| Mode | Description |
|---|---|
| Multi-file compilation | clang++ -c a.cpp b.cpp c.cpp — per-file caching with parallel lookups |
| Response files | Nested .rsp files with hundreds of flags — fully expanded and cached |
| clang-tidy | Static analysis results cached and replayed |
| include-what-you-use | IWYU output cached per translation unit |
| Emscripten (emcc/em++) | WebAssembly compilation cached end-to-end |
| wasm-ld | WebAssembly linking cached |
| rustfmt | Formatting results cached |
| clippy | Lint results cached |
| Rust check & build | cargo check and cargo build with extern crate content hashing |
Install
curl -LsSf https://github.com/zackees/zccache/releases/latest/download/install.sh | sh
powershell -ExecutionPolicy Bypass -c "irm https://github.com/zackees/zccache/releases/latest/download/install.ps1 | iex"
This installs the standalone native Rust binaries (zccache, zccache-daemon,
and zccache-fp) directly from GitHub Releases.
Default install locations:
- Linux/macOS user install: `~/.local/bin`
- Linux/macOS global install: `/usr/local/bin`
- Windows user install: `%USERPROFILE%\.local\bin`
- Windows global install: `%ProgramFiles%\zccache\bin`
Global install examples:
curl -LsSf https://github.com/zackees/zccache/releases/latest/download/install.sh | sudo sh -s -- --global
powershell -ExecutionPolicy Bypass -c "$env:ZCCACHE_INSTALL_MODE='global'; irm https://github.com/zackees/zccache/releases/latest/download/install.ps1 | iex"
Each GitHub release also publishes standalone per-platform archives:
- Linux: `zccache-vX.Y.Z-x86_64-unknown-linux-musl.tar.gz`, `zccache-vX.Y.Z-aarch64-unknown-linux-musl.tar.gz`
- macOS: `zccache-vX.Y.Z-x86_64-apple-darwin.tar.gz`, `zccache-vX.Y.Z-aarch64-apple-darwin.tar.gz`
- Windows: `zccache-vX.Y.Z-x86_64-pc-windows-msvc.zip`, `zccache-vX.Y.Z-aarch64-pc-windows-msvc.zip`
PyPI remains available if you prefer pip install zccache; those wheels also install
the native binaries directly onto your PATH. Pre-built wheels are available for:
| Platform | Architecture |
|---|---|
| Linux | x86_64, aarch64 |
| macOS | x86_64, Apple Silicon |
| Windows | x86_64 |
Verify the install:
zccache --version
Rust crates are also published on crates.io. The main installable/runtime crates are:
`zccache-cli`, `zccache-daemon`, `zccache-core`, `zccache-hash`, `zccache-protocol`, `zccache-fscache`, `zccache-artifact`
Use it as a drop-in replacement for sccache — just substitute zccache:
Integration Summary
RUSTC_WRAPPER=zccache cargo build
export CC="zccache clang"
export CXX="zccache clang++"
- Rust: set `RUSTC_WRAPPER=zccache` or add `rustc-wrapper = "zccache"` to `.cargo/config.toml`.
- Bash: export `RUSTC_WRAPPER`, `CC`, and `CXX` once in your shell or CI environment.
- Python: pass `RUSTC_WRAPPER`, `CC`, and `CXX` through the `subprocess` env when invoking `cargo` or `clang`.
- First commands to check: `zccache --version`, `zccache start`, `zccache status`.
Rust zccache integration
Use zccache as Cargo's compiler wrapper:
# one-off invocation
RUSTC_WRAPPER=zccache cargo build
RUSTC_WRAPPER=zccache cargo check
# optional: start the daemon explicitly
zccache start
Add to .cargo/config.toml for automatic use:
[build]
rustc-wrapper = "zccache"
Recommended project-local config:
[build]
rustc-wrapper = "zccache"
[env]
ZCCACHE_DIR = { value = "/tmp/.zccache", force = false }
Supports --emit=metadata (cargo check), --emit=dep-info,metadata,link (cargo build),
extern crate content hashing, and cacheable crate types such as lib, rlib,
and staticlib. Proc-macro and binary crates are passed through without caching,
matching the usual sccache behavior.
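That caching policy amounts to a simple membership check on the crate type. The helper below is hypothetical (its name and shape are not from zccache); the type sets come from the paragraph above:

```python
# Cacheable crate types per the policy described above; proc-macro and
# binary crates are passed through to rustc uncached.
CACHEABLE_CRATE_TYPES = {"lib", "rlib", "staticlib"}

def is_cacheable(crate_type: str) -> bool:
    # Hypothetical helper: decide whether a rustc invocation's output
    # should be stored in (and served from) the cache.
    return crate_type in CACHEABLE_CRATE_TYPES
```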
Useful Rust workflow commands:
# inspect status
zccache status
# clear local cache
zccache clear
# validate wrapper is active
RUSTC_WRAPPER=zccache cargo clean
RUSTC_WRAPPER=zccache cargo check
zccache status
Bash integration
For shell-driven builds, export the wrapper once in your session or CI step:
export RUSTC_WRAPPER=zccache
export CC="zccache clang"
export CXX="zccache clang++"
zccache start
cargo build
ninja
If you want this active in interactive shells, add it to ~/.bashrc:
export RUSTC_WRAPPER=zccache
export PATH="$HOME/.local/bin:$PATH"
For per-build stats in Bash:
eval "$(zccache session-start --stats)"
cargo build
zccache session-end "$ZCCACHE_SESSION_ID"
Python integration
Python projects can use zccache when invoking Rust or C/C++ toolchains through
subprocess, build backends, or extension-module builds.
import os
import subprocess
env = os.environ.copy()
env["RUSTC_WRAPPER"] = "zccache"
env["CC"] = "zccache clang"
env["CXX"] = "zccache clang++"
subprocess.run(["cargo", "build", "--release"], check=True, env=env)
This is useful for:
- `setuptools-rust`
- `maturin`
- `scikit-build-core`
- custom Python build/test harnesses that shell out to `cargo`, `clang`, or `clang++`
Example with maturin:
RUSTC_WRAPPER=zccache maturin build
Example with Python driving cargo check:
subprocess.run(["cargo", "check"], check=True, env=env)
C/C++ build system integration (ninja, meson, cmake, make)
zccache is a drop-in compiler wrapper. Point your build system's compiler
at zccache <real-compiler> and it handles the rest:
# meson native file
[binaries]
c = ['zccache', '/usr/bin/clang']
cpp = ['zccache', '/usr/bin/clang++']
# CMake
set(CMAKE_C_COMPILER_LAUNCHER zccache)
set(CMAKE_CXX_COMPILER_LAUNCHER zccache)
The first build (cold cache) runs at near-bare speed. Subsequent rebuilds
(ninja -t clean && ninja, or touching source files) serve cached artifacts
via hardlinks in under a second.
Single-roundtrip IPC: In drop-in mode, zccache sends a single
CompileEphemeral message that combines session creation, compilation, and
session teardown — eliminating 2 of 3 IPC roundtrips per invocation.
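The single-message framing can be sketched as follows. The 4-byte little-endian length prefix here is an assumption for illustration; the real payload is bincode-encoded and the actual header layout may differ:

```python
import struct

def frame(payload: bytes) -> bytes:
    # Prefix the payload with its length so the peer can read exactly one
    # message per compile, with no extra handshake roundtrips.
    return struct.pack("<I", len(payload)) + payload

def read_frame(buf: bytes) -> bytes:
    # Read the length header, then slice out exactly one message.
    (length,) = struct.unpack_from("<I", buf)
    return buf[4:4 + length]
```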
Session stats: Track hit rates per-build with --stats:
eval $(zccache session-start --stats --log build.log)
export ZCCACHE_SESSION_ID=...
# ... build runs ...
zccache session-stats $ZCCACHE_SESSION_ID # query mid-build
zccache session-end $ZCCACHE_SESSION_ID # final stats
Persistent cache: Artifacts are stored in ~/.zccache/artifacts/
and survive daemon restarts. No need to re-warm the cache after a reboot.
Compile journal (build replay): Every compile and link command is recorded
to ~/.zccache/logs/compile_journal.jsonl as a JSONL file with enough
detail to replay the entire build:
{"ts":"2026-03-17T10:30:00.123Z","outcome":"hit","compiler":"/usr/bin/clang++","args":["-c","foo.cpp","-o","foo.o"],"cwd":"/project/build","env":[["CC","clang"]],"exit_code":0,"session_id":"uuid","latency_ns":1234567}
Fields: ts (ISO 8601 UTC), outcome (hit/miss/error/link_hit/link_miss),
compiler (full path), args (full argument list), cwd, env (omitted when
inheriting daemon env), exit_code, session_id (null for ephemeral),
latency_ns (wall-clock nanoseconds). One JSON object per line — pipe through
jq to filter, or replay builds by extracting compiler + args + cwd.
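For example, a journal entry can be turned back into a runnable command. This sketch parses the example line shown above:

```python
import json

# One line from the compile journal (the example entry from this README).
line = (
    '{"ts":"2026-03-17T10:30:00.123Z","outcome":"hit",'
    '"compiler":"/usr/bin/clang++","args":["-c","foo.cpp","-o","foo.o"],'
    '"cwd":"/project/build","env":[["CC","clang"]],"exit_code":0,'
    '"session_id":"uuid","latency_ns":1234567}'
)
entry = json.loads(line)

# Rebuild the original command line from compiler + args.
command = [entry["compiler"], *entry["args"]]
# subprocess.run(command, cwd=entry["cwd"]) would replay this compile.
```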
Per-session compile journal: Pass --journal <path> to session-start to
write a dedicated JSONL log containing only the commands from that session.
The path must end in .jsonl:
result=$(zccache session-start --journal build.jsonl)
session_id=$(echo "$result" | jq -r .session_id)
export ZCCACHE_SESSION_ID=$session_id
# ... build runs ...
# Inspect this session's commands only (no noise from other sessions)
jq . build.jsonl
zccache session-end $session_id
The session journal uses the same JSONL schema as the global journal. Entries
are written to both the global and session journals simultaneously. The session
file handle is released when session-end is called.
Multi-file compilation (fast path)
When a build system passes multiple source files to a single compiler invocation
(e.g. gcc -c a.cpp b.cpp c.cpp -o ...), zccache treats this as a fast path:
- Each source file is checked against the cache in parallel.
- Cache hits are served immediately — their `.o` files are written from the cache.
- Remaining cache misses are batched into a single compiler process, preserving the compiler's own process-reuse and memory-sharing benefits.
- The outputs of the batched compilation are cached individually for future hits.
This hybrid approach means the first build populates the cache per-file, and subsequent builds serve as many files as possible from cache while still letting the compiler handle misses efficiently in bulk.
Recommendation: Configure your build system to pass multiple source files per compiler invocation whenever possible. This gives zccache the best opportunity to parallelize cache lookups and minimize compiler launches.
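The hit/miss partitioning can be sketched like this; the cache here is a plain dict stand-in, not zccache's artifact store:

```python
from concurrent.futures import ThreadPoolExecutor

def plan_batch(sources, cache):
    # Look up every source file against the cache in parallel.
    with ThreadPoolExecutor() as pool:
        artifacts = dict(zip(sources, pool.map(cache.get, sources)))
    # Hits are served immediately; misses are batched into one compiler run.
    hits = {s: a for s, a in artifacts.items() if a is not None}
    misses = [s for s, a in artifacts.items() if a is None]
    return hits, misses
```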
Concurrency
The daemon uses lock-free concurrent data structures (DashMap) for artifact and metadata lookups, so parallel compilation requests from multiple build workers never serialize on a global lock.
Status
Early development — architecture and scaffolding phase.
Goals
- Extremely fast on local machines (daemon keeps caches warm)
- Portable across Linux, macOS, and Windows
- Correct under heavy parallel compilation (no stale cache hits)
- Simple deployment (single binary)
Tool Compatibility
zccache works as a drop-in wrapper for these compilers and tools:
- Clang Toolchain: clang, clang-tidy, IWYU
- Emscripten / WebAssembly: emcc, wasm-ld
- Rust Toolchain: rustc, rustfmt, clippy
Architecture
See docs/ARCHITECTURE.md for the full system design.
Key components
| Crate | Purpose |
|---|---|
| `zccache-cli` | Command-line interface (`zccache` binary) |
| `zccache-daemon` | Daemon process (IPC server, orchestration) |
| `zccache-core` | Shared types, errors, config, path utilities |
| `zccache-protocol` | IPC message types and serialization |
| `zccache-ipc` | Transport layer (Unix sockets / named pipes) |
| `zccache-hash` | blake3 hashing and cache key computation |
| `zccache-fscache` | In-memory file metadata cache |
| `zccache-artifact` | Disk-backed artifact store with redb index |
| `zccache-watcher` | File watcher subsystem: daemon notify pipeline plus Rust-backed Python watcher bindings |
| `zccache-compiler` | Compiler detection and argument parsing |
| `zccache-test-support` | Test utilities and fixtures |
Building
cargo build --workspace
Testing
cargo test --workspace
Documentation
Watcher APIs
zccache exposes watcher-related APIs in three different places, depending on how you want to consume change detection:
- CLI: `zccache fp ...` for daemon-backed fingerprint checks in scripts and CI
- Python: `zccache.watcher` for cross-platform, library-style file watching
- Rust: `zccache-watcher` for the daemon-facing watcher pipeline primitives
CLI API
The CLI watcher entrypoint is the fingerprint API. It answers "should I rerun?" by consulting the daemon's in-memory watch state and cached file fingerprints.
zccache fp --cache-file .cache/headers.json check \
--root . \
--include '**/*.cpp' \
--include '**/*.h' \
--exclude build \
--exclude .git
Exit codes:
- `0`: files changed, run the expensive step
- `1`: no changes detected, skip the step
After a successful or failed run, update the daemon's watch state:
zccache fp --cache-file .cache/headers.json mark-success
zccache fp --cache-file .cache/headers.json mark-failure
zccache fp --cache-file .cache/headers.json invalidate
The fingerprint API is the best fit for shell scripts, CI jobs, and build steps that only need a yes/no change answer rather than a stream of file events.
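For example, a Python harness could gate a step on the check's exit code. This is a sketch; the include pattern and cache file below are placeholders:

```python
import subprocess

def should_run(returncode: int) -> bool:
    # Exit codes per the README: 0 = files changed (run the step),
    # 1 = no changes detected (skip it).
    return returncode == 0

def sources_changed(cache_file: str = ".cache/headers.json") -> bool:
    # Ask the daemon whether any watched C++ sources changed.
    result = subprocess.run(
        ["zccache", "fp", "--cache-file", cache_file, "check",
         "--root", ".", "--include", "**/*.cpp"],
    )
    return should_run(result.returncode)
```

After the expensive step, call `zccache fp ... mark-success` (or `mark-failure`) as shown above to update the daemon's watch state.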
Python API
pip install zccache now exposes an importable zccache module in addition to
the native binaries. The Python surface is aimed at the same hot-path features
the CLI already exposes: watcher events, fingerprint decisions, daemon/session
control, downloads, and Arduino .ino conversion.
from zccache.client import ZcCacheClient
from zccache.fingerprint import FingerprintCache
from zccache.ino import convert_ino
from zccache.watcher import watch_files
client = ZcCacheClient()
client.start()
fp = FingerprintCache(".cache/watch.json")
decision = fp.check(
root=".",
include=["**/*.cpp", "**/*.hpp", "**/*.ino"],
exclude=["**/.build/**", "**/fastled_js/**"],
)
if decision.should_run:
convert_ino("Blink.ino", "build/Blink.ino.cpp")
fp.mark_success()
The watcher API remains polling- and callback-friendly, while the backend runs the filesystem scan loop in Rust and only crosses into Python when delivering events.
from zccache.watcher import watch_files
watcher = watch_files(
".",
include_folders=["src", "include"],
include_globs=["src/**/*.cpp", "include/**/*.h"],
exclude_globs=["build", "dist/**", ".git"],
debounce_seconds=0.2,
poll_interval=0.1,
)
event = watcher.poll(timeout=1.0)
if event is not None:
print(event.paths)
watcher.stop()
For explicit lifecycle control, use the class API:
from zccache.watcher import FileWatcher
watcher = FileWatcher(".", include_globs=["**/*.cpp"], autostart=False)
watcher.start()
event = watcher.poll(timeout=1.0)
watcher.stop()
watcher.resume()
watcher.stop()
Python watcher features:
- `include_folders` to narrow the scan roots
- `include_globs` to include only matching files
- `exclude_globs` / `excluded_patterns` to skip directories or files
- `debounce_seconds` to coalesce bursts of edits
- optional `notification_predicate` applied at Python delivery time
- callback API plus polling API
- explicit `start()`, `stop()`, `resume()`, and context-manager support
Daemon/session control is also available without shelling out per call:
from zccache.client import ZcCacheClient
client = ZcCacheClient()
client.start()
session = client.session_start(cwd=".", track_stats=True)
stats = client.session_stats(session.session_id)
client.session_end(session.session_id)
And fingerprint state can be managed directly from Python:
from zccache.fingerprint import FingerprintCache
fp = FingerprintCache(".cache/lint.json", cache_type="two-layer")
decision = fp.check(root=".", include=["**/*.cpp"], exclude=["**/.build/**"])
if decision.should_run:
fp.mark_success()
Compatibility wrappers used by fastled-wasm are also available:
`FileWatcherProcess`, `DebouncedFileWatcherProcess`, `watch_files`, `FileWatcher`
See crates/zccache-watcher/README.md for the full Python watcher surface.
Rust API
For Rust consumers, the public watcher crate is zccache-watcher.
It now exposes both the daemon-facing watcher pipeline and a library-style
polling watcher API:
- `PollingWatcherConfig`
- `PollingWatcher`
- `PollWatchBatch`
- `PollWatchObserver`
- `IgnoreFilter` for directory-name-based filtering
- `NotifyWatcher` for `notify`-backed OS watch registration
- `SettleBuffer` and `SettledEvent` for burst coalescing
- `OverflowRecovery` for overflow-driven rescan scheduling
- `WatchEvent` and `WatcherConfig` for event/config plumbing
Example:
use std::time::Duration;
use zccache_watcher::{PollingWatcher, PollingWatcherConfig};
let mut config = PollingWatcherConfig::new(".");
config.include_globs = vec!["**/*.cpp".to_string()];
config.poll_interval = Duration::from_millis(50);
config.debounce = Duration::from_millis(50);
let watcher = PollingWatcher::new(config)?;
watcher.start()?;
let batch = watcher.poll_timeout(Duration::from_secs(1))?;
watcher.stop()?;
Downloader APIs
zccache also exposes the dedicated download subsystem in three places:
- CLI: `zccache download ...` on the main binary, plus the standalone `zccache-download` tool
- Python: `zccache.downloader.DownloadApi`
- Rust: `zccache-download-client` for the client API and `zccache-download` for shared download types
The downloader daemon is separate from the compiler-cache daemon. It is meant for long-lived artifact downloads, deterministic cache paths, optional unarchiving, and attach/wait/status flows from multiple clients.
Downloader CLI
The main zccache binary includes a simple download subcommand:
zccache download \
https://example.com/toolchain.tar.zst \
--unarchive .cache/toolchain \
--sha256 0123456789abcdef \
--multipart-parts 8
That path blocks until the artifact is ready and prints the resolved cache path, SHA-256, and optional unarchive destination.
For daemon lifecycle control, attach/wait/status operations, JSON output, and explicit archive-format selection, use the standalone downloader CLI:
zccache-download daemon start
zccache-download fetch \
https://example.com/toolchain.tar.zst \
.cache/downloads/toolchain.tar.zst \
--expanded .cache/toolchain \
--archive-format tar.zst \
--max-connections 8
zccache-download exists \
https://example.com/toolchain.tar.zst \
.cache/downloads/toolchain.tar.zst
zccache-download --json daemon status
Additional standalone subcommands:
- `get` to attach to a raw download handle
- `wait`, `status`, and `cancel` for handle lifecycle operations
- `daemon stop` to shut the download daemon down explicitly
Python Downloader API
pip install zccache exposes the downloader as zccache.downloader.
from zccache.downloader import DownloadApi
api = DownloadApi()
api.start()
result = api.download(
source_url="https://example.com/toolchain.tar.zst",
destination=".cache/downloads/toolchain.tar.zst",
expanded=".cache/toolchain",
archive_format="tar.zst",
multipart_parts=8,
)
print(result.status, result.sha256, result.expanded_path)
state = api.exists(
source_url="https://example.com/toolchain.tar.zst",
destination=".cache/downloads/toolchain.tar.zst",
)
print(state.kind, state.reason)
If you need attach/wait/status semantics instead of a blocking fetch call, use
DownloadApi.attach(...) and operate on the returned DownloadHandle:
from zccache.downloader import DownloadApi
api = DownloadApi()
with api.attach(
source_url="https://example.com/toolchain.tar.zst",
destination=".cache/downloads/toolchain.tar.zst",
max_connections=8,
) as handle:
status = handle.wait(timeout_ms=1_000)
print(handle.download_id, status.phase, status.downloaded_bytes)
The Python downloader surface includes:
- `DownloadApi.start()`, `stop()`, and `daemon_status()`
- `DownloadApi.download()` / `fetch()` for blocking or non-blocking fetches
- `DownloadApi.exists()` for cache-state checks
- `DownloadApi.attach()` plus `DownloadHandle.status()`, `wait()`, and `cancel()`
Rust Downloader API
For Rust code, use zccache-download-client as the entrypoint and
zccache-download for shared status and option types.
use std::path::PathBuf;
use zccache_download_client::{ArchiveFormat, DownloadClient, FetchRequest, WaitMode};
let client = DownloadClient::new(None);
client.start_daemon()?;
let mut request = FetchRequest::new(
"https://example.com/toolchain.tar.zst",
PathBuf::from(".cache/downloads/toolchain.tar.zst"),
);
request.destination_path_expanded = Some(PathBuf::from(".cache/toolchain"));
request.archive_format = ArchiveFormat::TarZst;
request.multipart_parts = Some(8);
request.wait_mode = WaitMode::Block;
let result = client.fetch(request)?;
println!("{:?} {} {}", result.status, result.sha256, result.cache_path.display());
For handle-based control, use DownloadClient::download(...):
use std::path::Path;
use zccache_download::DownloadOptions;
use zccache_download_client::DownloadClient;
let client = DownloadClient::new(None);
let mut handle = client.download(
"https://example.com/toolchain.tar.zst",
Path::new(".cache/downloads/toolchain.tar.zst"),
DownloadOptions {
force: false,
max_connections: Some(8),
min_segment_size: None,
},
)?;
let status = handle.wait(Some(1_000))?;
println!("{:?} {}", status.phase, status.downloaded_bytes);
The Rust downloader surface includes:
- `DownloadClient::start_daemon()`, `stop_daemon()`, and `daemon_status()`
- `DownloadClient::fetch()` and `exists()` with `FetchRequest`
- `DownloadClient::download()` returning a `DownloadHandle`
- `ArchiveFormat`, `FetchResult`, `FetchState`, `FetchStatus`, and `WaitMode`
- `DownloadOptions`, `DownloadStatus`, and `DownloadDaemonStatus`
License
Licensed under either of Apache License, Version 2.0 or MIT license at your option.