
zccache

A high-performance local compiler cache daemon

Supported platforms: Linux, macOS, Windows

Supported tools: clang, clang++, clang-tidy, IWYU, rustc, clippy, rustfmt, emcc, em++, wasm-ld

A blazing fast compiler cache for C/C++ and Rust

Inspired by sccache, but optimized for local-first use with aggressive file metadata caching and filesystem watching.

Performance

50 files per benchmark, median of 5 trials. Run it yourself: ./perf

Cache Hit (warm cache)

| Benchmark | Bare Compiler | sccache | zccache | vs sccache | vs bare |
|---|---|---|---|---|---|
| C++ single-file | 11.705s | 1.576s | 0.050s | 32x | 236x |
| C++ multi-file | 11.553s | 11.530s | 0.017s | 695x | 696x |
| C++ response-file (single) | 12.540s | 1.558s | 0.047s | 33x | 267x |
| C++ response-file (multi) | 12.049s | 12.434s | 0.019s | 669x | 648x |
| Rust build | 6.592s | 8.604s | 0.045s | 193x | 148x |
| Rust check | 3.716s | 5.922s | 0.049s | 121x | 76x |

Cache Miss (cold compile)

| Benchmark | Bare Compiler | sccache | zccache | vs sccache | vs bare |
|---|---|---|---|---|---|
| C++ single-file | 12.641s | 20.632s | 13.430s | 1.5x | 0.9x |
| C++ multi-file | 11.358s | 11.759s | 12.867s | 0.9x | 0.9x |
| C++ response-file (single) | 12.063s | 20.607s | 14.087s | 1.5x | 0.9x |
| C++ response-file (multi) | 13.030s | 25.303s | 13.975s | 1.8x | 0.9x |
| Rust build | 7.119s | 10.023s | 8.507s | 1.2x | 0.8x |
| Rust check | 4.289s | 7.056s | 5.060s | 1.4x | 0.8x |
Benchmark details
  • Single-file = 50 sequential clang++ -c unit.cpp invocations
  • Multi-file = one clang++ -c *.cpp invocation (sccache cannot cache these — its "warm" time is a full recompile)
  • Response-file = args via nested .rsp files: 200 -D defines + 50 -I paths + 30 warning flags (~283 expanded args)
  • Rust build = --emit=dep-info,metadata,link (cargo build)
  • Rust check = --emit=dep-info,metadata (cargo check)
  • Cold = first compile (empty cache). Warm = median of 5 subsequent runs.
  • sccache gets cache hits but each hit still costs ~170ms subprocess overhead. zccache serves hits in ~1ms via in-process IPC.

Why is zccache so much faster on warm hits?

The difference comes from architecture, not better caching:

| | sccache | zccache |
|---|---|---|
| IPC model | Subprocess per invocation (fork + exec + connect) | Persistent daemon, single IPC message per compile |
| Cache lookup | Client hashes inputs, sends to server, server checks disk | Daemon has inputs in memory (file watcher + metadata cache) |
| On hit | Server reads artifact from disk, sends back via IPC | Daemon hardlinks cached file to output path (1 syscall) |
| Multi-file | Compiles every file (no multi-file cache support) | Parallel per-file cache lookups; only misses go to the compiler |
| Per-hit cost | ~170ms (process spawn + hash + disk I/O + IPC) | ~1ms (in-memory lookup + hardlink) |

Architecture enhancements that make the difference:

  • Filesystem watcher — a background notify watcher tracks file changes in real time, so the daemon already knows whether inputs are dirty before you even invoke a compile. No redundant stat/hash work on hit.
  • In-memory metadata cache — file sizes, mtimes, and content hashes live in a lock-free DashMap. Cache key computation is a memory lookup, not disk I/O.
  • Single-roundtrip IPC — each compile is one length-prefixed bincode message over a Unix socket (or named pipe on Windows). No subprocess spawning, no repeated handshakes.
  • Hardlink delivery — cache hits are served by hardlinking the cached artifact to the output path — a single syscall instead of reading + writing the file contents.
  • Multi-file fast path — when a build system passes N source files in one invocation, zccache checks all N against the cache in parallel, serves hits immediately, and batches only the misses into a single compiler process.
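The hardlink delivery step can be sketched in plain shell. The paths below are made up for illustration; the daemon performs the equivalent link(2) call internally:

```shell
# Simulate serving a cache hit by hardlinking a stored artifact to the
# requested output path (illustrative paths, not zccache's real layout).
demo=$(mktemp -d)
mkdir -p "$demo/artifacts" "$demo/build"
printf 'object code' > "$demo/artifacts/abc123.o"   # previously cached artifact

# One link(2) syscall delivers the hit; no file contents are copied:
ln "$demo/artifacts/abc123.o" "$demo/build/foo.o"

# Both paths now name the same inode:
[ "$demo/build/foo.o" -ef "$demo/artifacts/abc123.o" ] && echo "same inode"
```

Because the link shares the inode, hit latency is independent of artifact size, which is what makes the ~1ms per-hit figure possible.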

Broader tool coverage — zccache supports modes that other compiler caches don't:

| Mode | Description |
|---|---|
| Multi-file compilation | clang++ -c a.cpp b.cpp c.cpp — per-file caching with parallel lookups |
| Response files | Nested .rsp files with hundreds of flags — fully expanded and cached |
| clang-tidy | Static analysis results cached and replayed |
| include-what-you-use | IWYU output cached per translation unit |
| Emscripten (emcc/em++) | WebAssembly compilation cached end-to-end |
| wasm-ld | WebAssembly linking cached |
| rustfmt | Formatting results cached |
| clippy | Lint results cached |
| Rust check & build | cargo check and cargo build with extern crate content hashing |

Install

pip install zccache

This installs native Rust binaries (zccache and zccache-daemon) directly onto your PATH — no Python runtime dependency. Pre-built wheels are available for:

| Platform | Architectures |
|---|---|
| Linux | x86_64, aarch64 |
| macOS | x86_64, Apple Silicon |
| Windows | x86_64 |

Verify the install:

zccache --version

Use it as a drop-in replacement for sccache — just substitute zccache:

Rust / Cargo integration

# cargo build (cached)
RUSTC_WRAPPER=zccache cargo build

# cargo check (cached)
RUSTC_WRAPPER=zccache cargo check

Add to .cargo/config.toml for automatic use:

[build]
rustc-wrapper = "zccache"

Supports --emit=metadata (cargo check), --emit=dep-info,metadata,link (cargo build), extern crate content hashing (dependency changes cause cache misses), and all cacheable crate types (lib, rlib, staticlib). Proc-macro and binary crates are passed through without caching (same as sccache).

C/C++ build system integration (ninja, meson, cmake, make)

zccache is a drop-in compiler wrapper. Point your build system's compiler at zccache <real-compiler> and it handles the rest:

# meson native file
[binaries]
c = ['zccache', '/usr/bin/clang']
cpp = ['zccache', '/usr/bin/clang++']
# CMake
set(CMAKE_C_COMPILER_LAUNCHER zccache)
set(CMAKE_CXX_COMPILER_LAUNCHER zccache)

The first build (cold cache) runs at near-bare speed. Subsequent rebuilds (ninja -t clean && ninja, or touching source files) serve cached artifacts via hardlinks in under a second.
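For plain make (the fourth build system named above), the same substitution works through the standard CC/CXX variables. A minimal sketch; make's -n flag prints the expanded command without running it, so no real compiler is needed here:

```shell
# Write a one-rule Makefile, then override CC at invocation time so the
# compile line is routed through the zccache wrapper.
mk=$(mktemp)
printf 'all:\n\t$(CC) -c main.c\n' > "$mk"
make -n -f "$mk" CC='zccache clang'   # prints: zccache clang -c main.c
```

In a real build you would set `CC := zccache clang` (and `CXX := zccache clang++`) in the Makefile itself rather than overriding per invocation.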

Single-roundtrip IPC: In drop-in mode, zccache sends a single CompileEphemeral message that combines session creation, compilation, and session teardown — eliminating 2 of 3 IPC roundtrips per invocation.

Session stats: Track hit rates per-build with --stats:

eval $(zccache session-start --stats --log build.log)
export ZCCACHE_SESSION_ID=...
# ... build runs ...
zccache session-stats $ZCCACHE_SESSION_ID   # query mid-build
zccache session-end $ZCCACHE_SESSION_ID     # final stats

Persistent cache: Artifacts are stored in ~/.zccache/artifacts/ and survive daemon restarts. No need to re-warm the cache after a reboot.

Compile journal (build replay): Every compile and link command is recorded to ~/.zccache/logs/compile_journal.jsonl as a JSONL file with enough detail to replay the entire build:

{"ts":"2026-03-17T10:30:00.123Z","outcome":"hit","compiler":"/usr/bin/clang++","args":["-c","foo.cpp","-o","foo.o"],"cwd":"/project/build","env":[["CC","clang"]],"exit_code":0,"session_id":"uuid","latency_ns":1234567}

Fields: ts (ISO 8601 UTC), outcome (hit/miss/error/link_hit/link_miss), compiler (full path), args (full argument list), cwd, env (omitted when inheriting daemon env), exit_code, session_id (null for ephemeral), latency_ns (wall-clock nanoseconds). One JSON object per line — pipe through jq to filter, or replay builds by extracting compiler + args + cwd.
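For instance, a per-outcome tally can be computed with standard shell tools even without jq. The journal below is synthetic (abbreviated entries for the demo); real entries carry the full schema above:

```shell
# Tally hit/miss outcomes from a compile journal. A synthetic journal is
# written first so the pipeline can run standalone.
journal=$(mktemp)
cat > "$journal" <<'EOF'
{"ts":"2026-03-17T10:30:00.123Z","outcome":"hit","exit_code":0}
{"ts":"2026-03-17T10:30:01.456Z","outcome":"miss","exit_code":0}
{"ts":"2026-03-17T10:30:02.789Z","outcome":"hit","exit_code":0}
EOF

# One JSON object per line, so grep/sort/uniq suffice for a count:
grep -o '"outcome":"[a-z_]*"' "$journal" | sort | uniq -c
```

The same pattern extends to link_hit/link_miss outcomes, or to extracting compiler + args + cwd for replay.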

Per-session compile journal: Pass --journal <path> to session-start to write a dedicated JSONL log containing only the commands from that session. The path must end in .jsonl:

result=$(zccache session-start --journal build.jsonl)
session_id=$(echo "$result" | jq -r .session_id)
export ZCCACHE_SESSION_ID=$session_id

# ... build runs ...

# Inspect this session's commands only (no noise from other sessions)
jq . build.jsonl

zccache session-end $session_id

The session journal uses the same JSONL schema as the global journal. Entries are written to both the global and session journals simultaneously. The session file handle is released when session-end is called.

Multi-file compilation (fast path)

When a build system passes multiple source files to a single compiler invocation (e.g. clang++ -c a.cpp b.cpp c.cpp), zccache treats this as a fast path:

  1. Each source file is checked against the cache in parallel.
  2. Cache hits are served immediately — their .o files are written from the cache.
  3. Remaining cache misses are batched into a single compiler process, preserving the compiler's own process-reuse and memory-sharing benefits.
  4. The outputs of the batched compilation are cached individually for future hits.

This hybrid approach means the first build populates the cache per-file, and subsequent builds serve as many files as possible from cache while still letting the compiler handle misses efficiently in bulk.
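The four steps above can be modeled in shell. Everything here is a simplified stand-in: the cache is a directory keyed by content hash, and writing a placeholder file plays the role of the real compiler (zccache's actual keys cover the full compile inputs, not just file contents):

```shell
# Model of the multi-file fast path: serve per-file hits from a cache
# directory, then hand only the misses to the "compiler".
work=$(mktemp -d); cache="$work/cache"; mkdir -p "$cache"
printf 'int a;' > "$work/a.cpp"
printf 'int b;' > "$work/b.cpp"

# Pretend a.cpp was compiled before: its artifact is already cached.
key_a=$(sha256sum "$work/a.cpp" | cut -d' ' -f1)
printf 'obj-a' > "$cache/$key_a.o"

misses=""
for src in "$work"/a.cpp "$work"/b.cpp; do
  key=$(sha256sum "$src" | cut -d' ' -f1)
  if [ -f "$cache/$key.o" ]; then
    ln "$cache/$key.o" "${src%.cpp}.o"   # step 2: serve hit via hardlink
  else
    misses="$misses $src"                # step 1: record miss for batching
  fi
done

# Step 3: "compile" the misses (a printf stands in for one clang++ batch),
# then step 4: cache each new artifact for future hits.
for src in $misses; do
  printf 'obj' > "${src%.cpp}.o"
  cp "${src%.cpp}.o" "$cache/$(sha256sum "$src" | cut -d' ' -f1).o"
done
ls "$work"/*.o
```

In the real daemon the miss list is passed to a single compiler process and the lookups in the first loop run in parallel; the loop here just keeps the sketch linear.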

Recommendation: Configure your build system to pass multiple source files per compiler invocation whenever possible. This gives zccache the best opportunity to parallelize cache lookups and minimize compiler launches.

Concurrency

The daemon uses lock-free concurrent data structures (DashMap) for artifact and metadata lookups, so parallel compilation requests from multiple build workers never serialize on a global lock.

Status

Early development — architecture and scaffolding phase.

Goals

  • Extremely fast on local machines (daemon keeps caches warm)
  • Portable across Linux, macOS, and Windows
  • Correct under heavy parallel compilation (no stale cache hits)
  • Simple deployment (single binary)

Tool Compatibility

zccache works as a drop-in wrapper for these compilers and tools:

  • C/C++: clang, clang++, clang-tidy, include-what-you-use (IWYU)
  • Rust: rustc (via RUSTC_WRAPPER), clippy, rustfmt
  • WebAssembly / Emscripten: emcc, em++, wasm-ld

Architecture

See docs/ARCHITECTURE.md for the full system design.

Key components

| Crate | Purpose |
|---|---|
| zccache-cli | Command-line interface (zccache binary) |
| zccache-daemon | Daemon process (IPC server, orchestration) |
| zccache-core | Shared types, errors, config, path utilities |
| zccache-protocol | IPC message types and serialization |
| zccache-ipc | Transport layer (Unix sockets / named pipes) |
| zccache-hash | blake3 hashing and cache key computation |
| zccache-fscache | In-memory file metadata cache |
| zccache-artifact | Disk-backed artifact store with redb index |
| zccache-watcher | File watcher abstraction (notify backend) |
| zccache-compiler | Compiler detection and argument parsing |
| zccache-test-support | Test utilities and fixtures |

Building

cargo build --workspace

Testing

cargo test --workspace


License

Licensed under either of Apache License, Version 2.0 or MIT license at your option.



Download files

Source Distributions

No source distribution files available for this release.

Built Distributions

| File | Size | Platform |
|---|---|---|
| zccache-1.1.18-py3-none-win_arm64.whl | 4.2 MB | Windows ARM64 |
| zccache-1.1.18-py3-none-win_amd64.whl | 4.6 MB | Windows x86-64 |
| zccache-1.1.18-py3-none-musllinux_1_2_x86_64.whl | 5.4 MB | musllinux: musl 1.2+ x86-64 |
| zccache-1.1.18-py3-none-musllinux_1_2_aarch64.whl | 5.0 MB | musllinux: musl 1.2+ ARM64 |
| zccache-1.1.18-py3-none-manylinux_2_17_x86_64.whl | 5.3 MB | manylinux: glibc 2.17+ x86-64 |
| zccache-1.1.18-py3-none-manylinux_2_17_aarch64.whl | 4.9 MB | manylinux: glibc 2.17+ ARM64 |
| zccache-1.1.18-py3-none-macosx_11_0_arm64.whl | 4.5 MB | macOS 11.0+ ARM64 |
| zccache-1.1.18-py3-none-macosx_10_12_x86_64.whl | 4.7 MB | macOS 10.12+ x86-64 |

File hashes

zccache-1.1.18-py3-none-win_arm64.whl
  SHA256       bd59c03f64d8322c1ef0d8f7aae2abfcca000ce430203230d25e44ad78fc1d37
  MD5          8e9ca4a645731f5b9c608ca3b5551dc8
  BLAKE2b-256  aaaecffcf4fd426d8f9546e3b23317308e1f560e4fae36361fd190a2a8493324

zccache-1.1.18-py3-none-win_amd64.whl
  SHA256       11c7d8d7868a29c51a00aa7c8c98d42c455c5b5c485dfcc19810a93fbb056639
  MD5          35cec31c3736bffd8bf27b2d9dbb1e53
  BLAKE2b-256  d6e2ede2a01e1b3c0ae2a086ef48b6c69c70ca4e3893f8539d1017778667925a

zccache-1.1.18-py3-none-musllinux_1_2_x86_64.whl
  SHA256       8104430e7bef7569864d940cf812de46771f890e7c71bbbe12ae9d0e73ae64fb
  MD5          5ffb9f7c359e559c349edac8f64e3e34
  BLAKE2b-256  5118d72540242c9f4f5bac822c34d1c11a9f50486401e6e671d02acd333fa552

zccache-1.1.18-py3-none-musllinux_1_2_aarch64.whl
  SHA256       f471affc220d3dfc1b2d49503eefa55c9c16304b79c4a6a18f29b98d60d5a662
  MD5          a077b9a615fd6b92613513dc91d1c9df
  BLAKE2b-256  4d2f831461b3340379dc3ec059ae17d25afecb22e1a4bb768caa5fa6fdfe33d1

zccache-1.1.18-py3-none-manylinux_2_17_x86_64.whl
  SHA256       c20b600bfa48ebe02a4fdac4697e9c4dc4790ba8fc6d28b9163152bfe6438408
  MD5          ca7f848ae4e4edbd717a60ea90063aff
  BLAKE2b-256  b2132de9d72ba1800bed92a58bdde831bcf6b3d5cb41643f453dcf70ead92a09

zccache-1.1.18-py3-none-manylinux_2_17_aarch64.whl
  SHA256       f10e3b716b6c0d5a514771e79fd996609f2c30a9bf4601b984d599fb893032bc
  MD5          1952c17f40b7a1b9d10b5062776bed91
  BLAKE2b-256  bd62f07adb8fde2e0b2d5aac9dd0336081d7f997b640bdc5f50942ede3960e9f

zccache-1.1.18-py3-none-macosx_11_0_arm64.whl
  SHA256       79204289022935dd4d823205ed5b2df24347487022f5be4e380eca3aea21ba10
  MD5          67c88b18c05cd0659777de5f61f4e9c5
  BLAKE2b-256  8d3186f3bcb6d16b6311b18c0a83162d669b82bd9b18519af6aa76b25a5c9503

zccache-1.1.18-py3-none-macosx_10_12_x86_64.whl
  SHA256       cb85dfdf0939506c598fa7546d408fcb8113970d31ef3d8230e6a1c8d660f619
  MD5          4cc8ad23f29bbe6e64bc408bc73ac4ca
  BLAKE2b-256  7d166f5fd2095cec1790eae8b2efd466de4cb9279c4c5372e93681d8419f9cdc
