# uv-bundler

Universal cross-platform Python packager powered by uv.

Build deployment-ready Python artifacts for Linux — from any machine. No Docker, no Linux build server.

Data pipelines (Spark, Flink, Lambda) run on Linux. Your laptop runs macOS. Getting the right platform wheels into a deployment artifact has always required a dedicated Linux build environment — until now.
## The problem

- **You need Linux wheels, but you're on macOS.** A plain `pip install` fetches binaries for your host machine, not your target.
- **Switching to ARM (AWS Graviton, Apple Silicon) is painful.** Build pipelines are hard-coded to x86_64 and fail silently on aarch64.
- **ZIP hacks break in production.** Native extensions (NumPy, Pandas) crash with `ImportError` when bundled without the correct platform wheels.
## The solution

uv-bundler uses **Ghost Resolution** — `uv pip compile` + `uv pip install --python-platform` — to resolve and download the correct Linux wheels on any host OS. No Docker, no cross-compilation, no remote build environment.

One command produces a self-contained artifact that runs on Linux x86_64 or aarch64:

```bash
uv-bundler --target spark-prod
# → dist/my-spark-job-linux-x86_64.jar
```

Run it on Linux with zero pre-installed packages:

```bash
python my-spark-job-linux-x86_64.jar
# Running single Spark job with pyspark 3.5.0
```

When your runtime environment already has dependencies installed (EMR clusters, Lambda layers, Airflow workers), use slim mode to bundle source code only:

```bash
uv-bundler --target spark-prod --slim
# → dist/my-spark-job-linux-x86_64-slim.jar
```
## Quick start

1. Add a target to `pyproject.toml`:

   ```toml
   [tool.uv-bundler]
   project_name = "my-spark-job"
   default_target = "spark-prod"

   [tool.uv-bundler.targets.spark-prod]
   format = "jar"
   entry_point = "app.main:run"
   platform = "linux"
   arch = "x86_64"
   python_version = "3.10"
   manylinux = "2014"
   ```

2. Build from any machine:

   ```bash
   # Verify config without building
   uv-bundler --dry-run

   # Build the artifact
   uv-bundler --target spark-prod
   ```

3. Validate it works — no packages needed inside the container:

   ```bash
   docker run --rm \
     -v "$(pwd)/dist:/artifacts" \
     python:3.10-slim \
     python /artifacts/my-spark-job-linux-x86_64.jar
   # Running single Spark job with pyspark 3.5.0
   ```
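Before shipping, a quick stdlib-only sanity check can confirm the artifact is a well-formed zipapp. `looks_like_zipapp` is a hypothetical helper written for illustration, not part of uv-bundler:

```python
# Sanity check: a runnable zip artifact must contain a top-level
# __main__.py bootstrap that Python executes via `python app.jar`.
import zipfile


def looks_like_zipapp(path: str) -> bool:
    with zipfile.ZipFile(path) as zf:
        return "__main__.py" in zf.namelist()
```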
## Cross-arch builds

Build for a different architecture without any additional tooling:

```bash
# Build for ARM from any host (macOS, Linux x86_64, ...)
uv-bundler --target spark-prod --arch aarch64
# → dist/my-spark-job-linux-aarch64.jar
```

Ghost Resolution fetches manylinux2014_aarch64 wheels and bundles them directly — the host arch is irrelevant.
## How it works

uv-bundler follows a 5-step lifecycle:

1. **Context hydration** — reads `pyproject.toml`, merges CLI overrides, validates the entry-point module path.
2. **Ghost Resolution** — runs `uv pip compile --python-platform <target>` to pin platform-specific wheel hashes for the target OS, not the build host. *(skipped in slim mode)*
3. **Staging** — runs `uv pip install --target` to extract wheels into a staging directory alongside your source code. *(skipped in slim mode)*
4. **Bootstrap generation** — generates a `__main__.py` that correctly loads `site-packages` at runtime, including when executed directly from inside a zip archive (zipapp mode).
5. **Assembly** — bundles everything into the requested format. In slim mode, only source code is bundled — no `site-packages/`.
## Artifact formats

### JAR — Spark / Flink

- Self-contained zipapp, runnable with `python app.jar`
- Includes `META-INF/MANIFEST.MF` for JVM tooling compatibility
- Dependencies bundled under `site-packages/` inside the archive
- Build fails if `.dylib` or `.dll` binaries are detected (Linux targets only)
- Slim: source only — use when pyspark/deps are pre-installed on the cluster
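The `.dylib`/`.dll` guard can be sketched as a simple archive scan. `find_foreign_binaries` is a hypothetical helper written for illustration, not the project's actual implementation:

```python
# Hypothetical check for non-Linux native binaries in a built archive.
import zipfile

FORBIDDEN_SUFFIXES = (".dylib", ".dll")  # macOS / Windows native extensions


def find_foreign_binaries(archive_path: str) -> list:
    """Return archive members that would break a Linux-only target."""
    with zipfile.ZipFile(archive_path) as zf:
        return [name for name in zf.namelist()
                if name.lower().endswith(FORBIDDEN_SUFFIXES)]
```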
### ZIP — AWS Lambda / general deployment

- Same internal layout as JAR (bootstrap + `site-packages/` + sources)
- Suited for Lambda Layers and generic zip-based deployments
- Slim: source only — use when a Lambda Layer already provides dependencies

### PEX — Airflow / schedulers

- Single-file executable Python environment via the `pex` CLI
- Uses a platform tag (e.g. `manylinux2014_x86_64-cp-310-cp310`) for cross-platform resolution
- First build downloads packages from PyPI; subsequent builds use the cache
- Cache `$PEX_ROOT` (default `~/.pex`) in CI to avoid repeated downloads
- Slim: source only — no `-r requirements` passed to pex; use when the runtime environment provides deps
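The platform tag is mechanical to derive from the target settings. `pex_platform_tag` below is a hypothetical helper mirroring the tag format shown above, not uv-bundler's actual code:

```python
# Hypothetical: compose a pex-style platform tag from target settings.
def pex_platform_tag(manylinux: str, arch: str, python_version: str) -> str:
    py = python_version.replace(".", "")  # "3.10" -> "310"
    return f"manylinux{manylinux}_{arch}-cp-{py}-cp{py}"
```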
## Configuration reference

```toml
[tool.uv-bundler]
project_name = "analytics-engine"  # Used in artifact filename
default_target = "spark-prod"      # Target used when --target is omitted
output_base = "./dist"             # Output directory (override via OUT_DIR env var)

[tool.uv-bundler.targets.spark-prod]
# Core
format = "jar"                # "jar" | "pex" | "zip"
entry_point = "app.main:run"  # module.submodule:function

# Resolution (cross-platform)
platform = "linux"            # "linux" | "macos" | "windows"
arch = "x86_64"               # "x86_64" | "aarch64"
python_version = "3.10"
manylinux = "2014"            # "2010" | "2014" | numeric (e.g. "31")

# Packaging
compression = "deflated"      # "stored" | "deflated"

# Slim mode — bundle source only, no dependencies
slim = false                  # true = skip resolve+install, source only
slim_exclude = ["tests/**", "*.pyi"]  # additional patterns excluded from source in slim mode

# Advanced
exclude = ["tests/*", "**/__pycache__", "*.pyc", ".git*"]
extra_files = { "config/prod.yaml" = "resources/config.yaml" }
```

The `OUT_DIR` environment variable overrides `output_base` and takes precedence over the TOML value.
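As an illustration of how the `exclude` globs could be applied, here is a rough stdlib sketch; `is_excluded` is invented for this example, and uv-bundler's real matcher may treat `**` differently:

```python
# Rough sketch of glob-based exclusion using stdlib fnmatch.
from fnmatch import fnmatch

EXCLUDE = ["tests/*", "**/__pycache__", "*.pyc", ".git*"]


def is_excluded(relpath: str, patterns=EXCLUDE) -> bool:
    # fnmatch's "*" also crosses "/" separators, which is close enough
    # for illustration; a real implementation might use pathlib.match
    # or a dedicated globbing library instead.
    return any(fnmatch(relpath, pattern) for pattern in patterns)
```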
## CLI reference

```text
uv-bundler [OPTIONS]

Options:
  --target TEXT          Target name from [tool.uv-bundler.targets.*]
  --platform TEXT        Override platform (linux | macos | windows)
  --arch TEXT            Override arch (x86_64 | aarch64)
  --python-version TEXT  Override Python version (e.g. 3.10)
  --manylinux TEXT       Override manylinux tag (e.g. 2014)
  --slim                 Bundle source only — skip dependency resolution and installation
  --dry-run              Print resolved config without building
```

`--slim` overrides `slim = false` in config. Slim artifacts are named with a `-slim` infix: `my-job-linux-x86_64-slim.jar`.
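The naming rule can be expressed as a small helper. `artifact_name` is hypothetical, mirroring the filenames used throughout this README rather than the project's actual code:

```python
# Hypothetical: derive the artifact filename from target settings.
def artifact_name(project: str, platform: str, arch: str,
                  fmt: str, slim: bool = False) -> str:
    suffix = "-slim" if slim else ""
    return f"{project}-{platform}-{arch}{suffix}.{fmt}"
```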
## Installation

```bash
pip install uv-bundler
# or
uv tool install uv-bundler  # installs uv-bundler as an isolated global CLI
```
## Installation (development)

```bash
git clone https://github.com/amarlearning/uv-bundler.git
cd uv-bundler

# Install uv — https://docs.astral.sh/uv/
brew install uv  # macOS

make setup  # creates .venv and installs all dev deps
make fmt    # format
make lint   # ruff + mypy
make test   # unit + integration tests
```
## Contributing

- Branch: `feature/<desc>` or `fix/<desc>`
- Commit: imperative mood, ≤ 50 chars summary
- Quality gate: `make fmt && make lint && make type && make test`
- PR: include rationale; update docs if behavior changes
## License
## Download files
### File details: uv_bundler-1.1.0.tar.gz

File metadata:

- Size: 112.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `0a689978aced6093d865fe53747475d9102379b3dcecb09688b5ab2c850e12dd` |
| MD5 | `62b0195ead3da0419ed18160e73c5749` |
| BLAKE2b-256 | `0eb3bcec776d56cb22aa4f16a12d2063b341e6722e89178c1698398028e21f38` |
#### Provenance

The following attestation bundle was made for uv_bundler-1.1.0.tar.gz:

- Publisher: `release.yml` on amarlearning/uv-bundler
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: uv_bundler-1.1.0.tar.gz
- Subject digest: `0a689978aced6093d865fe53747475d9102379b3dcecb09688b5ab2c850e12dd`
- Sigstore transparency entry: 1108242104
- Permalink: amarlearning/uv-bundler@be7cbcf51bf022248c73fca620fa82e08f690bc6
- Branch / Tag: refs/tags/v1.1.0
- Owner: https://github.com/amarlearning
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@be7cbcf51bf022248c73fca620fa82e08f690bc6
- Trigger Event: push
### File details: uv_bundler-1.1.0-py3-none-any.whl

File metadata:

- Size: 20.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `992f787cadaaa7bf8abb1c27db04f8121d94948973e03da58c20a10cec40e5c9` |
| MD5 | `9bc05b66adc72455dc263b4644fc4fef` |
| BLAKE2b-256 | `2405c952d991369bdd1e0037f9576fae441a33c958e838a565260c78e695fa91` |
#### Provenance

The following attestation bundle was made for uv_bundler-1.1.0-py3-none-any.whl:

- Publisher: `release.yml` on amarlearning/uv-bundler
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: uv_bundler-1.1.0-py3-none-any.whl
- Subject digest: `992f787cadaaa7bf8abb1c27db04f8121d94948973e03da58c20a10cec40e5c9`
- Sigstore transparency entry: 1108242163
- Permalink: amarlearning/uv-bundler@be7cbcf51bf022248c73fca620fa82e08f690bc6
- Branch / Tag: refs/tags/v1.1.0
- Owner: https://github.com/amarlearning
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@be7cbcf51bf022248c73fca620fa82e08f690bc6
- Trigger Event: push