Universal cross-platform Python packager powered by uv

uv-bundler

Build deployment-ready Python artifacts for Linux — from any machine. No Docker, no Linux build server.

Data pipelines (Spark, Flink, Lambda) run on Linux. Your laptop runs macOS. Getting the right platform wheels into a deployment artifact has always required a dedicated Linux build environment — until now.


The problem

  • You need Linux wheels, but you're on macOS. A plain pip install fetches the wrong binaries for your host machine, not your target.
  • Switching to ARM (AWS Graviton, Apple Silicon) is painful. Build pipelines are hard-coded to x86_64 and fail silently on aarch64.
  • ZIP hacks break in production. Native extensions (NumPy, Pandas) crash with ImportError when bundled without the correct platform wheels.

The solution

uv-bundler uses Ghost Resolution — uv pip compile + uv pip install --python-platform — to resolve and download the correct Linux wheels on any host OS. No Docker, no cross-compilation, no remote build environment.

One command produces a self-contained artifact that runs on Linux x86_64 or aarch64:

uv-bundler --target spark-prod
# → dist/my-spark-job-linux-x86_64.jar

Run it on Linux with zero pre-installed packages:

python my-spark-job-linux-x86_64.jar
# Running single Spark job with pyspark 3.5.0

Quick start

1. Add a target to pyproject.toml:

[tool.uv-bundler]
project_name = "my-spark-job"
default_target = "spark-prod"

[tool.uv-bundler.targets.spark-prod]
format = "jar"
entry_point = "app.main:run"
platform = "linux"
arch = "x86_64"
python_version = "3.10"
manylinux = "2014"

2. Build from any machine:

# Verify config without building
uv-bundler --dry-run

# Build the artifact
uv-bundler --target spark-prod

3. Validate it works — no packages needed inside the container:

docker run --rm \
  -v "$(pwd)/dist:/artifacts" \
  python:3.10-slim \
  python /artifacts/my-spark-job-linux-x86_64.jar
# Running single Spark job with pyspark 3.5.0
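The entry_point in the config above ("app.main:run") is a module path plus a function name. As an illustration only — this module is hypothetical, not part of uv-bundler — a minimal app/main.py satisfying that entry point could look like:

```python
# app/main.py — hypothetical module matching entry_point = "app.main:run".
# The real project's example additionally reports the pyspark version;
# this sketch keeps the dependency out to stay self-contained.

def run():
    """Function invoked by the generated bootstrap when the artifact runs."""
    print("Running single Spark job")

if __name__ == "__main__":
    run()
```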

Cross-arch builds

Build for a different architecture without any additional tooling:

# Build for ARM from any host (macOS, Linux x86_64, ...)
uv-bundler --target spark-prod --arch aarch64
# → dist/my-spark-job-linux-aarch64.jar

Ghost Resolution fetches manylinux2014_aarch64 wheels and bundles them directly — the host arch is irrelevant.


How it works

uv-bundler follows a 5-step lifecycle:

  1. Context hydration — reads pyproject.toml, merges CLI overrides, validates the entry point module path.
  2. Ghost Resolution — runs uv pip compile --python-platform <target> to pin platform-specific wheel hashes for the target OS, not the build host.
  3. Staging — runs uv pip install --target to extract wheels into a staging directory alongside your source code.
  4. Bootstrap generation — generates a __main__.py that correctly loads site-packages at runtime, including when executed directly from inside a zip archive (zipapp mode).
  5. Assembly — bundles everything into the requested format.
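The generated bootstrap itself is not shown in these docs; as a rough sketch of step 4 (assuming dependencies sit under site-packages/ next to __main__.py, as described under "Artifact formats"), the idea is:

```python
# __main__.py — illustrative bootstrap sketch, NOT uv-bundler's actual output.
import os
import sys

# Directory, or zip archive, containing this file. When run as
# "python app.jar", __file__ is "<archive>/__main__.py", so dirname() is the
# archive itself, and zipimport can load pure-Python packages from inside it.
HERE = os.path.dirname(os.path.abspath(__file__))

# Prepend bundled dependencies so they shadow anything on the host system.
sys.path.insert(0, os.path.join(HERE, "site-packages"))

def main():
    # entry_point = "app.main:run" resolves to this import-and-call.
    from app.main import run  # hypothetical user module
    run()

# The generated file would end by calling main().
```

This is why the artifact runs with zero pre-installed packages: everything importable lives inside the archive, and sys.path is adjusted before the entry point is imported.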

Artifact formats

JAR — Spark / Flink

  • Self-contained zipapp, runnable with python app.jar
  • Includes META-INF/MANIFEST.MF for JVM tooling compatibility
  • Dependencies bundled under site-packages/ inside the archive
  • Build fails if .dylib or .dll binaries are detected (Linux targets only)
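The .dylib/.dll guard in the last bullet can be pictured as a simple suffix scan over the staging directory. A minimal sketch (function name and error message are illustrative, not uv-bundler's actual code):

```python
# Illustrative check: fail a Linux-target build if any macOS (.dylib) or
# Windows (.dll) native binaries slipped into the staged dependencies.
from pathlib import Path

FORBIDDEN_SUFFIXES = {".dylib", ".dll"}

def find_foreign_binaries(staging_dir: str) -> list[Path]:
    """Return any non-Linux native binaries found under the staging directory."""
    root = Path(staging_dir)
    return [p for p in root.rglob("*") if p.suffix in FORBIDDEN_SUFFIXES]

# During assembly, a build could then abort like so:
# bad = find_foreign_binaries("build/staging")
# if bad:
#     raise RuntimeError(f"Non-Linux binaries detected in artifact: {bad}")
```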

ZIP — AWS Lambda / general deployment

  • Same internal layout as JAR (bootstrap + site-packages/ + sources)
  • Suited for Lambda Layers and generic zip-based deployments

PEX — Airflow / schedulers

  • Single-file executable Python environment via the pex CLI
  • Uses a platform tag (e.g. manylinux2014_x86_64-cp-310-cp310) for cross-platform resolution
  • First build downloads packages from PyPI; subsequent builds use the cache
  • Cache $PEX_ROOT (default ~/.pex) in CI to avoid repeated downloads
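The PEX platform tag from the second bullet can be assembled from the same target fields used in pyproject.toml. A hypothetical helper (not uv-bundler's API) showing the shape of that string:

```python
# Assemble a pex-style platform tag (PLATFORM-IMPL-PYVER-ABI) from the
# target fields in [tool.uv-bundler.targets.*]. Illustrative helper only.
def pex_platform_tag(manylinux: str, arch: str, python_version: str) -> str:
    short = python_version.replace(".", "")  # "3.10" -> "310"
    return f"manylinux{manylinux}_{arch}-cp-{short}-cp{short}"

print(pex_platform_tag("2014", "x86_64", "3.10"))
# → manylinux2014_x86_64-cp-310-cp310
```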

Configuration reference

[tool.uv-bundler]
project_name = "analytics-engine"    # Used in artifact filename
default_target = "spark-prod"         # Target used when --target is omitted
output_base = "./dist"                # Output directory (override via OUT_DIR env var)

[tool.uv-bundler.targets.spark-prod]
# Core
format = "jar"                        # "jar" | "pex" | "zip"
entry_point = "app.main:run"          # module.submodule:function

# Resolution (cross-platform)
platform = "linux"                    # "linux" | "macos" | "windows"
arch = "x86_64"                       # "x86_64" | "aarch64"
python_version = "3.10"
manylinux = "2014"                    # "2010" | "2014" | numeric (e.g. "31")

# Packaging
compression = "deflated"              # "stored" | "deflated"

# Advanced
exclude = ["tests/*", "**/__pycache__", "*.pyc", ".git*"]
extra_files = { "config/prod.yaml" = "resources/config.yaml" }

OUT_DIR environment variable overrides output_base and takes precedence over the TOML value.
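That precedence rule reduces to a one-liner; a sketch assuming "./dist" as the built-in default (function name is illustrative):

```python
# Documented precedence: OUT_DIR env var > output_base from TOML > "./dist".
import os
from typing import Optional

def resolve_output_dir(toml_output_base: Optional[str]) -> str:
    """Pick the output directory according to uv-bundler's documented precedence."""
    return os.environ.get("OUT_DIR") or toml_output_base or "./dist"
```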


CLI reference

uv-bundler [OPTIONS]

Options:
  --target TEXT           Target name from [tool.uv-bundler.targets.*]
  --platform TEXT         Override platform  (linux | macos | windows)
  --arch TEXT             Override arch      (x86_64 | aarch64)
  --python-version TEXT   Override Python version (e.g. 3.10)
  --manylinux TEXT        Override manylinux tag  (e.g. 2014)
  --dry-run               Print resolved config without building

Installation

pip install uv-bundler
# or
uv tool install uv-bundler   # installs uv-bundler as an isolated global CLI

Installation (development)

git clone https://github.com/amarlearning/uv-bundler.git
cd uv-bundler

# Install uv — https://docs.astral.sh/uv/
brew install uv   # macOS

make setup        # creates .venv and installs all dev deps

make fmt          # format
make lint         # ruff + mypy
make test         # unit + integration tests

Contributing

  • Branch: feature/<desc> or fix/<desc>
  • Commit: imperative mood, ≤ 50 chars summary
  • Quality gate: make fmt && make lint && make type && make test
  • PR: include rationale; update docs if behavior changes

License

Apache License 2.0

Download files

Download the file for your platform.

Source Distribution

uv_bundler-1.0.0.tar.gz (109.7 kB)

Built Distribution

uv_bundler-1.0.0-py3-none-any.whl (19.4 kB)

File details

Details for the file uv_bundler-1.0.0.tar.gz.

File metadata

  • Filename: uv_bundler-1.0.0.tar.gz
  • Size: 109.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

  • SHA256: 3b3c0403f56d7183ab7747229463043f904917fb7ea0144e2429a9b5d321c06a
  • MD5: b7ce9a1c3e68b0169b6d1e244975f34e
  • BLAKE2b-256: 398ade61a6f13799583764bb126f7b8e1fa751fc4c82204eae1d85235ba98d22

Provenance

The following attestation bundles were made for uv_bundler-1.0.0.tar.gz:

Publisher: release.yml on amarlearning/uv-bundler

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file uv_bundler-1.0.0-py3-none-any.whl.

File metadata

  • Filename: uv_bundler-1.0.0-py3-none-any.whl
  • Size: 19.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

  • SHA256: 3fdca10778389fbce156f9ecd44b6937b1824c3230d2db10846a174a0e60e9a6
  • MD5: b369f0ccec636e12c50d9a615bfcf437
  • BLAKE2b-256: 2612334057bdb4df570753cb4f971fe96b4669565a5f83afcc8319eb0b3fe58b

Provenance

The following attestation bundles were made for uv_bundler-1.0.0-py3-none-any.whl:

Publisher: release.yml on amarlearning/uv-bundler

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
