
zpickle


Transparent, drop-in compression for Python's pickle — smaller files, same API.

zpickle adds high-performance compression to your serialized Python objects using multiple state-of-the-art algorithms without changing how you work with pickle.

# Replace this:
import pickle

# With this:
import zpickle as pickle

# Everything else stays the same!

Features

  • Drop-in replacement for the standard pickle module
  • Transparent compression — everything happens automatically
  • Multiple algorithms — choose zstd, brotli, zlib, lzma, bzip2, or lz4 (powered by compress_utils)
  • Configure once, use everywhere — set global defaults for your entire app
  • Smaller payloads — serialized output is typically 2-10× smaller (depending on content and algorithm)
  • Backward compatible — automatically reads both compressed and regular pickle data
  • Complete API compatibility — all pickle functions work as expected
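The size savings behind the "2-10× smaller" figure can be illustrated with the standard library alone. This sketch pickles repetitive data and compresses it with zlib (one of zpickle's supported algorithms); the data and level are arbitrary choices for the demo, and zstd or brotli will often do better:

```python
import pickle
import zlib

# Repetitive structured data is the best case for compression
data = {"rows": [f"record-{i}" for i in range(1000)]}

raw = pickle.dumps(data)              # plain pickle bytes
compressed = zlib.compress(raw, 6)    # the kind of wrapping zpickle automates

print(f"pickle: {len(raw)} bytes, compressed: {len(compressed)} bytes")
```

On data with little redundancy (e.g. already-compressed blobs), the ratio approaches 1×, which is why the feature list hedges with "depending on content and algorithm".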

Installation

pip install zpickle

Quick Start

Basic Usage

import zpickle as pickle

# Serializing works exactly like pickle
data = {"complex": ["nested", {"data": "structure"}], "with": "lots of repetition"}
serialized = pickle.dumps(data)  # Automatically compressed!

# Deserializing works the same way
restored = pickle.loads(serialized)  # Automatically decompressed!

# File operations work too
with open("data.zpkl", "wb") as f:
    pickle.dump(data, f)

with open("data.zpkl", "rb") as f:
    restored = pickle.load(f)

Custom Configuration

import zpickle

# Configure global settings
zpickle.configure(algorithm='brotli', level=9)  # Higher compression

# Or configure for a single operation
data = [1, 2, 3] * 1000
compressed = zpickle.dumps(data, algorithm='zstd', level=6)

Performance

Compression ratios versus standard pickle (higher is better):

[Bar chart: compression ratios by algorithm versus pickle]

Serialization speed (MB/s, higher is better):

[Bar chart: serialization speeds by algorithm versus pickle]

Note: Performance varies by data characteristics. Run benchmarks on your specific data for accurate results.

To run your own benchmarks, you can use:

python -m benchmarks.benchmark

How It Works

zpickle applies compression with minimal overhead:

  1. Objects are first serialized using standard pickle
  2. The pickle data is compressed using the selected algorithm
  3. A small header (8 bytes) is added to identify the format and algorithm
  4. When deserializing, zpickle auto-detects the format and decompresses if needed
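The four steps above can be sketched with the standard library. Note that the magic bytes, header layout, and algorithm id below are illustrative placeholders, not zpickle's actual wire format:

```python
import pickle
import struct
import zlib

MAGIC = b"ZPKL"   # hypothetical 4-byte magic; zpickle's real header differs
ALGO_ZLIB = 1     # illustrative algorithm id

def wrap_dumps(obj, level=6):
    raw = pickle.dumps(obj)                  # step 1: plain pickle
    payload = zlib.compress(raw, level)      # step 2: compress
    # step 3: 8-byte header = magic + version + algorithm + reserved
    header = MAGIC + struct.pack("<BBH", 1, ALGO_ZLIB, 0)
    return header + payload

def wrap_loads(data):
    if data[:4] == MAGIC:                    # step 4: auto-detect the format
        return pickle.loads(zlib.decompress(data[8:]))
    return pickle.loads(data)                # fall back to plain pickle

blob = wrap_dumps({"a": list(range(100))})
assert wrap_loads(blob) == {"a": list(range(100))}
assert wrap_loads(pickle.dumps([1, 2])) == [1, 2]  # uncompressed data still loads
```

The fallback branch in `wrap_loads` is also what makes the backward-compatibility feature possible: plain pickle streams never start with the header magic, so they can be detected and loaded unchanged.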

API Reference

zpickle maintains complete API compatibility with the standard pickle module:

Core Functions

  • dumps(obj, protocol=None, ..., algorithm=None, level=None) - Serialize and compress object
  • loads(data, ...) - Deserialize and decompress object
  • dump(obj, file, protocol=None, ..., algorithm=None, level=None) - Serialize to file
  • load(file, ...) - Deserialize from file

Configuration

  • configure(algorithm=None, level=None, min_size=None) - Set global defaults
  • get_config() - Get current configuration
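A plausible reading of the `min_size` setting (an assumption — check zpickle's own docs) is a threshold below which payloads are stored uncompressed, since header and compression overhead can exceed the savings on tiny objects. The idea, sketched with the standard library and a hypothetical one-byte marker:

```python
import pickle
import zlib

MIN_SIZE = 64  # hypothetical threshold in bytes

def dumps_with_threshold(obj, min_size=MIN_SIZE):
    raw = pickle.dumps(obj)
    if len(raw) < min_size:
        return raw                    # too small: compression not worth the header
    return b"Z" + zlib.compress(raw)  # illustrative marker, not zpickle's format

small = dumps_with_threshold(42)                 # stays plain pickle
large = dumps_with_threshold(list(range(1000)))  # gets compressed
```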

Classes

  • Pickler(file, ...) - Subclass of pickle.Pickler with compression
  • Unpickler(file, ...) - Subclass of pickle.Unpickler with decompression
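The class-based API follows the familiar buffer-then-compress pattern. This standalone sketch only illustrates that pattern — zpickle's actual classes subclass pickle.Pickler and pickle.Unpickler, and its framing differs:

```python
import io
import pickle
import zlib

class CompressingPickler:
    """Sketch: pickle into an in-memory buffer, compress once on dump."""
    def __init__(self, file, level=6):
        self._file = file
        self._level = level

    def dump(self, obj):
        buf = io.BytesIO()
        pickle.Pickler(buf).dump(obj)
        self._file.write(zlib.compress(buf.getvalue(), self._level))

class DecompressingUnpickler:
    """Sketch: read the whole stream, decompress, then unpickle."""
    def __init__(self, file):
        self._file = file

    def load(self):
        return pickle.loads(zlib.decompress(self._file.read()))

stream = io.BytesIO()
CompressingPickler(stream).dump({"x": 1})
stream.seek(0)
assert DecompressingUnpickler(stream).load() == {"x": 1}
```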

Alternatives

  • Standard pickle: No compression, but native to Python
  • compressed_pickle: Similar concept, but less configurable
  • joblib: More focused on large NumPy arrays and parallel processing
  • msgpack, protobuf: Different serialization formats (not pickle-compatible)

License

This project is distributed under the MIT License.
