
zpickle


Transparent, drop-in compression for Python's pickle — smaller files, same API.

zpickle adds high-performance compression to your serialized Python objects using multiple state-of-the-art algorithms without changing how you work with pickle.

# Replace this:
import pickle

# With this:
import zpickle as pickle

# Everything else stays the same!

Features

  • Drop-in replacement for the standard pickle module
  • Transparent compression — everything happens automatically
  • Multiple algorithms — choose zstd, brotli, zlib, or lzma (powered by compress_utils)
  • Configure once, use everywhere — set global defaults for your entire app
  • Smaller data — serialized output is typically 2-10× smaller (depending on content and algorithm)
  • Backward compatible — automatically reads both compressed and regular pickle data
  • Complete API compatibility — all pickle functions work as expected

Installation

pip install zpickle

Quick Start

Basic Usage

import zpickle as pickle

# Serializing works exactly like pickle
data = {"complex": ["nested", {"data": "structure"}], "with": "lots of repetition"}
serialized = pickle.dumps(data)  # Automatically compressed!

# Deserializing works the same way
restored = pickle.loads(serialized)  # Automatically decompressed!

# File operations work too
with open("data.zpkl", "wb") as f:
    pickle.dump(data, f)

with open("data.zpkl", "rb") as f:
    restored = pickle.load(f)

Custom Configuration

import zpickle

# Configure global settings
zpickle.configure(algorithm='brotli', level=9)  # Higher compression

# Or configure for a single operation
data = [1, 2, 3] * 1000
compressed = zpickle.dumps(data, algorithm='zstd', level=6)

Performance

Compression ratios versus standard pickle (higher is better):

[Bar chart: compression ratios versus standard pickle]

Serialization speed (MB/s, higher is better):

[Bar chart: serialization speeds versus standard pickle]

Note: Performance varies by data characteristics. Run benchmarks on your specific data for accurate results.

To run your own benchmarks, you can use:

python -m benchmarks.benchmark
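For a quick feel of how much ratios swing with content, here is a stdlib-only sketch that uses zlib as a stand-in for zpickle's algorithms (the helper name is ours, not part of zpickle's API):

```python
import os
import pickle
import zlib

def compression_ratio(obj, level=6):
    """Rough ratio of plain-pickle size to zlib-compressed size."""
    raw = pickle.dumps(obj)
    compressed = zlib.compress(raw, level)
    return len(raw) / len(compressed)

# Repetitive data compresses very well...
print(round(compression_ratio([1, 2, 3] * 1000), 1))

# ...while incompressible (random) data barely shrinks at all
print(round(compression_ratio(os.urandom(10_000)), 2))
```

The same spread applies to zpickle's real algorithms, which is why benchmarking on your own data matters.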

How It Works

zpickle applies compression with minimal overhead:

  1. Objects are first serialized using standard pickle
  2. The pickle data is compressed using the selected algorithm
  3. A small header (7 bytes) is added to identify the format and algorithm
  4. When deserializing, zpickle auto-detects the format and decompresses if needed
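The four steps above can be sketched in a few lines, with stdlib zlib standing in for zpickle's algorithms. The header layout here (a 4-byte magic plus version, algorithm id, and level bytes) is an assumption for illustration, not zpickle's actual wire format:

```python
import pickle
import zlib

# Hypothetical 7-byte header: 4-byte magic + version + algorithm id + level
# (illustrative only -- zpickle's real header layout may differ)
MAGIC = b"ZPKL"

def dumps_sketch(obj, level=6):
    raw = pickle.dumps(obj)                  # 1. standard pickle
    body = zlib.compress(raw, level)         # 2. compress the pickle bytes
    header = MAGIC + bytes([1, 0, level])    # 3. prepend a small header
    return header + body

def loads_sketch(data):
    if data[:4] == MAGIC:                    # 4. auto-detect the format
        return pickle.loads(zlib.decompress(data[7:]))
    return pickle.loads(data)                # plain pickle passes through
```

The auto-detection in step 4 is what makes reading both compressed and regular pickle data transparent: plain pickle streams never start with the magic bytes, so they fall through to `pickle.loads` unchanged.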

API Reference

zpickle maintains complete API compatibility with the standard pickle module:

Core Functions

  • dumps(obj, protocol=None, ..., algorithm=None, level=None) - Serialize an object and return compressed bytes
  • loads(data, ...) - Decompress and deserialize an object
  • dump(obj, file, protocol=None, ..., algorithm=None, level=None) - Serialize and compress an object to a file
  • load(file, ...) - Deserialize an object from a file

Configuration

  • configure(algorithm=None, level=None, min_size=None) - Set global defaults
  • get_config() - Get current configuration
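The configure-once behavior can be pictured as a module-level defaults dict that per-call arguments override. This is a sketch of the pattern only — the default values below are assumptions, not zpickle's actual defaults:

```python
# Module-level defaults (values here are illustrative assumptions)
_config = {"algorithm": "zstd", "level": 3, "min_size": 64}

def configure(**kwargs):
    """Update only the settings that were explicitly passed."""
    for key, value in kwargs.items():
        if key not in _config:
            raise ValueError(f"unknown setting: {key}")
        if value is not None:
            _config[key] = value

def get_config():
    # Return a copy so callers can't mutate the defaults in place
    return dict(_config)

configure(algorithm="brotli", level=9)
```

Settings you don't pass keep their previous values, which is what lets you set defaults once at startup and forget about them.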

Classes

  • Pickler(file, ...) - Subclass of pickle.Pickler with compression
  • Unpickler(file, ...) - Subclass of pickle.Unpickler with decompression
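A minimal sketch of how compressing Pickler/Unpickler subclasses can work — the buffering strategy and class internals here are illustrative assumptions, not zpickle's implementation:

```python
import io
import pickle
import zlib

class CompressingPickler(pickle.Pickler):
    """Pickle into an in-memory buffer, then write compressed bytes out."""
    def __init__(self, file, protocol=None):
        self._buffer = io.BytesIO()
        self._file = file
        super().__init__(self._buffer, protocol)

    def dump(self, obj):
        super().dump(obj)                    # pickle into the buffer
        self._file.write(zlib.compress(self._buffer.getvalue()))

class DecompressingUnpickler(pickle.Unpickler):
    """Decompress the whole stream, then unpickle from memory."""
    def __init__(self, file):
        super().__init__(io.BytesIO(zlib.decompress(file.read())))

# Round-trip through an in-memory "file"
buf = io.BytesIO()
CompressingPickler(buf).dump({"x": [1, 2, 3]})
buf.seek(0)
assert DecompressingUnpickler(buf).load() == {"x": [1, 2, 3]}
```

Because both classes subclass their pickle counterparts, custom persistence hooks (`persistent_id`, `find_class`, etc.) keep working as they do in plain pickle.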

Alternatives

  • Standard pickle: No compression, but native to Python
  • compressed_pickle: Similar concept, but less configurable
  • joblib: More focused on large NumPy arrays and parallel processing
  • msgpack, protobuf: Different serialization formats (not pickle-compatible)

License

This project is distributed under the MIT License.

Download files

Source Distribution

zpickle-1.0.0.tar.gz (544.5 kB)

Built Distribution

zpickle-1.0.0-py3-none-any.whl (11.8 kB)

File details

Details for the file zpickle-1.0.0.tar.gz.

File metadata

  • Download URL: zpickle-1.0.0.tar.gz
  • Upload date:
  • Size: 544.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for zpickle-1.0.0.tar.gz:

  • SHA256: a04bd9fd63dc43a76a27b34676818746cf174be60d4de93e604256705fd82f31
  • MD5: 8b1f2efe1dc2cc14c8a1168db4483091
  • BLAKE2b-256: 88eceef46b12bd12fc40b3a4a90299670ee2687f80d89b7bc0b1fd6def2c8347


Provenance

The following attestation bundles were made for zpickle-1.0.0.tar.gz:

Publisher: test_and_package_wheel.yml on dupontcyborg/zpickle

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file zpickle-1.0.0-py3-none-any.whl.

File metadata

  • Download URL: zpickle-1.0.0-py3-none-any.whl
  • Upload date:
  • Size: 11.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for zpickle-1.0.0-py3-none-any.whl:

  • SHA256: b11fb2cf191aff8aec94b3ef1bba4dd3f932a866de683f33fb0a910a890af368
  • MD5: 8d4657f57357c7725013efe26032e162
  • BLAKE2b-256: 9575e2ee94b52755087d08bc6bab85ffc5481ecb425386003927027fcb40c7f2


Provenance

The following attestation bundles were made for zpickle-1.0.0-py3-none-any.whl:

Publisher: test_and_package_wheel.yml on dupontcyborg/zpickle

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
