
Extended pickling support for Python objects

Project description

cloudpickle


cloudpickle makes it possible to serialize Python constructs not supported by the default pickle module from the Python standard library.

cloudpickle is especially useful for cluster computing where Python code is shipped over the network to execute on remote hosts, possibly close to the data.

Among other things, cloudpickle supports pickling for lambda functions along with functions and classes defined interactively in the __main__ module (for instance in a script, a shell or a Jupyter notebook).

cloudpickle uses pickle.HIGHEST_PROTOCOL by default: it is meant to send objects between processes running the same version of Python.

Using cloudpickle for long-term object storage is not supported and is discouraged.
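
When the receiving process cannot read pickle.HIGHEST_PROTOCOL, a lower protocol can be requested explicitly. A minimal sketch, assuming the protocol argument of cloudpickle.dumps mirrors the one of the standard pickle module:

>>> import pickle
>>> import cloudpickle
>>> # request an explicit, lower pickle protocol for the serialized payload
>>> payload = cloudpickle.dumps(lambda x: x + 1, protocol=pickle.DEFAULT_PROTOCOL)
>>> pickle.loads(payload)(1)
2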

Installation

The latest release of cloudpickle is available from PyPI:

pip install cloudpickle

Examples

Pickling a lambda expression:

>>> import cloudpickle
>>> squared = lambda x: x ** 2
>>> pickled_lambda = cloudpickle.dumps(squared)

>>> import pickle
>>> new_squared = pickle.loads(pickled_lambda)
>>> new_squared(2)
4

Pickling a function interactively defined in a Python shell session (in the __main__ module):

>>> CONSTANT = 42
>>> def my_function(data):
...    return data + CONSTANT
...
>>> pickled_function = cloudpickle.dumps(my_function)
>>> pickle.loads(pickled_function)(43)
85
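
Classes defined interactively in the __main__ module can be pickled the same way. A minimal sketch in the spirit of the examples above (the Greeter class is purely illustrative):

>>> class Greeter:
...     def greet(self, name):
...         return "Hello, " + name + "!"
...
>>> pickled_class = cloudpickle.dumps(Greeter)
>>> pickle.loads(pickled_class)().greet("cloudpickle")
'Hello, cloudpickle!'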

Running the tests

  • With tox, to run the tests for all the supported versions of Python and PyPy:

    pip install tox
    tox
    

    or alternatively for a specific environment:

    tox -e py37
    
  • With py.test, to run the tests only for your current version of Python:

    pip install -r dev-requirements.txt
    PYTHONPATH='.:tests' py.test
    

History

cloudpickle was initially developed by picloud.com and shipped as part of the client SDK.

A copy of cloudpickle.py was included as part of PySpark, the Python interface to Apache Spark. Davies Liu, Josh Rosen, Thom Neale and other Apache Spark developers improved it significantly, most notably to add support for PyPy and Python 3.

The aim of the cloudpickle project is to make that work available to a wider audience outside of the Spark ecosystem and to make it easier to improve it further, notably with the help of a dedicated non-regression test suite.

Download files

Download the file for your platform.

Source Distribution

cloudpickle-1.1.1.tar.gz (34.9 kB, Source)

Built Distribution


cloudpickle-1.1.1-py2.py3-none-any.whl (18.0 kB, Python 2 / Python 3)

File details

Details for the file cloudpickle-1.1.1.tar.gz.

File metadata

  • Download URL: cloudpickle-1.1.1.tar.gz
  • Upload date:
  • Size: 34.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.32.1 CPython/3.7.3

File hashes

Hashes for cloudpickle-1.1.1.tar.gz
  • SHA256: 7d43c4d0c7e9735ee8a352c96f84031dabd6676170c4e5e0585a469cc4769f22
  • MD5: 28d7135fe4de219a1a10611b702ad83d
  • BLAKE2b-256: 873e6b008a3384d7b2e85ff38481532b460b064251bde2e9db35ec345ba90544


File details

Details for the file cloudpickle-1.1.1-py2.py3-none-any.whl.

File metadata

  • Download URL: cloudpickle-1.1.1-py2.py3-none-any.whl
  • Upload date:
  • Size: 18.0 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.32.1 CPython/3.7.3

File hashes

Hashes for cloudpickle-1.1.1-py2.py3-none-any.whl
  • SHA256: d119fb6627e65d43541bdf927975a0f2a5d40074a30d691c8585f761d721bf49
  • MD5: 1f7211a079c041a7d0b6f78c3f7df1cc
  • BLAKE2b-256: 24fb4f92f8c0f40a0d728b4f3d5ec5ff84353e705d8ff5e3e447620ea98b06bd

