Multiple dispatch in Python

Project description

Plum: Multiple Dispatch in Python

Everybody likes multiple dispatch, just like everybody likes plums.

The design philosophy of Plum is to provide an implementation of multiple dispatch that is Pythonic, yet close to how Julia does it. See here for a comparison between Plum, multipledispatch, and multimethod.

Note: Plum 2 is now powered by Beartype! If you notice any issues with the new release, please open an issue.

Installation

Plum requires Python 3.8 or higher.

pip install plum-dispatch

Documentation

See here.

What's This?

Plum brings your type annotations to life:

from numbers import Number

from plum import dispatch


@dispatch
def f(x: str):
    return "This is a string!"


@dispatch
def f(x: int):
    return "This is an integer!"


@dispatch
def f(x: Number):
    return "This is a general number, but I don't know which type."

>>> f("1")
'This is a string!'

>>> f(1)
'This is an integer!'

>>> f(1.0)
"This is a general number, but I don't know which type."

>>> f(object())
NotFoundLookupError: `f(<object object at 0x7fd3b01cd330>)` could not be resolved.

Closest candidates are the following:
    f(x: str)
        <function f at 0x7fd400644ee0> @ /<ipython-input-2-c9f6cdbea9f3>:6
    f(x: int)
        <function f at 0x7fd3a0235ca0> @ /<ipython-input-2-c9f6cdbea9f3>:11
    f(x: numbers.Number)
        <function f at 0x7fd3a0235d30> @ /<ipython-input-2-c9f6cdbea9f3>:16

[!IMPORTANT] Dispatch, as implemented by Plum, is based on the positional arguments to a function. Keyword arguments are not used in the decision making for which method to call. In particular, this means that positional arguments without a default value must always be given as positional arguments!

Example:

from plum import dispatch

@dispatch
def f(x: int):
    return x

>>> f(1)        # OK
1

>>> try: f(x=1)  # Not OK
... except Exception as e: print(f"{type(e).__name__}: {e}")
NotFoundLookupError: `f()` could not be resolved...

This also works for multiple arguments, enabling some neat design patterns:

from numbers import Number, Real, Rational

from plum import dispatch


@dispatch
def multiply(x: Number, y: Number):
    return "Performing fallback implementation of multiplication..."


@dispatch
def multiply(x: Real, y: Real):
    return "Performing specialised implementation for reals..."


@dispatch
def multiply(x: Rational, y: Rational):
    return "Performing specialised implementation for rationals..."

>>> multiply(1, 1)
'Performing specialised implementation for rationals...'

>>> multiply(1.0, 1.0)
'Performing specialised implementation for reals...'

>>> multiply(1j, 1j)
'Performing fallback implementation of multiplication...'

>>> multiply(1, 1.0)  # For mixed types, the most specific matching method is chosen!
'Performing specialised implementation for reals...'

Projects Using Plum

The following projects use Plum for multiple dispatch. Would you like to add your project here? Feel free to open a PR to add it to the list!

  • Coordinax implements coordinates in JAX.
  • GPAR is an implementation of the Gaussian Process Autoregressive Model.
  • GPCM is an implementation of various Gaussian Process Convolution Models.
  • Galax does galactic and gravitational dynamics.
  • Geometric Kernels implements kernels on non-Euclidean spaces, such as Riemannian manifolds, graphs, and meshes.
  • LAB uses Plum to provide backend-agnostic linear algebra (something that works with PyTorch/TF/JAX/etc).
  • MLKernels implements standard kernels.
  • MMEval is a unified evaluation library for multiple machine learning libraries.
  • Matrix extends LAB and implements structured matrix types, such as low-rank matrices and Kronecker products.
  • NetKet, a library for machine learning with JAX/Flax targeted at quantum physics, uses Plum extensively to pick the right, efficient implementation for a large combination of objects that interact.
  • NeuralProcesses is a framework for composing Neural Processes.
  • OILMM is an implementation of the Orthogonal Linear Mixing Model.
  • PySAGES is a suite for advanced general ensemble simulations.
  • Quax implements multiple dispatch over abstract array types in JAX.
  • Unxt implements unitful quantities in JAX.
  • Varz uses Plum to provide backend-agnostic tools for non-linear optimisation.

See the docs for a comparison of Plum to other implementations of multiple dispatch.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

plum_dispatch-2.5.3.tar.gz (35.1 kB)

Uploaded Source

Built Distribution

plum_dispatch-2.5.3-py3-none-any.whl (42.3 kB)

Uploaded Python 3

File details

Details for the file plum_dispatch-2.5.3.tar.gz.

File metadata

  • Download URL: plum_dispatch-2.5.3.tar.gz
  • Upload date:
  • Size: 35.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for plum_dispatch-2.5.3.tar.gz
  • SHA256: 15684da4db1215405e22db9e0abba189ba0c4c663fc242d2c7c92cce2ad22fa9
  • MD5: 188bbe79f225afa3b338c442009ae9dd
  • BLAKE2b-256: be0220cbd512d2076449ac093145fc569cd5d382e8d44f317526aa8c224150e9

See more details on using hashes here.
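To check a downloaded file against a published digest, you can compute its SHA256 locally and compare. A minimal sketch (the filename is the sdist above; substitute the path you actually downloaded to, and paste in the expected digest from the table):

```python
import hashlib


def sha256_of(path: str) -> str:
    """Return the hex SHA256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


# Compare against the digest listed above before installing, e.g.:
# assert sha256_of("plum_dispatch-2.5.3.tar.gz") == "<digest from the table above>"
```

Alternatively, `pip install --require-hashes` with a pinned requirements file performs this check automatically.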

File details

Details for the file plum_dispatch-2.5.3-py3-none-any.whl.

File metadata

File hashes

Hashes for plum_dispatch-2.5.3-py3-none-any.whl
  • SHA256: 6d097b2931d7fe7688cba92f74732e70713ccc382343d526dc2aecb373900a79
  • MD5: 7ad601006091def864971ab5bea2226f
  • BLAKE2b-256: 5648253352df240f5f1d4226f757e4107344bc7f49a4f84ba7d1affb5916d622

See more details on using hashes here.