
Execution and caching tool for Python

Project description

Exca - ⚔

Execute and cache seamlessly in Python.


Quick install

pip install exca

Full documentation

Documentation is available at https://facebookresearch.github.io/exca/

Basic overview

exca provides simple decorators to:

  • execute a (hierarchy of) computation(s) either locally or on distant nodes,
  • cache the result.

The problem:

In ML pipelines, using even a simple Python function such as my_task:

import numpy as np

def my_task(param: int = 12) -> float:
    return param * np.random.rand()

often requires cumbersome overhead to (1) configure the parameters, (2) submit the job to a cluster, and (3) cache the results, e.g.:

import pickle
from pathlib import Path
import submitit

# Configure
param = 12
tmp_path = Path("/tmp/my_cache")  # placeholder cache folder

# Check task has already been executed
filepath = tmp_path / f'result-{param}.npy'
if not filepath.exists():

    # Submit job on cluster
    executor = submitit.AutoExecutor(cluster=None, folder=tmp_path)
    job = executor.submit(my_task, param)
    result = job.result()

    # Cache result
    with filepath.open("wb") as f:
        pickle.dump(result, f)
else:
    # Load previously cached result
    with filepath.open("rb") as f:
        result = pickle.load(f)

These overheads create several issues: debugging becomes harder, hierarchical execution is awkward to handle, and results end up saved haphazardly (the classic 'result-parm12-v2_final_FIX.npy').

The solution:

exca can be used to decorate a method of a pydantic model so as to seamlessly configure its execution and caching:

import numpy as np
import pydantic
import exca as xk

class MyTask(pydantic.BaseModel):
    param: int = 12
    infra: xk.TaskInfra = xk.TaskInfra()

    @infra.apply
    def process(self) -> float:
        return self.param * np.random.rand()


task = MyTask(param=1, infra={"folder": tmp_path, "cluster": "auto"})
out = task.process()  # runs on slurm if available
# calling process again will load the cache and not a new random number
assert out == task.process()
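
Because results are keyed by the model's configuration, changing a parameter creates a separate cache entry while identical configurations reuse the cache. A minimal sketch reusing MyTask from above (assuming that omitting "cluster" runs the task in the current process, and that the cache key includes param):

other = MyTask(param=2, infra={"folder": tmp_path})
out2 = other.process()           # different param -> computed and cached separately
assert out2 == other.process()   # identical config -> cached value is reused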

See the API reference for all the details.

Quick comparison

feature \ tool                  lru_cache  hydra  submitit  exca
RAM cache                       ✔          -      -         ✔
file cache                      -          -      -         ✔
remote compute                  -          -      ✔         ✔
pure python (vs command line)   ✔          -      ✔         ✔
hierarchical config             -          ✔      -         ✔
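
Hierarchical configuration follows from pydantic nesting: a model with its own infra can be used as a field of a larger model and configured through nested dictionaries. A minimal sketch reusing MyTask from above (the Pipeline model is hypothetical, not part of exca):

class Pipeline(pydantic.BaseModel):
    task: MyTask = MyTask()
    scale: float = 1.0

    def run(self) -> float:
        # the nested task keeps its own infra, so its result is cached independently
        return self.scale * self.task.process()

pipe = Pipeline(scale=2.0, task={"param": 3, "infra": {"folder": tmp_path}})
print(pipe.run())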

Contributing

See the CONTRIBUTING file for how to help out.

Citing

@misc{exca,
    author = {J. Rapin and J.-R. King},
    title = {{Exca - Execution and caching}},
    year = {2024},
    publisher = {GitHub},
    journal = {GitHub repository},
    howpublished = {\url{https://github.com/facebookresearch/exca}},
}

License

exca is MIT licensed, as found in the LICENSE file. Also check out Meta Open Source Terms of Use and Privacy Policy.

Download files

Download the file for your platform.

Source Distribution

exca-0.5.10.tar.gz (86.1 kB)

Uploaded Source

Built Distribution


exca-0.5.10-py3-none-any.whl (103.5 kB)

Uploaded Python 3

File details

Details for the file exca-0.5.10.tar.gz.

File metadata

  • Download URL: exca-0.5.10.tar.gz
  • Upload date:
  • Size: 86.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.10.15

File hashes

Hashes for exca-0.5.10.tar.gz
Algorithm Hash digest
SHA256 14b5258199828b4b97eaf69de6d6ed144669f31a4cebb232ef53ca42b72c0256
MD5 7719014cb7f13822a911874ed08b5c6a
BLAKE2b-256 3d753f8f6d9cd4502756d6782490119e2c1b97a3c3b831b478493b105a5fe171
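
To check a downloaded archive against the digests above, recomputing the SHA256 locally is sufficient; a minimal sketch using hashlib (the local file path is an assumption):

import hashlib
from pathlib import Path

archive = Path("exca-0.5.10.tar.gz")  # path to the downloaded archive
digest = hashlib.sha256(archive.read_bytes()).hexdigest()
print(digest)  # compare with the SHA256 value listed above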


File details

Details for the file exca-0.5.10-py3-none-any.whl.

File metadata

  • Download URL: exca-0.5.10-py3-none-any.whl
  • Upload date:
  • Size: 103.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.10.15

File hashes

Hashes for exca-0.5.10-py3-none-any.whl
Algorithm Hash digest
SHA256 9de77cfdbdf39f8d7f354c0b9308ff8d4653623214a24e6c3f12f5edbc7c67a8
MD5 3a7443a0c7190620e831024ffddf5768
BLAKE2b-256 03a6646831a0cc72fd81be56f0a6ae0c4bb93ae75c060c951c30bc801e12975a

