Exca - ⚔

Execute and cache seamlessly in python.

Quick install

pip install exca

Full documentation

Documentation is available at https://facebookresearch.github.io/exca/

Basic overview

exca provides simple decorators to:

  • execute a (hierarchy of) computation(s) either locally or on remote nodes,
  • cache the result.

The problem:

In ML pipelines, running even a simple python function such as my_task:

import numpy as np

def my_task(param: int = 12) -> float:
    return param * np.random.rand()

often requires cumbersome overhead to (1) configure the parameters, (2) submit the job to a cluster, and (3) cache the results, e.g.:

import pickle
from pathlib import Path
import submitit

# Configure
param = 12
tmp_path = Path("cache")
tmp_path.mkdir(exist_ok=True)

# Check whether the task has already been executed
filepath = tmp_path / f"result-{param}.pkl"
if not filepath.exists():

    # Submit the job on the cluster (cluster=None runs locally)
    executor = submitit.AutoExecutor(cluster=None, folder=tmp_path)
    job = executor.submit(my_task, param)
    result = job.result()

    # Cache the result
    with filepath.open("wb") as f:
        pickle.dump(result, f)

# Load the cached result
with filepath.open("rb") as f:
    result = pickle.load(f)

These overheads cause several issues: debugging becomes harder, hierarchical execution gets unwieldy, and results are easily mismanaged (ending in the classic 'result-parm12-v2_final_FIX.npy').

The solution:

exca can be used to decorate a method of a pydantic model so as to seamlessly configure its execution and caching:

import numpy as np
import pydantic
import exca as xk

class MyTask(pydantic.BaseModel):
    param: int = 12
    infra: xk.TaskInfra = xk.TaskInfra()

    @infra.apply
    def process(self) -> float:
        return self.param * np.random.rand()


tmp_path = "cache"  # folder used for caching and job logs
task = MyTask(param=1, infra={"folder": tmp_path, "cluster": "auto"})
out = task.process()  # runs on slurm if available, otherwise locally
# calling process again loads the cached result instead of drawing a new random number
assert out == task.process()
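
Since MyTask is a plain pydantic model, the same configuration can also come from a dict or a yaml file instead of python code. Below is a minimal sketch assuming standard pydantic validation; the yaml content and folder name are illustrative, and cluster: null is assumed to mean "run in the current process":

import yaml  # pyyaml

config_string = """
param: 1
infra:
  folder: cache
  cluster: null  # assumption: no job submission, run in-process
"""
# pydantic validates the nested dict into MyTask and its TaskInfra
task = MyTask(**yaml.safe_load(config_string))
assert task.param == 1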

See the API reference for all the details.
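
The overview mentions hierarchies of computations: because tasks are pydantic models, they can be nested, and a nested dict then configures the whole tree. The sketch below builds on MyTask above; the Pipeline class and its fields are illustrative assumptions (not part of exca's API), and it assumes an infra without a folder simply runs in-process without caching:

class Pipeline(pydantic.BaseModel):
    task: MyTask = MyTask()
    scale: float = 2.0
    infra: xk.TaskInfra = xk.TaskInfra()

    @infra.apply
    def run(self) -> float:
        # the sub-task keeps its own infra (cache folder, cluster, ...)
        return self.scale * self.task.process()


pipe = Pipeline(scale=0.5, task={"param": 3, "infra": {"folder": "cache"}})
out = pipe.run()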

Quick comparison

feature \ tool                 lru_cache  hydra  submitit  exca
RAM cache                      ✔          ✘      ✘         ✔
file cache                     ✘          ✘      ✘         ✔
remote compute                 ✘          ✔      ✔         ✔
pure python (vs command line)  ✔          ✘      ✔         ✔
hierarchical config            ✘          ✔      ✘         ✔
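
For instance, functools.lru_cache from the standard library covers only the first row above: results are cached in RAM and lost when the process exits, with no file cache, remote compute, or hierarchical config:

import functools
import numpy as np

@functools.lru_cache(maxsize=None)
def my_task(param: int = 12) -> float:
    return param * np.random.rand()

# the second call returns the RAM-cached value; nothing survives a restart
assert my_task(3) == my_task(3)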

Contributing

See the CONTRIBUTING file for how to help out.

Citing

@misc{exca,
    author = {J. Rapin and J.-R. King},
    title = {{Exca - Execution and caching}},
    year = {2024},
    publisher = {GitHub},
    journal = {GitHub repository},
    howpublished = {\url{https://github.com/facebookresearch/exca}},
}

License

exca is MIT licensed, as found in the LICENSE file. Also check out the Meta Open Source Terms of Use and Privacy Policy.
