
a simple decorator to cache the results of computationally heavy functions

Project description


A simple decorator to cache the results of computationally heavy functions. The package automatically serializes and deserializes the result depending on the extension of the save path.

By default it supports only .json and .pkl, but other extensions can be enabled by installing the corresponding extra features:

[compress_json] .json.gz .json.bz .json.lzma

[compress_pickle] .pkl.gz .pkl.bz .pkl.lzma .pkl.zip

[numpy] .npy .npz

[pandas] .csv .csv.gz .csv.bz2 .csv.zip .csv.xz

[excel] .xlsx

The extra feature [numba] enables the caching of numba objects.

How do I install this package?

As usual, just download it using pip:

pip install cache_decorator

To install all the extensions use:

pip install "cache_decorator[all]"

(the double quotes are optional in bash but required by zsh)

Optionally, you can specify only the features you want:

pip install "cache_decorator[compress_json, compress_pickle, numpy, pandas, excel, numba]"

If the installation fails, you can try adding --user at the end of the command:

pip install "cache_decorator[compress_json, compress_pickle, numpy, pandas, excel, numba]" --user

Tests Coverage

Since the tools that track coverage sometimes report slightly different results, here are three of them:

Coveralls Coverage, SonarCloud Coverage, Code Climate Coverage

Examples of Usage

To cache a function or a method you just have to decorate it with the cache decorator.

from time import sleep
from cache_decorator import Cache

@Cache()
def x(a, b):
    sleep(3)
    return a + b

class A:
    @Cache()
    def x(self, a, b):
        sleep(3)
        return a + b
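The first call with a given set of arguments computes the result and stores it on disk; subsequent calls with the same arguments load it back instead of recomputing. A minimal usage sketch (the timings are only illustrative):

from time import perf_counter

start = perf_counter()
x(1, 2)                         # first call: sleeps 3 seconds and caches the result
print(perf_counter() - start)   # roughly 3 seconds

start = perf_counter()
x(1, 2)                         # second call: the result is loaded from the cache
print(perf_counter() - start)   # well under a second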

Cache path

The default cache directory is ./cache, but it can be changed by passing the cache_dir parameter to the decorator or by setting the environment variable CACHE_DIR. If both are set, the parameter takes precedence over the environment variable.

from time import sleep
from cache_decorator import Cache

@Cache(cache_dir="/tmp")
def x(a):
    sleep(3)
    return a
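Alternatively, the directory can come from the environment. A minimal sketch, assuming CACHE_DIR is visible to the process before the decorated function is used (in practice it would usually be exported in the shell before launching Python):

import os
from time import sleep
from cache_decorator import Cache

# Illustrative: set the cache directory through the environment instead of the parameter
os.environ["CACHE_DIR"] = "/tmp"

@Cache()
def x(a):
    sleep(3)
    return a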

The path format can be modified by passing the cache_path parameter. This string will be formatted with information about the function, its parameters and, if it's a method, the attributes of self.

The default path is:

from time import sleep
from cache_decorator import Cache

@Cache(cache_path="{cache_dir}/{file_name}_{function_name}/{_hash}.pkl")
def x(a):
    sleep(3)
    return a

It can be modified to give the cache a more meaningful name; for example, we can add the value of a to the file name.

from time import sleep
from cache_decorator import Cache

@Cache(cache_path="{cache_dir}/{file_name}_{function_name}/{a}_{_hash}.pkl")
def x(a):
    sleep(3)
    return a

Depending on the extension of the file, different serialization and deserialization dispatchers will be called.

from time import sleep
import numpy as np
from cache_decorator import Cache

@Cache(cache_path="/tmp/{_hash}.pkl.gz")
def x(a):
    sleep(3)
    return a

@Cache(cache_path="/tmp/{_hash}.json")
def x(a):
    sleep(3)
    return {"1":1,"2":2}

@Cache(cache_path="/tmp/{_hash}.npy")
def x(a):
    sleep(3)
    return np.array([1, 2, 3])

@Cache(cache_path="/tmp/{_hash}.npz")
def x(a):
    sleep(3)
    return np.array([1, 2, 3]), np.array([1, 2, 4])
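With the optional [pandas] extra installed, the same mechanism should also cover DataFrame results. A hedged sketch, assuming a .csv target path:

import pandas as pd
from cache_decorator import Cache

@Cache(cache_path="/tmp/{_hash}.csv")
def load_table(a):
    # The DataFrame is serialized to CSV on the first call and read back on later calls
    return pd.DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})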

Ignoring arguments when computing the hash

By default the cache is differentiated by the parameters passed to the function. You can specify which parameters should be ignored.

from time import sleep
from cache_decorator import Cache

@Cache(args_to_ignore=["verbose"])
def x(a, verbose=False):
    sleep(3)
    if verbose:
        print("HEY")
    return a

Multiple arguments can be ignored by passing a list with their names.

from time import sleep
from cache_decorator import Cache

@Cache(args_to_ignore=["verbose", "multiprocessing"])
def x(a, verbose=False, multiprocessing=False):
    sleep(3)
    if verbose:
        print("HEY")
    return a
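With verbose and multiprocessing ignored, calls that differ only in those arguments map to the same cache entry. A small illustrative sketch:

x(1, verbose=False)   # computes and caches the result
x(1, verbose=True)    # same hash as above, so the cached value is reused
x(2, verbose=True)    # different value of a, so a new cache entry is created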

Cache validity

A cache can also be given a validity duration.

from time import sleep
from cache_decorator import Cache

@Cache(
    cache_path="/tmp/{_hash}.pkl.gz",
    validity_duration="24d"
    )
def x(a):
    sleep(3)
    return a

In this example the cache will be valid for 24 days, and on the 25th day it will be rebuilt. The duration can be given as a number of seconds or as a string with a unit. The supported units are "s" (seconds), "m" (minutes), "h" (hours), "d" (days) and "w" (weeks).
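For instance, the following spellings should be roughly equivalent ways of asking for one day of validity (an illustrative sketch of the formats described above):

@Cache(cache_path="/tmp/{_hash}.pkl", validity_duration=86400)   # a number of seconds
def x(a):
    return a

@Cache(cache_path="/tmp/{_hash}.pkl", validity_duration="24h")   # hours
def y(a):
    return a

@Cache(cache_path="/tmp/{_hash}.pkl", validity_duration="1d")    # days
def z(a):
    return a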

Logging

Each time a new function is decorated with this decorator, a new logger is created. You can modify the default logger with log_level and log_format.

from time import sleep
from cache_decorator import Cache

@Cache(log_level="debug")
def x(a):
    sleep(3)
    return a

If you do not like the default format, you can change it with:

from time import sleep
from cache_decorator import Cache

@Cache(log_format="%(asctime)-15s[%(levelname)s]: %(message)s")
def x(a):
    sleep(3)
    return a

More information about the formatting can be found at https://docs.python.org/3/library/logging.html.

Moreover, the name of the default logger is:

logging.getLogger("cache." + function.__name__)

So we can get a reference to the logger and fully customize it:

import logging
from cache_decorator import Cache

@Cache()
def test_function(x):
    return 2 * x

# Get the logger
logger = logging.getLogger("cache.test_function")
logger.setLevel(logging.DEBUG)

# Make it log to a file
handler = logging.FileHandler("cache.log")
logger.addHandler(handler)
