
Conditional cache is a wrapper over functools.lru_cache that allows for conditionally caching based on the output of the function.

ConditionalCache

ConditionalCache is a set of decorators that provide conditional function memoization and selective cache clearing.

It exposes the same interface as standard cache decorators like functools.lru_cache or cachetools.ttl_cache, but adds a condition parameter that determines whether a function result is memoized. This gives you more granular control over caching behavior, useful when you want to store the output only if certain conditions are met, for example when checking existence in a database.

Installation

To install ConditionalCache simply run:

pip install conditional-cache

Usage

Working with ConditionalCache is as simple and straightforward as using functools.lru_cache, since it exposes the same interface.

from conditional_cache import lru_cache

# Memoize the returned element only when it is different from "Not Found"
@lru_cache(maxsize=64, condition=lambda db_value: db_value != "Not Found")
def element_exists_in_db(element_id: int) -> str:
    print(f"Asked to DB: {element_id}")
    # For this example, consider that even elements exist.
    return "Found" if element_id % 2 == 0 else "Not Found"

When we call this function, it will execute only once for even numbers, and every time for odd ones.

# Will be executed, and not memoized
print(f"Returned: {element_exists_in_db(element_id=1)}")
# Will be executed again
print(f"Returned: {element_exists_in_db(element_id=1)}\n")

# Will be executed and memoized
print(f"Returned: {element_exists_in_db(element_id=2)}")
# Will return the memoized result without executing again
print(f"Returned: {element_exists_in_db(element_id=2)}")
>> Asked to DB: 1
>> Returned: Not Found
>> Asked to DB: 1
>> Returned: Not Found

>> Asked to DB: 2
>> Returned: Found
>> Returned: Found

If during execution you perform an action that invalidates a given function result, you can actively remove that element from the cache:

# Will return the result that was memoized before
print(f"Returned: {element_exists_in_db(element_id=2)}\n")
# Remove the element from the cache
element_exists_in_db.cache_remove(element_id=2)

# Will be executed again and memoized
print(f"Returned: {element_exists_in_db(element_id=2)}")
# Will return the memoized result
print(f"Returned: {element_exists_in_db(element_id=2)}")
>> Returned: Found

>> Asked to DB: 2
>> Returned: Found
>> Returned: Found

Controlling cache size by memory

In addition to maxsize (number of elements), you can also limit the cache by memory usage with maxsize_bytes.

from conditional_cache import lru_cache

@lru_cache(maxsize_bytes=1024)  # keep up to ~1 KB of cached data
def heavy_query(x: int) -> str:
    print("Executed:", x)
    return "X" * (x * 100)

heavy_query(1)   # Cached
heavy_query(10)  # May evict older entries if too large

This way you avoid overflowing memory when caching large objects such as images. If a single result is too large to ever fit in the cache, it is simply not stored.
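The memory budget is estimated with sys.getsizeof (see the API reference below). A quick stdlib check gives a feel for what typical cached values weigh; the exact byte counts vary by Python implementation:

```python
import sys

# sys.getsizeof reports the shallow size of an object in bytes.
small = "X" * 100
large = "X" * 100_000

print(sys.getsizeof(small))  # on the order of 150 bytes on CPython
print(sys.getsizeof(large))  # on the order of 100 kB on CPython
```

A large string like the second one would never fit under a 1024-byte budget, so per the rule above it would simply not be cached.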

Time-based expiration (TTL)

Use ttl_cache when you want cached entries to automatically expire after a given number of seconds.

import time
from conditional_cache import ttl_cache

@ttl_cache(ttl=3, maxsize=64, condition=lambda r: r is not None)
def fetch_user(user_id: int) -> dict | None:
    print("Fetching:", user_id)
    return {"id": user_id}

fetch_user(1)      # Executed and cached
time.sleep(1)
fetch_user(1)      # Retrieved from cache
time.sleep(3)
fetch_user(1)      # Expired -> executed again

Unhashable arguments

Unlike functools.lru_cache, ConditionalCache supports common unhashable argument types like list, dict, or set. They are transparently converted to hashable equivalents to avoid headaches.

from conditional_cache import lru_cache

@lru_cache(maxsize=32)
def stringify(a: list, b: dict) -> str:
    print("Executed:", a, b)
    return str(a) + str(b)

print(stringify([1,2,3], {"x": 42}))
print(stringify([1,2,3], {"x": 42}))  # retrieved from cache
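The package does not document exactly how it freezes arguments, but the idea can be sketched with a hypothetical helper (not the library's actual code) that recursively converts the common unhashable types:

```python
def freeze(value):
    """Recursively convert lists, sets and dicts into hashable equivalents.

    A sketch of the idea only; the real conversion inside
    conditional_cache may differ.
    """
    if isinstance(value, list):
        return tuple(freeze(v) for v in value)
    if isinstance(value, set):
        return frozenset(freeze(v) for v in value)
    if isinstance(value, dict):
        return tuple(sorted((k, freeze(v)) for k, v in value.items()))
    return value

# The frozen forms can now serve as dict/cache keys.
key = (freeze([1, 2, 3]), freeze({"x": 42}))
print(key)
```

Lists become tuples, sets become frozensets, and dicts become sorted tuples of (key, value) pairs, so two calls with equal-but-distinct mutable arguments map to the same cache key.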

API Reference

conditional_cache.lru_cache(maxsize: int = 128, maxsize_bytes: int | None = None, typed: bool = False, condition: callable = lambda x: True)

A Least Recently Used cache. It works the same way as functools.lru_cache, but adds conditional storage and selective item removal through <decorated_function>.cache_remove(**kwargs).

  • maxsize: int. The maximum number of elements to keep cached. Once the cache is full, new elements evict the oldest ones.
  • maxsize_bytes: int | None. The maximum amount of memory (in bytes, as estimated by sys.getsizeof) to keep cached. Useful when caching large objects. If a single item is larger than this budget, it is simply not cached.
  • typed: bool. Works the same way as in functools.lru_cache. If True, function arguments of different types are cached separately.
  • condition: callable. A function that receives a single parameter (the output of the decorated function) and returns a boolean: True if the result should be cached, False if it should not.
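Since typed is stated to behave as in functools.lru_cache, its effect can be illustrated with the stdlib decorator directly:

```python
from functools import lru_cache

@lru_cache(maxsize=8, typed=True)
def double(x):
    return x * 2

double(3)    # cached under the int key 3
double(3.0)  # typed=True: cached separately under the float key 3.0
print(double.cache_info().currsize)  # 2 distinct entries
```

With typed=False (the default), 3 and 3.0 compare and hash equal, so both calls would share a single cache entry.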

conditional_cache.ttl_cache(maxsize: int = 128, maxsize_bytes: int | None = None, typed: bool = False, ttl: int = 60, condition: callable = lambda x: True)

A Time-To-Live cache. Behaves like lru_cache with the same conditional storage and selective removal features, but cached entries automatically expire after ttl seconds.

  • maxsize: int. Maximum number of elements to keep cached.
  • maxsize_bytes: int | None. Maximum memory budget for cached data. Items larger than this budget are not cached.
  • typed: bool. If True, function arguments of different types are cached separately.
  • ttl: int. Time-to-live in seconds for each cached entry.
  • condition: callable. Receives the function output and returns True if it should be cached.
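For intuition, the expiry behavior can be sketched with a hand-rolled decorator using only the stdlib; this is a simplified illustration, not the package's implementation (the real ttl_cache also provides LRU eviction, condition, and cache_remove):

```python
import time
from functools import wraps

def tiny_ttl_cache(ttl: float):
    """Minimal TTL memoizer: stores (timestamp, result) per argument key."""
    def decorator(func):
        store = {}

        @wraps(func)
        def wrapper(*args):
            now = time.monotonic()
            if args in store:
                stamp, result = store[args]
                if now - stamp < ttl:
                    return result  # entry still fresh: serve from cache
            result = func(*args)
            store[args] = (now, result)  # store (or refresh) the entry
            return result
        return wrapper
    return decorator

@tiny_ttl_cache(ttl=0.1)
def fetch(x):
    return {"id": x, "at": time.monotonic()}

first = fetch(1)
assert fetch(1) is first      # within TTL: same cached object
time.sleep(0.15)
assert fetch(1) is not first  # expired: recomputed
```

Each lookup compares the entry's timestamp against the TTL; stale entries are recomputed and re-stored rather than served.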
