
advanced-caching

Python 3.10+ | MIT License

Production-ready caching library for Python with TTL, stale-while-revalidate (SWR), and background refresh.
Type-safe, fast, thread-safe, async-friendly, and framework-agnostic.

Issues & feature requests: new issue


Installation

uv pip install advanced-caching            # core
uv pip install "advanced-caching[redis]"  # Redis support
# pip works too

Quick Start

from advanced_caching import TTLCache, SWRCache, BGCache

# Sync function
@TTLCache.cached("user:{}", ttl=300)
def get_user(user_id: int) -> dict:
    return db.fetch(user_id)

# Async function (works natively)
@TTLCache.cached("user:{}", ttl=300)
async def get_user_async(user_id: int) -> dict:
    return await db.fetch(user_id)

# Stale-While-Revalidate (Sync)
@SWRCache.cached("product:{}", ttl=60, stale_ttl=30)
def get_product(product_id: int) -> dict:
    return api.fetch_product(product_id)

# Stale-While-Revalidate (Async)
@SWRCache.cached("async:product:{}", ttl=60, stale_ttl=30)
async def get_product_async(product_id: int) -> dict:
    return await api.fetch_product(product_id)

# Background refresh (Sync)
@BGCache.register_loader("inventory", interval_seconds=300)
def load_inventory() -> list[dict]:
    return warehouse_api.get_all_items()

# Background refresh (Async)
@BGCache.register_loader("inventory_async", interval_seconds=300)
async def load_inventory_async() -> list[dict]:
    return await warehouse_api.get_all_items()

# Configured cache (reusable backend)
# Create a decorator pre-wired with a specific cache (e.g., Redis).
# RedisCache and the redis client are covered in "Storage Backends" below.
RedisTTL = TTLCache.configure(cache=RedisCache(redis_client))

@RedisTTL.cached("user:{}", ttl=300)
async def get_user_redis(user_id: int):
    return await db.fetch(user_id)
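SWRCache serves a cached value while it is fresh; during the stale window it still serves the cached value but refreshes it in the background; once both windows have passed it recomputes synchronously. A minimal standalone sketch of that state machine (illustrative only, not the library's implementation):

```python
import time

class SWREntry:
    """Illustrative fresh/stale/expired state machine behind SWR caching."""

    def __init__(self, value, ttl, stale_ttl):
        self.value = value
        self.fresh_until = time.time() + ttl
        # the stale window starts where the fresh window ends
        self.stale_until = self.fresh_until + stale_ttl

    def state(self):
        now = time.time()
        if now < self.fresh_until:
            return "fresh"    # serve cached value, no refresh needed
        if now < self.stale_until:
            return "stale"    # serve cached value, trigger background refresh
        return "expired"      # recompute synchronously before serving

entry = SWREntry("v1", ttl=0.05, stale_ttl=0.05)
print(entry.state())   # fresh
time.sleep(0.06)
print(entry.state())   # stale
time.sleep(0.05)
print(entry.state())   # expired
```

The payoff of the stale window is latency: callers keep getting an answer immediately while the refresh happens off the request path.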

Key Templates

The library supports smart key generation that handles both positional and keyword arguments seamlessly.

  • Positional Placeholder: "user:{}"

    • Uses the first argument, whether passed positionally or as a keyword.
    • Example: get_user(123) or get_user(user_id=123) -> "user:123"
  • Named Placeholder: "user:{user_id}"

    • Resolves user_id from keyword arguments OR positional arguments (by inspecting the function signature).
    • Example: def get_user(user_id): ... called as get_user(123) -> "user:123"
  • Custom Function:

    • For complex logic, pass a callable.
    • Example 1 (accepts either positional or keyword arguments): key=lambda *a, **k: f"user:{k.get('user_id', a[0])}"
    • Example 2 (fixed positional signature): key=lambda user_id: f"user:{user_id}"
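The positional/named resolution described above can be sketched with inspect.signature, which binds a call's arguments to parameter names regardless of how they were passed. This is an illustration of the mechanism, not the library's actual internals:

```python
import inspect

def build_key(template: str, func, *args, **kwargs) -> str:
    """Resolve a key template against a call's arguments (illustrative sketch)."""
    bound = inspect.signature(func).bind(*args, **kwargs)
    bound.apply_defaults()
    if "{}" in template:
        # positional placeholder: first bound argument, however it was passed
        first = next(iter(bound.arguments.values()))
        return template.format(first)
    # named placeholder: resolve by parameter name
    return template.format(**bound.arguments)

def get_user(user_id: int) -> dict:
    return {"id": user_id}

print(build_key("user:{}", get_user, 123))           # user:123
print(build_key("user:{}", get_user, user_id=123))   # user:123
print(build_key("user:{user_id}", get_user, 123))    # user:123
```

Binding against the signature is what lets get_user(123) and get_user(user_id=123) produce the same cache key.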

Storage Backends

InMemCache

Thread-safe in-memory cache with TTL.

from advanced_caching import InMemCache

cache = InMemCache()
cache.set("key", "value", ttl=60)
cache.get("key")
cache.delete("key")
cache.exists("key")
cache.set_if_not_exists("key", "value", ttl=60)
cache.cleanup_expired()
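Conceptually, a store like this pairs each value with an expiry timestamp and guards the dict with a lock. A compressed sketch of that idea (not InMemCache's actual code):

```python
import threading
import time

class TinyTTLStore:
    """Minimal thread-safe TTL store (illustrative, not InMemCache itself)."""

    def __init__(self):
        self._data = {}            # key -> (value, expires_at)
        self._lock = threading.Lock()

    def set(self, key, value, ttl):
        with self._lock:
            self._data[key] = (value, time.time() + ttl)

    def get(self, key):
        with self._lock:
            item = self._data.get(key)
            if item is None:
                return None
            value, expires_at = item
            if time.time() >= expires_at:
                del self._data[key]   # lazily evict expired entries on read
                return None
            return value

    def set_if_not_exists(self, key, value, ttl):
        with self._lock:
            item = self._data.get(key)
            if item is not None and time.time() < item[1]:
                return False          # a fresh entry already exists
            self._data[key] = (value, time.time() + ttl)
            return True

store = TinyTTLStore()
store.set("k", "v", ttl=60)
print(store.get("k"))                          # v
print(store.set_if_not_exists("k", "x", 60))   # False
```

set_if_not_exists doubles as a simple lock primitive: only one caller wins the write, which is the usual building block for cache stampede protection.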

RedisCache & Serializers

import redis
from advanced_caching import RedisCache, JsonSerializer

client = redis.Redis(host="localhost", port=6379)

cache = RedisCache(client, prefix="app:")
json_cache = RedisCache(client, prefix="app:json:", serializer="json")
custom_json = RedisCache(client, prefix="app:json2:", serializer=JsonSerializer())

Custom Serializer Example (msgpack)

import msgpack

class MsgpackSerializer:
    handles_entries = False

    @staticmethod
    def dumps(obj):
        return msgpack.packb(obj, use_bin_type=True)

    @staticmethod
    def loads(data):
        return msgpack.unpackb(data, raw=False)

HybridCache (L1 + L2)

Two-level cache:

  • L1: In-memory
  • L2: Redis

Simple setup

import redis
from advanced_caching import HybridCache, TTLCache

client = redis.Redis()
hybrid = HybridCache.from_redis(client, prefix="app:", l1_ttl=60)

@TTLCache.cached("user:{}", ttl=300, cache=hybrid)
def get_user(user_id: int):
    return {"id": user_id}

Manual wiring

from advanced_caching import HybridCache, InMemCache, RedisCache

l1 = InMemCache()
l2 = RedisCache(client, prefix="app:")
# l2_ttl defaults to l1_ttl * 2 if not specified
hybrid = HybridCache(l1_cache=l1, l2_cache=l2, l1_ttl=60)

# Explicit l2_ttl for longer L2 persistence
hybrid_long_l2 = HybridCache(l1_cache=l1, l2_cache=l2, l1_ttl=60, l2_ttl=3600)

TTL behavior:

  • l1_ttl: How long data stays in fast L1 memory cache
  • l2_ttl: How long data persists in L2 (Redis). Defaults to l1_ttl * 2
  • When data expires from L1 but exists in L2, it's automatically repopulated to L1
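The repopulation behavior above can be sketched with two plain dicts standing in for L1 and L2 (illustrative only; HybridCache's real implementation differs):

```python
import time

class TwoLevelSketch:
    """Illustrative L1/L2 read path with repopulation on an L2 hit."""

    def __init__(self, l1_ttl, l2_ttl=None):
        self.l1, self.l2 = {}, {}          # key -> (value, expires_at)
        self.l1_ttl = l1_ttl
        # documented default: L2 keeps entries twice as long as L1
        self.l2_ttl = l2_ttl if l2_ttl is not None else l1_ttl * 2

    @staticmethod
    def _read(store, key):
        item = store.get(key)
        if item and time.time() < item[1]:
            return item[0]
        return None

    def set(self, key, value):
        now = time.time()
        self.l1[key] = (value, now + self.l1_ttl)
        self.l2[key] = (value, now + self.l2_ttl)

    def get(self, key):
        value = self._read(self.l1, key)
        if value is not None:
            return value                   # L1 hit: fastest path
        value = self._read(self.l2, key)
        if value is not None:
            # L1 miss, L2 hit: copy back into L1 for subsequent reads
            self.l1[key] = (value, time.time() + self.l1_ttl)
        return value

cache = TwoLevelSketch(l1_ttl=0.05)
cache.set("k", "v")
time.sleep(0.06)            # L1 entry expired; L2 (ttl 0.1) still fresh
print(cache.get("k"))       # v  (served from L2, copied back to L1)
```

The same shape applies when L2 is Redis: an L1 miss costs one network round trip, after which reads are local again until the L1 entry expires.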

With BGCache using lambda factory

For lazy initialization (e.g., deferred Redis connection):

from advanced_caching import BGCache, HybridCache, InMemCache, RedisCache

def get_redis_cache():
    """Lazy Redis connection factory."""
    import redis
    client = redis.Redis(host="localhost", port=6379)
    return RedisCache(client, prefix="app:")

@BGCache.register_loader(
    "config_map",
    interval_seconds=3600,
    run_immediately=True,
    cache=lambda: HybridCache(
        l1_cache=InMemCache(),
        l2_cache=get_redis_cache(),
        l1_ttl=3600,
        l2_ttl=86400  # L2 persists longer than L1
    )
)
def load_config_map() -> dict[str, dict]:
    return {"db": {"host": "localhost"}, "cache": {"ttl": 300}}

# Access nested data
db_host = load_config_map().get("db", {}).get("host")

Advanced Configuration

To avoid repeating complex cache configurations (like HybridCache setup) in every decorator, you can create a pre-configured cache instance.

from advanced_caching import SWRCache, HybridCache, InMemCache, RedisCache

# 1. Define your cache factory
def create_hybrid_cache():
    return HybridCache(
        l1_cache=InMemCache(),
        l2_cache=RedisCache(redis_client),
        l1_ttl=300,
        l2_ttl=3600
    )

# 2. Create a configured decorator
MySWRCache = SWRCache.configure(cache=create_hybrid_cache)

# 3. Use it cleanly
@MySWRCache.cached("users:{}", ttl=300)
def get_users(code: str):
    return db.get_users(code)

This works for TTLCache, SWRCache, and BGCache.


Custom Storage

Implement the CacheStorage protocol.

File-based example

import json, time
from pathlib import Path
from advanced_caching import CacheEntry, CacheStorage, TTLCache, validate_cache_storage

class FileCache(CacheStorage):
    def __init__(self, directory="/tmp/cache"):
        self.dir = Path(directory)
        self.dir.mkdir(parents=True, exist_ok=True)

    def _path(self, key: str) -> Path:
        return self.dir / f"{key.replace(':','_')}.json"

    def get_entry(self, key):
        p = self._path(key)
        if not p.exists():
            return None
        data = json.loads(p.read_text())
        return CacheEntry(**data)

    def set_entry(self, key, entry, ttl=None):
        self._path(key).write_text(json.dumps(entry.__dict__))

    def get(self, key):
        e = self.get_entry(key)
        return e.value if e and e.is_fresh() else None

    def set(self, key, value, ttl=0):
        now = time.time()
        self.set_entry(key, CacheEntry(value, now + ttl, now))

    def delete(self, key):
        self._path(key).unlink(missing_ok=True)

    def exists(self, key):
        return self.get(key) is not None

    def set_if_not_exists(self, key, value, ttl):
        if self.exists(key):
            return False
        self.set(key, value, ttl)
        return True

cache = FileCache()
assert validate_cache_storage(cache)

API Reference

  • TTLCache.cached(key, ttl, cache=None)

  • SWRCache.cached(key, ttl, stale_ttl=0, cache=None)

  • BGCache.register_loader(key, interval_seconds, ttl=None, run_immediately=True, cache=None)

  • Storages:

    • InMemCache()
    • RedisCache(redis_client, prefix="", serializer="pickle"|"json"|custom)
    • HybridCache(l1_cache, l2_cache, l1_ttl=60, l2_ttl=None) - l2_ttl defaults to l1_ttl * 2
  • Utilities:

    • CacheEntry
    • CacheStorage
    • validate_cache_storage()

Testing & Benchmarks

uv run pytest -q
uv run python tests/benchmark.py

Use Cases

  • Web & API caching (FastAPI, Flask, Django)
  • Database query caching
  • SWR for upstream APIs
  • Background refresh for configs & datasets
  • Distributed caching with Redis
  • Hybrid L1/L2 hot-path optimization

Comparison

Feature              advanced-caching   lru_cache   cachetools   Redis    Memcached
TTL                  Yes                No          Yes          Yes      Yes
SWR                  Yes                No          No           Manual   Manual
Background refresh   Yes                No          No           Manual   Manual
Custom backends      Yes                No          No           N/A      N/A
Distributed          Yes (via Redis)    No          No           Yes      Yes
Async support        Yes                No          No           Client-dependent   Client-dependent
Type hints           Yes                Yes         Yes          N/A      N/A

Contributing

  1. Fork the repo
  2. Create a feature branch
  3. Add tests
  4. Run uv run pytest
  5. Open a pull request

License

MIT License – see LICENSE.
