# RelayCache
Universal caching library for Python with sync/async support, tagging, singleflight pattern and distributed locks.
## Features
- Universal API: Works with both sync and async code
- Multiple backends: In-memory, Redis, async Redis
- Cache tagging: Group and invalidate related cache entries
- Singleflight: Prevent thundering herd with automatic deduplication
- Distributed locks: Cross-process coordination via Redis
- TTL support: Automatic expiration of cache entries
- Type hints: Full typing support
## Installation

```bash
pip install relaycache
```

For development (quoted so the extras syntax works in shells like zsh):

```bash
pip install "relaycache[dev]"
```
## Quick Start

### Basic Usage
```python
from custom_cache import cache, InMemoryCache

# Use with the default in-memory backend
@cache(ttl=300)  # Cache for 5 minutes
def expensive_function(x, y):
    # Some expensive computation
    return x * y + 42

result = expensive_function(10, 20)  # Computed
result = expensive_function(10, 20)  # From cache
```
### Redis Backend

```python
import redis
from custom_cache import cache, RedisCache

# Set up the Redis backend
redis_client = redis.Redis(host='localhost', port=6379, db=0)
redis_backend = RedisCache(redis_client, default_ttl=3600)

@cache(ttl=1800, backend=redis_backend, tags=["users"])
def get_user_profile(user_id):
    # Fetch from database
    return {"id": user_id, "name": "John", "email": "john@example.com"}

# Usage
profile = get_user_profile(123)
```
### Async Support

```python
import asyncio
from redis.asyncio import Redis
from custom_cache import cache, AioredisCache

async def main():
    # Set up the async Redis backend
    redis_client = Redis(host='localhost', port=6379, db=0)
    async_backend = AioredisCache(redis_client, default_ttl=3600)

    @cache(ttl=1800, backend=async_backend, tags=["posts"])
    async def get_post(post_id):
        # Async database call
        await asyncio.sleep(0.1)
        return {"id": post_id, "title": "Sample Post"}

    post = await get_post(456)  # Computed
    post = await get_post(456)  # From cache

asyncio.run(main())
```
## Advanced Features

### Cache Tagging and Invalidation

```python
from custom_cache import cache, invalidate

@cache(ttl=3600, tags=lambda user_id: [f"user:{user_id}", "users"])
def get_user_data(user_id):
    return fetch_user_from_db(user_id)

# Invalidate a specific user
invalidate(tags=[f"user:{user_id}"])

# Invalidate all users
invalidate(tags=["users"])
```
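Under the hood, tag invalidation is typically a reverse index from tag to keys. Below is a minimal in-process sketch of that idea — not relaycache's actual implementation, just the general pattern:

```python
import threading

class TaggedCache:
    """Minimal sketch of tag-based invalidation: each tag maps to the
    set of keys it covers, so invalidating a tag drops all of them."""

    def __init__(self):
        self._data = {}        # key -> value
        self._tag_index = {}   # tag -> set of keys
        self._lock = threading.Lock()

    def set(self, key, value, tags=()):
        with self._lock:
            self._data[key] = value
            for tag in tags:
                self._tag_index.setdefault(tag, set()).add(key)

    def get(self, key):
        with self._lock:
            return self._data.get(key)

    def invalidate_tags(self, tags):
        with self._lock:
            for tag in tags:
                for key in self._tag_index.pop(tag, ()):
                    self._data.pop(key, None)

cache = TaggedCache()
cache.set("user:1", {"name": "Ada"}, tags=["user:1", "users"])
cache.set("user:2", {"name": "Bob"}, tags=["user:2", "users"])
cache.invalidate_tags(["user:1"])   # drops only user 1
cache.invalidate_tags(["users"])    # drops everything tagged "users"
```

A Redis backend would hold the same reverse index in Redis sets so that any process can invalidate a tag.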
### Distributed Singleflight

Prevent multiple processes from computing the same value simultaneously:

```python
import time

@cache(
    ttl=1800,
    backend=redis_backend,
    distributed_singleflight=True,  # Enable distributed coordination
    dist_lock_ttl=5.0,              # Lock TTL in seconds
    dist_lock_timeout=2.0,          # Lock acquisition timeout
)
def expensive_computation(key):
    # Only one process at a time will execute this per key
    time.sleep(10)  # Simulate expensive work
    return f"result_for_{key}"
```
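The singleflight pattern itself is simple to state: while one caller computes a value for a key, other callers for the same key wait and share the result. A minimal in-process sketch (relaycache's distributed variant coordinates across processes via Redis locks instead; error propagation is omitted for brevity):

```python
import threading
import time

class SingleFlight:
    """Per-key deduplication: concurrent callers for the same key share
    one computation instead of each running it."""

    def __init__(self):
        self._lock = threading.Lock()
        self._inflight = {}  # key -> (event, result holder)

    def do(self, key, fn):
        with self._lock:
            entry = self._inflight.get(key)
            if entry is None:
                entry = (threading.Event(), {})
                self._inflight[key] = entry
                leader = True
            else:
                leader = False
        event, holder = entry
        if leader:
            try:
                holder["result"] = fn()
            finally:
                with self._lock:
                    del self._inflight[key]
                event.set()
            return holder["result"]
        event.wait()  # follower: wait for the leader's result
        return holder["result"]

# Five concurrent callers, one execution:
sf = SingleFlight()
calls, results = [], []
barrier = threading.Barrier(5)

def compute():
    calls.append(1)
    time.sleep(0.2)  # simulate expensive work
    return "expensive-value"

def caller():
    barrier.wait()
    results.append(sf.do("report", compute))

threads = [threading.Thread(target=caller) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```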
### Custom Key Building

```python
from custom_cache import cache, KeyBuilder

# Custom key builder
kb = KeyBuilder(prefix="myapp", namespace="v1")

@cache(ttl=3600, key_builder=kb)
def my_function(arg1, arg2):
    return arg1 + arg2

# Or a custom key function
@cache(ttl=3600, key=lambda x, y: f"sum:{x}:{y}")
def sum_function(x, y):
    return x + y
```
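The exact key format relaycache produces is not documented here, but key builders usually follow the same recipe: readable prefix parts up front, plus a digest of the arguments so long argument lists still yield fixed-size keys. A hypothetical sketch:

```python
import hashlib

def build_key(prefix, namespace, func_name, args, kwargs):
    """Hypothetical key scheme: prefix:namespace:function:arg-digest."""
    # Sort kwargs so call order doesn't change the key
    arg_repr = repr((args, tuple(sorted(kwargs.items()))))
    digest = hashlib.sha256(arg_repr.encode()).hexdigest()[:16]
    return f"{prefix}:{namespace}:{func_name}:{digest}"

key = build_key("myapp", "v1", "my_function", (1, 2), {})
```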
### Manual Cache Management

```python
from custom_cache import InMemoryCache

cache_backend = InMemoryCache(default_ttl=3600)

# Manual operations
cache_backend.set("key1", "value1", ttl=1800, tags=["group1"])
hit, value = cache_backend.get("key1")
if hit:
    print(f"Found: {value}")

# Delete a specific key
cache_backend.delete("key1")

# Clear the entire cache
cache_backend.clear()

# Invalidate by tags
cache_backend.invalidate_tags(["group1"])
```
## Backends

### InMemoryCache

Fast in-process cache with thread safety:

```python
from custom_cache import InMemoryCache

backend = InMemoryCache(default_ttl=3600)
```

Features:

- Thread-safe operations
- Automatic TTL expiration
- Tag support
- Memory efficient
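TTL expiration in an in-memory backend usually means storing a deadline per entry (against a monotonic clock) and treating stale entries as misses on read. A minimal sketch of that idea — not relaycache's actual implementation:

```python
import threading
import time

class TTLCache:
    """Minimal TTL cache: each entry stores a deadline, and reads treat
    expired entries as misses (lazy eviction)."""

    def __init__(self, default_ttl=3600):
        self._default_ttl = default_ttl
        self._data = {}  # key -> (value, expires_at)
        self._lock = threading.Lock()

    def set(self, key, value, ttl=None):
        deadline = time.monotonic() + (ttl if ttl is not None else self._default_ttl)
        with self._lock:
            self._data[key] = (value, deadline)

    def get(self, key):
        with self._lock:
            entry = self._data.get(key)
            if entry is None:
                return False, None
            value, deadline = entry
            if time.monotonic() >= deadline:
                del self._data[key]  # evict the stale entry on read
                return False, None
            return True, value

c = TTLCache()
c.set("k", "v", ttl=0.05)
hit, _ = c.get("k")        # hit within the TTL window
time.sleep(0.06)
hit_after, _ = c.get("k")  # miss after expiry
```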
### RedisCache (Sync)

Redis-based cache for distributed applications:

```python
import redis
from custom_cache import RedisCache

redis_client = redis.Redis(host='localhost', port=6379, db=0)
backend = RedisCache(
    redis_client,
    default_ttl=3600,
    value_prefix="myapp:",
    meta_prefix="myapp:meta",
)
```

Features:

- Distributed caching
- Persistent storage
- Tag-based invalidation
- Distributed locks
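Distributed locks over Redis commonly follow the standard recipe: `SET NX PX` with a random token to acquire, and compare-and-delete on release so an expired lock taken over by another process is never released by mistake. The sketch below shows that recipe against an in-memory stand-in for the two Redis operations; with real redis-py you would call `client.set(key, token, nx=True, px=ttl_ms)` and release via a short Lua script. How relaycache implements its locks internally is not shown here.

```python
import threading
import time
import uuid

class FakeRedis:
    """In-memory stand-in for the two Redis operations the lock needs."""
    def __init__(self):
        self._store = {}  # key -> (token, expires_at)
        self._lock = threading.Lock()

    def set_nx_px(self, key, token, ttl_ms):
        with self._lock:
            entry = self._store.get(key)
            if entry is not None and entry[1] > time.monotonic():
                return False  # lock held and not yet expired
            self._store[key] = (token, time.monotonic() + ttl_ms / 1000)
            return True

    def delete_if_equal(self, key, token):
        with self._lock:
            entry = self._store.get(key)
            if entry is not None and entry[0] == token:
                del self._store[key]
                return True
            return False  # expired, or held by someone else

def acquire(client, key, ttl_ms=5000, timeout=2.0):
    """Retry until timeout; returns the holder's token, or None."""
    token = uuid.uuid4().hex
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if client.set_nx_px(key, token, ttl_ms):
            return token
        time.sleep(0.01)
    return None

def release(client, key, token):
    # Only the matching token may delete the key
    return client.delete_if_equal(key, token)

client = FakeRedis()
t1 = acquire(client, "lock:report")
t2 = acquire(client, "lock:report", timeout=0.05)  # second caller times out
release(client, "lock:report", t1)
```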
### AioredisCache (Async)

Async Redis cache for high-performance async applications:

```python
from redis.asyncio import Redis
from custom_cache import AioredisCache

redis_client = Redis(host='localhost', port=6379, db=0)
backend = AioredisCache(
    redis_client,
    default_ttl=3600,
    value_prefix="myapp:",
    meta_prefix="myapp:meta",
)
```

Features:

- Non-blocking operations
- High concurrency
- Async/await support
- All Redis features
## Error Handling

```python
from custom_cache import cache
from redis.exceptions import RedisError

@cache(ttl=1800, backend=redis_backend)
def robust_function(x):
    # Cache failures won't break your app
    return expensive_computation(x)

try:
    result = robust_function(42)
except RedisError:
    # Redis is down, but the function itself still works
    result = expensive_computation(42)
```
## Django Integration

```python
# settings.py
import redis
from custom_cache import RedisCache

REDIS_CLIENT = redis.Redis(host='localhost', port=6379, db=0)
CACHE_BACKEND = RedisCache(REDIS_CLIENT, default_ttl=3600)
```

```python
# views.py
from django.conf import settings
from custom_cache import cache

@cache(backend=settings.CACHE_BACKEND, ttl=1800, tags=["articles"])
def get_article_list():
    return list(Article.objects.all().values())

# Invalidate on model changes
from django.db.models.signals import post_save
from django.dispatch import receiver
from custom_cache.utils import invalidate

@receiver(post_save, sender=Article)
def invalidate_articles(sender, **kwargs):
    invalidate(tags=["articles"], backend=settings.CACHE_BACKEND)
```
## Performance Tips
- Choose the right backend: InMemory for single-process, Redis for distributed
- Use appropriate TTL: Balance between freshness and performance
- Tag strategically: Group related data for efficient invalidation
- Enable singleflight: For expensive computations with high concurrency
- Monitor cache hit rates: Use backend statistics methods
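For the last tip: if a backend exposes no statistics of its own, hit rates can be tracked with a thin wrapper around `get()`. The helper below is hypothetical, not part of relaycache:

```python
class MiniBackend:
    """Tiny dict backend used only to demonstrate the wrapper."""
    def __init__(self):
        self._d = {}
    def get(self, key):
        return (key in self._d), self._d.get(key)
    def set(self, key, value):
        self._d[key] = value

class HitRateTracker:
    """Counts hits and misses on get(); delegates everything else."""
    def __init__(self, backend):
        self._backend = backend
        self.hits = 0
        self.misses = 0

    def get(self, key):
        hit, value = self._backend.get(key)
        if hit:
            self.hits += 1
        else:
            self.misses += 1
        return hit, value

    def __getattr__(self, name):
        # Pass through set(), delete(), etc. to the wrapped backend
        return getattr(self._backend, name)

    @property
    def hit_rate(self):
        total = self.hits + self.misses
        return self.hits / total if total else 0.0

b = HitRateTracker(MiniBackend())
b.set("a", 1)
b.get("a"); b.get("a"); b.get("missing")
```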
## API Reference

### `@cache` decorator

```python
@cache(
    ttl: float,                                    # Cache TTL in seconds
    key: Optional[Callable] = None,                # Custom key function
    namespace: Optional[str] = None,               # Key namespace
    backend: Optional[Backend] = None,             # Cache backend
    key_builder: Optional[KeyBuilder] = None,      # Custom key builder
    tags: Optional[Union[List, Callable]] = None,  # Cache tags
    distributed_singleflight: bool = False,        # Enable distributed locks
    dist_lock_ttl: float = 5.0,                    # Lock TTL
    dist_lock_timeout: float = 2.0,                # Lock acquisition timeout
)
```
### Backend Methods

All backends implement:

- `get(key) -> (hit: bool, value: Any)`
- `set(key, value, ttl, *, tags=None)`
- `delete(key)`
- `clear()`
- `invalidate_tags(tags)`

Async backends also provide:

- `aget(key)`, `aset(...)`, `adelete(key)`, `aclear()`, `ainvalidate_tags(...)`
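For custom backends, this interface can be expressed as a structural type. The `Protocol` below is a hypothetical sketch of the shape listed above, not something the library ships:

```python
from typing import Any, Iterable, Optional, Protocol, Tuple, runtime_checkable

@runtime_checkable
class Backend(Protocol):
    """Structural type matching the backend interface listed above."""
    def get(self, key: str) -> Tuple[bool, Any]: ...
    def set(self, key: str, value: Any, ttl: Optional[float] = None,
            *, tags: Optional[Iterable[str]] = None) -> None: ...
    def delete(self, key: str) -> None: ...
    def clear(self) -> None: ...
    def invalidate_tags(self, tags: Iterable[str]) -> None: ...

class DictBackend:
    """Toy conforming implementation (no TTL handling)."""
    def __init__(self):
        self._data, self._tags = {}, {}
    def get(self, key):
        return (key in self._data), self._data.get(key)
    def set(self, key, value, ttl=None, *, tags=None):
        self._data[key] = value
        for tag in tags or ():
            self._tags.setdefault(tag, set()).add(key)
    def delete(self, key):
        self._data.pop(key, None)
    def clear(self):
        self._data.clear()
        self._tags.clear()
    def invalidate_tags(self, tags):
        for tag in tags:
            for key in self._tags.pop(tag, ()):
                self._data.pop(key, None)

# runtime_checkable protocols verify method presence structurally
assert isinstance(DictBackend(), Backend)
```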
## Requirements
- Python 3.8+
- redis-py 4.0+
## License
MIT License. See LICENSE file for details.
## Contributing

- Fork the repository
- Create a feature branch
- Add tests for new functionality
- Run the test suite: `pytest`
- Submit a pull request
## Changelog

### 0.1.0
- Initial release
- Sync/async cache support
- Redis and in-memory backends
- Cache tagging
- Singleflight pattern
- Distributed locks