fastapi-cachekit
A high-performance, flexible caching solution for FastAPI applications. fastapi-cachekit supports both synchronous and asynchronous operations with a clean API and multiple backend options.
Features
- ✅ Full async/sync support for all operations
- ✅ Multiple backend support, so you can use the same tech stack as your app
- ✅ Function result caching with decorator syntax
- ✅ FastAPI dependency injection support
- ✅ Namespace support for isolating cache entries
- ✅ Customizable key generation
- ✅ Type hinting throughout the codebase
- ✅ Expiration time support (seconds or timedelta)
📦 Backends & Sync/Async Support
| Backend | Sync API | Async API | Install Extra |
|---|---|---|---|
| InMemoryBackend | ✅ | ✅ | built-in |
| RedisBackend | ✅ | ✅ | redis |
| PostgresBackend | ✅ | ✅ | postgres |
| MemcachedBackend | ✅ | ✅ | memcached |
| MongoDB | ✅ | ✅ | mongodb |
| FireStore | ✅ | ✅ | firestore |
| DynamoDBBackend | ✅ | ✅ | dynamodb |
🛠️ Installation
Base (in-memory only):

```shell
pip install fastapi-cachekit
```

With Redis:

```shell
pip install fastapi-cachekit[redis]
```

With Postgres:

```shell
pip install fastapi-cachekit[postgres]
```

With Memcached:

```shell
pip install fastapi-cachekit[memcached]
```

With MongoDB:

```shell
pip install fastapi-cachekit[mongodb]
```

With FireStore:

```shell
pip install fastapi-cachekit[firestore]
```

With DynamoDB:

```shell
pip install fastapi-cachekit[dynamodb]
```

All backends:

```shell
pip install fastapi-cachekit[all]
```
Quick Start
```python
from fastapi import FastAPI, Depends
from fast_cache import cache, RedisBackend
from typing import Annotated

app = FastAPI()

# Initialize the cache with a Redis backend
cache.init_app(
    app=app,
    backend=RedisBackend(redis_url="redis://localhost:6379/0", namespace="myapp"),
    default_expire=300,  # 5 minutes default expiration
)

# Use the function caching decorator
@app.get("/items/{item_id}")
@cache.cached(expire=60)  # Cache for 60 seconds
async def read_item(item_id: int):
    # Expensive operation simulation
    return {"item_id": item_id, "name": f"Item {item_id}"}

# Use the cache backend directly with dependency injection
@app.get("/manual-cache")
async def manual_cache_example(
    cache_backend: Annotated[RedisBackend, Depends(cache.get_cache)]
):
    # Check if the key exists
    has_key = await cache_backend.ahas("my-key")
    if not has_key:
        # Set a value in the cache
        await cache_backend.aset("my-key", {"data": "cached value"}, expire=30)
        return {"cache_set": True}
    # Get the value from the cache
    value = await cache_backend.aget("my-key")
    return {"cached_value": value}
```
You can now use the same cache instance from other routes and modules by importing it:

```python
from fast_cache import cache
```
Detailed Usage
Initializing the Cache
Before using the cache, you need to initialize it with a backend:
```python
from fastapi import FastAPI
from fast_cache import cache, RedisBackend
from datetime import timedelta

app = FastAPI()

cache.init_app(
    app=app,
    backend=RedisBackend(
        redis_url="redis://localhost:6379/0",
        namespace="myapp",
        max_connections=20,
    ),
    default_expire=timedelta(minutes=5),
)
```
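Both plain seconds and `timedelta` values are accepted for expiration. A backend will typically normalize them to whole seconds internally; the sketch below illustrates that conversion (a hypothetical helper, not part of the library's public API):

```python
from datetime import timedelta
from typing import Optional, Union

def normalize_expire(expire: Optional[Union[int, timedelta]]) -> Optional[int]:
    # Accept either raw seconds or a timedelta, returning whole seconds
    if isinstance(expire, timedelta):
        return int(expire.total_seconds())
    return expire
```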
Method 1: Caching a Function Result Using the Cache Decorator
The @cache.cached() decorator is the simplest way to cache function results:
```python
from fast_cache import cache

# Cache with the default expiration time
@cache.cached()
def get_user_data(user_id: int):
    # Expensive database query
    return {"user_id": user_id, "name": "John Doe"}

# Cache with a custom namespace and expiration
@cache.cached(namespace="users", expire=300)
async def get_user_profile(user_id: int):
    # Async expensive operation
    return {"user_id": user_id, "profile": "..."}

# Cache with a custom key builder
@cache.cached(key_builder=lambda user_id, **kwargs: f"user:{user_id}")
def get_user_permissions(user_id: int):
    # Complex permission calculation
    return ["read", "write"]
```
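When no `key_builder` is supplied, cache keys are typically derived from the function's name and arguments. A rough sketch of such a scheme follows; the library's actual key format may differ, and this helper is only illustrative:

```python
import hashlib

def build_default_key(func_name: str, namespace: str, *args, **kwargs) -> str:
    # Combine namespace, function name, and arguments into a stable string,
    # then hash it so keys stay short and safe for any backend
    raw = f"{namespace}:{func_name}:{args!r}:{sorted(kwargs.items())!r}"
    return f"{namespace}:{hashlib.sha256(raw.encode()).hexdigest()}"
```

A custom `key_builder` replaces this with a human-readable key such as `user:{user_id}`, which is easier to inspect and invalidate by hand.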
Skipping the Cache for Specific Calls
Sometimes you need to bypass the cache for certain requests:
```python
@cache.cached()
async def get_weather(city: str, skip_cache: bool = False):
    # The function is called directly when skip_cache is True
    return await fetch_weather_data(city)

# Usage:
weather = await get_weather("New York", skip_cache=True)  # Bypasses the cache
```
Method 2: Using the Cache via Dependency Injection
You can access the cache backend directly for more control:
```python
from fastapi import Depends
from fast_cache import cache, CacheBackend
from typing import Annotated

@app.get("/api/data")
async def get_data(cache_backend: Annotated[CacheBackend, Depends(cache.get_cache)]):
    # Try the cache first
    cached_data = await cache_backend.aget("api:data")
    if cached_data:
        return cached_data
    # Generate new data
    data = await fetch_expensive_api_data()
    # Store it in the cache for 1 hour
    await cache_backend.aset("api:data", data, expire=3600)
    return data
```
Advanced: Implementing Custom Backends
You can create your own cache backend by implementing the CacheBackend abstract class:
```python
from fast_cache.backends.backend import CacheBackend
from typing import Any, Optional, Union
from datetime import timedelta

class MyCustomBackend(CacheBackend):
    # Implement all required methods
    async def aget(self, key: str) -> Any:
        # Your implementation here
        ...

    def get(self, key: str) -> Any:
        # Your implementation here
        ...

    # ... implement all other required methods
```
API Reference
Cache Instance
- `cache.init_app(app, backend, default_expire=None)` - Initialize the cache with a FastAPI app
- `cache.get_cache()` - Get the cache backend instance (for dependency injection)
- `cache.cached(expire=None, key_builder=None, namespace=None)` - Caching decorator
CacheBackend Interface
All backends implement these methods in both sync and async versions:
- `get(key)` / `aget(key)` - Retrieve a value
- `set(key, value, expire)` / `aset(key, value, expire)` - Store a value
- `delete(key)` / `adelete(key)` - Delete a value
- `clear()` / `aclear()` - Clear all values
- `has(key)` / `ahas(key)` - Check if a key exists
RedisBackend Configuration
- `redis_url` - Redis connection string (required)
- `namespace` - Key prefix (default: `"fastapi-cache"`)
- `pool_size` - Minimum pool connections (default: 10)
- `max_connections` - Maximum pool connections (default: 20)
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
This project is licensed under the MIT License - see the LICENSE file for details.