Cache for FastAPI. An opinionated fork of the fastapi-cache library that uses msgspec for JSON encoding/decoding where possible.
# fastapi-cache

## Introduction

`fastapi-cache` is a tool to cache FastAPI endpoint and function results, with backends supporting Redis, Memcached, and Amazon DynamoDB.
## Features

- Supports `redis`, `memcache`, `dynamodb`, and `in-memory` backends.
- Easy integration with FastAPI.
- Support for HTTP cache headers like `ETag` and `Cache-Control`, as well as conditional `If-None-Match` requests.
## Requirements

- FastAPI
- `redis` when using `RedisBackend`
- `memcache` when using `MemcacheBackend`
- `aiobotocore` when using `DynamoBackend`
## Install

> pip install fastapi-cache2-fork

or

> pip install "fastapi-cache2-fork[redis]"

or

> pip install "fastapi-cache2-fork[memcache]"

or

> pip install "fastapi-cache2-fork[dynamodb]"
## Usage

### Quick Start

```python
from collections.abc import AsyncIterator
from contextlib import asynccontextmanager

from fastapi import FastAPI
from starlette.requests import Request
from starlette.responses import Response

from fastapi_cache import FastAPICache
from fastapi_cache.backends.redis import RedisBackend
from fastapi_cache.decorator import cache
from redis import asyncio as aioredis


@asynccontextmanager
async def lifespan(_: FastAPI) -> AsyncIterator[None]:
    redis = aioredis.from_url("redis://localhost")
    FastAPICache.init(RedisBackend(redis), prefix="fastapi-cache")
    yield


app = FastAPI(lifespan=lifespan)


@cache()
async def get_cache():
    return 1


@app.get("/")
@cache(expire=60)
async def index():
    return dict(hello="world")
```
### Initialization

First you must call `FastAPICache.init` during FastAPI startup; this is where you set the global configuration.
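For local development or tests you can initialize with the in-memory backend instead of Redis. A minimal sketch, assuming the `InMemoryBackend` import path shown below:

```python
from collections.abc import AsyncIterator
from contextlib import asynccontextmanager

from fastapi import FastAPI

from fastapi_cache import FastAPICache
from fastapi_cache.backends.inmemory import InMemoryBackend  # assumed import path


@asynccontextmanager
async def lifespan(_: FastAPI) -> AsyncIterator[None]:
    # No external cache service needed; entries live in process memory.
    FastAPICache.init(InMemoryBackend(), prefix="fastapi-cache")
    yield


app = FastAPI(lifespan=lifespan)
```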
### Use the `@cache` decorator

If you want to cache a FastAPI response transparently, you can use the `@cache` decorator between the router decorator and the view function.

| Parameter | Type | Default | Description |
|---|---|---|---|
| `expire` | `int` | | sets the caching time in seconds |
| `namespace` | `str` | `""` | namespace to use to store certain cache items |
| `coder` | `Coder` | `JsonCoder` | which coder to use, e.g. `JsonCoder` |
| `key_builder` | `KeyBuilder` callable | `default_key_builder` | which key builder to use |
| `injected_dependency_namespace` | `str` | `__fastapi_cache` | prefix for injected dependency keywords |
| `cache_status_header` | `str` | `X-FastAPI-Cache` | name of the response header indicating whether the request was served from cache; either `HIT` or `MISS` |
| `with_lock` | `bool` | `False` | whether to lock on cache get/set; may be useful to limit concurrent executions of heavy functions, so that the first call caches the result and subsequent calls return the cached version |
| `lock_timeout` | `int` | `60` | timeout used when the lock is enabled; the function will be executed after the timeout expires or when the lock is released |
| `bypass_cache_control` | `bool` | `False` | bypass `Cache-Control` headers from the origin; may be useful to enforce caching |
You can also use the `@cache` decorator on regular functions to cache their results.

Currently, locking is only available with the `inmemory` and `redis` backends (other backends simply ignore it).
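As a hedged sketch of the locking parameters from the table above, reusing the app from the Quick Start (the route and the `compute_expensive_report` helper are made up for illustration):

```python
from fastapi_cache.decorator import cache


@app.get("/heavy-report")
@cache(expire=300, namespace="reports", with_lock=True, lock_timeout=30)
async def heavy_report():
    # With the lock enabled, concurrent cache misses wait for the first call
    # to populate the cache instead of all recomputing the result.
    return await compute_expensive_report()  # hypothetical expensive function
```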
### Injected Request and Response dependencies

The `cache` decorator injects dependencies for the `Request` and `Response` objects, so that it can add cache control headers to the outgoing response and return a 304 Not Modified response when the incoming request has a matching `If-None-Match` header. This only happens if the decorated endpoint doesn't already list these dependencies.

The keyword arguments for these extra dependencies are named `__fastapi_cache_request` and `__fastapi_cache_response` to minimize collisions. Use the `injected_dependency_namespace` argument to `@cache` to change the prefix used if those names would clash anyway.
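For illustration, a sketch of an endpoint that already declares `Request` and `Response` itself, in which case the decorator uses these parameters rather than injecting its own (the route is made up):

```python
from starlette.requests import Request
from starlette.responses import Response

from fastapi_cache.decorator import cache


@app.get("/items")
@cache(expire=60)
async def list_items(request: Request, response: Response):
    # Request and Response are already listed here, so the decorator reuses
    # them instead of adding __fastapi_cache_request/__fastapi_cache_response.
    return {"path": request.url.path}
```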
### Supported data types

When using the (default) `JsonCoder`, the cache can store any data type that FastAPI can convert to JSON, including Pydantic models and dataclasses, provided that your endpoint has a correct return type annotation. An annotation is not needed if the return type is a standard JSON-supported Python type such as a dictionary or a list.

E.g. for an endpoint that returns a Pydantic model named `SomeModel`, the return annotation is used to ensure that the cached result is converted back to the correct class:
```python
from .models import SomeModel, create_some_model


@app.get("/foo")
@cache(expire=60)
async def foo() -> SomeModel:
    return create_some_model()
```
It is not sufficient to configure a response model in the route decorator; the cache needs to know what the method itself returns. If no return type annotation is given, the primitive JSON type is returned instead.

For broader type support, use the `fastapi_cache.coder.PickleCoder` or implement a custom coder (see below).
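A minimal sketch of switching to the pickle-based coder (the route and return value are made up for illustration):

```python
from fastapi_cache.coder import PickleCoder
from fastapi_cache.decorator import cache


@app.get("/complex")
@cache(expire=60, coder=PickleCoder)
async def complex_data():
    # PickleCoder round-trips arbitrary picklable Python objects,
    # so no return type annotation is needed for non-JSON types.
    return {"timestamps": {1, 2, 3}}  # a set is not JSON-serializable
```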
### Custom coder

By default the `JsonCoder` is used. You can write a custom coder to encode and decode cache results; it just needs to inherit from `fastapi_cache.coder.Coder`.
```python
from typing import Any

import orjson
from fastapi.encoders import jsonable_encoder

from fastapi_cache import Coder


class ORJsonCoder(Coder):
    @classmethod
    def encode(cls, value: Any) -> bytes:
        return orjson.dumps(
            value,
            default=jsonable_encoder,
            option=orjson.OPT_NON_STR_KEYS | orjson.OPT_SERIALIZE_NUMPY,
        )

    @classmethod
    def decode(cls, value: bytes) -> Any:
        return orjson.loads(value)


@app.get("/")
@cache(expire=60, coder=ORJsonCoder)
async def index():
    return dict(hello="world")
```
### Custom key builder

By default the `default_key_builder` builtin key builder is used; this creates a cache key from the function module and name, plus the positional and keyword arguments converted to their `repr()` representations, encoded as an MD5 hash.

You can provide your own by passing a key builder in to `@cache()`, or to `FastAPICache.init()` to apply it globally.

For example, if you wanted to use the request method, URL and query string as a cache key instead of the function identifier, you could use:
```python
def request_key_builder(
    func,
    namespace: str = "",
    *,
    request: Request = None,
    response: Response = None,
    **kwargs,
):
    # Build the key from the HTTP method, path and sorted query parameters.
    return ":".join([
        namespace,
        request.method.lower(),
        request.url.path,
        repr(sorted(request.query_params.items())),
    ])


@app.get("/")
@cache(expire=60, key_builder=request_key_builder)
async def index():
    return dict(hello="world")
```
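To apply the same key builder to every endpoint instead of per `@cache()` call, it can be passed to `FastAPICache.init()`. A sketch, reusing the Redis setup from the Quick Start:

```python
FastAPICache.init(
    RedisBackend(redis),
    prefix="fastapi-cache",
    key_builder=request_key_builder,  # used by any @cache() that doesn't override it
)
```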
## Backend notes

### InMemoryBackend

The `InMemoryBackend` stores cache data in memory and only deletes it when an expired key is accessed. This means that if you don't access a function after its data has been cached, the data will not be removed automatically.
### RedisBackend

When using the Redis backend, please make sure you pass in a redis client that does not decode responses (`decode_responses` must be `False`, which is the default). Cached data is stored as `bytes` (binary); decoding these in the Redis client would break caching.
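For example, a sketch of creating the client with response decoding explicitly disabled (this is already the default, shown here only for clarity):

```python
from redis import asyncio as aioredis

from fastapi_cache import FastAPICache
from fastapi_cache.backends.redis import RedisBackend

# decode_responses must stay False so cached values come back as bytes.
redis = aioredis.from_url("redis://localhost", decode_responses=False)
FastAPICache.init(RedisBackend(redis), prefix="fastapi-cache")
```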
## Tests and coverage

```shell
coverage run -m pytest
coverage html
xdg-open htmlcov/index.html
```
## License
This project is licensed under the Apache-2.0 License.