Persistent cache for aiohttp requests
See full documentation at https://aiohttp-client-cache.readthedocs.io
Not to be confused with aiohttp-cache, which is a cache for the aiohttp web server. This package is, as you might guess, specifically for the aiohttp client.
This is an early work in progress!
The current state is a mostly working drop-in replacement for aiohttp.ClientSession.
However, most cache operations are still synchronous, have had minimal testing, and likely have lots of bugs.
Breaking changes should be expected until a stable release.
Requires python 3.7+
Install the latest stable version with pip:
pip install aiohttp-client-cache
Note: You will need additional dependencies depending on which backend you want to use; see the Cache Backends section below for details. To install with extra dependencies for all supported backends:
pip install aiohttp-client-cache[backends]
To set up for local development:
$ git clone https://github.com/JWCook/aiohttp-client-cache
$ cd aiohttp-client-cache
$ pip install -Ue ".[dev]"
$ # Optional but recommended:
$ pre-commit install --config .github/pre-commit.yml
See the examples folder for more detailed usage examples.
Here is a simple example using an endpoint that takes 1 second to fetch. After the first request, subsequent requests to the same URL will return near-instantly; so, fetching it 10 times will only take ~1 second instead of 10.
from aiohttp_client_cache import CachedSession, SQLiteBackend

async with CachedSession(cache=SQLiteBackend()) as session:
    for i in range(10):
        await session.get('http://httpbin.org/delay/1')
aiohttp-client-cache can also be used as a mixin, if you happen to have other mixin classes that you want to combine with it:
from aiohttp import ClientSession
from aiohttp_client_cache import CacheMixin

class CustomSession(CacheMixin, CustomMixin, ClientSession):
    pass
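The mixin approach works because of Python's method resolution order: listing CacheMixin before ClientSession lets it intercept calls and delegate to the session via super(). Here is a minimal stdlib-only sketch of the pattern (these classes are illustrative stand-ins, not the library's actual implementation):

```python
class Base:
    """Stands in for aiohttp.ClientSession in this sketch."""
    def get(self, url):
        return f'fetched {url}'

class CachingMixin:
    """Checks a simple dict cache before delegating to the next
    class in the MRO via super(). Illustrative only."""
    def __init__(self):
        super().__init__()
        self.cache = {}

    def get(self, url):
        if url not in self.cache:
            self.cache[url] = super().get(url)
        return self.cache[url]

class Session(CachingMixin, Base):
    pass

session = Session()
session.get('http://example.com')  # fetched, then cached
session.get('http://example.com')  # served from the cache
```

Because CachingMixin appears first in the base-class list, its get() runs first; super() then reaches the underlying session class.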
Several backends are available. If one isn't specified, a simple in-memory cache will be used.
SQLiteBackend: Uses a SQLite database (requires aiosqlite)
DynamoDBBackend: Uses an Amazon DynamoDB database (requires boto3)
RedisBackend: Uses a Redis cache (requires redis-py)
MongoDBBackend: Uses a MongoDB database (requires pymongo)
You can also provide your own backend by subclassing CacheBackend.
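The exact interface to implement is described in the CacheBackend docs. As a rough illustration of the idea only (the method names below are hypothetical, not the real API), a minimal dict-based async backend might look like:

```python
import asyncio

class MemoryCache:
    """A toy async backend storing responses in a dict. Method names
    here are hypothetical, not the actual CacheBackend interface."""
    def __init__(self):
        self._responses = {}

    async def read(self, key):
        return self._responses.get(key)

    async def write(self, key, response):
        self._responses[key] = response

    async def delete(self, key):
        self._responses.pop(key, None)

async def demo():
    cache = MemoryCache()
    await cache.write('GET:http://example.com', b'cached body')
    return await cache.read('GET:http://example.com')

print(asyncio.run(demo()))  # b'cached body'
```

A real backend would persist entries (to SQLite, Redis, etc.) rather than an in-process dict, but the read/write/delete shape is the core of the idea.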
If you are using the expire_after parameter, expired responses are removed from storage the next time the same request is made. If you want to manually purge all expired items, you can use:

session = CachedSession(expire_after=3)   # Cached responses expire after 3 hours
await session.remove_expired_responses()  # Remove any responses over 3 hours old
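To make the expiration behavior concrete, here is a stdlib-only sketch of the kind of age check a cache performs when deciding whether a stored response is still valid (illustrative only, not the library's internals):

```python
from datetime import datetime, timedelta

def is_expired(stored_at, expire_after_hours):
    """True if a response stored at `stored_at` has outlived the
    expiration window. Illustrative, not the library's internals."""
    return datetime.utcnow() - stored_at > timedelta(hours=expire_after_hours)

# A response stored 4 hours ago, with a 3-hour window, is expired:
print(is_expired(datetime.utcnow() - timedelta(hours=4), 3))  # True
```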
Caching behavior can be customized by defining various conditions:
- Response status codes
- Request HTTP methods
- Request headers
- Specific request parameters
- Custom filter function
See CacheBackend docs for details.
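As a rough sketch of how such conditions can combine, a hypothetical filter function (not part of aiohttp-client-cache's API) deciding cacheability from the request method and response status might look like:

```python
def should_cache(method, status,
                 allowed_methods=('GET', 'HEAD'),
                 allowed_codes=(200,)):
    """Decide whether a response is cacheable based on the request's
    HTTP method and the response status code. Hypothetical helper,
    not part of aiohttp-client-cache's API."""
    return method in allowed_methods and status in allowed_codes

print(should_cache('GET', 200))   # True
print(should_cache('POST', 200))  # False
```

The real library applies conditions like these internally; consult the CacheBackend docs for the actual parameter names.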
- Initial PyPI release
- First pass at a general refactor and conversion from requests-cache
- Basic features are functional, but some backends do not actually operate asynchronously
See the requests-cache development history for details on prior changes.