cacheio
A flexible and user-friendly Python caching library that provides a unified interface for both synchronous and asynchronous caching, with support for various backends.
Overview 🚀
cacheio is designed to simplify caching in Python applications. It provides a simple, consistent API for interacting with different caching backends, whether your code is synchronous or asynchronous. The library intelligently loads dependencies based on your needs, so you only install what you use.
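The optional-backend loading described above is commonly implemented by probing for the dependency before importing it. As a hedged illustration of that pattern (a sketch of the general technique, not cacheio's actual internals):

```python
import importlib.util

def backend_available(module_name: str) -> bool:
    """Return True if an optional backend module is importable,
    without actually importing it."""
    return importlib.util.find_spec(module_name) is not None

# Probe for the two optional backends cacheio can use.
has_sync_backend = backend_available("cachelib")
has_async_backend = backend_available("aiocache")

print(f"cachelib available: {has_sync_backend}")
print(f"aiocache available: {has_async_backend}")
```

This is why installing only `cacheio[sync]` works: code paths that need `aiocache` are never imported unless you ask for an async adapter.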
Installation
You can install cacheio with pip. The library uses optional dependency groups to manage its backends.
Basic Installation
To install the core library without any caching backends, run:
pip install cacheio
Installing with Backends
To install the library with specific backends, use the optional dependency groups:
- Synchronous Caching: Use the `sync` group for `cachelib`-based backends:
  pip install "cacheio[sync]"
- Asynchronous Caching: Use the `async` group for `aiocache`-based backends:
  pip install "cacheio[async]"
- Full Installation: Use the `full` group to install both synchronous and asynchronous backends:
  pip install "cacheio[full]"
Quick Start
Synchronous Caching
Use CacheFactory to get a synchronous cache adapter. If cachelib is installed, this will provide an Adapter instance.
from cacheio import CacheFactory
# Get a simple in-memory cache adapter
my_cache = CacheFactory.memory_cache()
# Use the cache
my_cache.set("my_key", "my_value", ttl=300)
value = my_cache.get("my_key")
print(f"Retrieved value: {value}")
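The set/get calls above follow the classic cache-aside pattern. To make the flow concrete without depending on a backend, here is a minimal sketch using a plain dict-based stand-in for the adapter (`DictCache` is hypothetical, and returning `None` on a miss is an assumption; check the adapter's API for its actual miss behavior):

```python
import time

class DictCache:
    """A minimal stand-in mimicking the set/get interface shown above."""
    def __init__(self):
        self._store = {}  # key -> (value, expiry timestamp or None)

    def set(self, key, value, ttl=None):
        expiry = time.monotonic() + ttl if ttl is not None else None
        self._store[key] = (value, expiry)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expiry = entry
        if expiry is not None and time.monotonic() > expiry:
            del self._store[key]  # lazily evict the expired entry
            return None
        return value

cache = DictCache()

def get_user_name(user_id):
    key = f"user_name:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return cached                    # cache hit
    value = f"User_{user_id}"            # stand-in for an expensive lookup
    cache.set(key, value, ttl=300)
    return value

print(get_user_name(1))  # computed, then stored
print(get_user_name(1))  # served from the cache
```

The `cached` decorator shown later automates exactly this check-call-store dance so you don't have to write it by hand.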
Asynchronous Caching
Use CacheFactory to get an asynchronous cache adapter. If aiocache is installed, this will provide an AsyncAdapter instance.
import asyncio
from cacheio import CacheFactory
async def main():
# Get an asynchronous cache adapter
my_async_cache = CacheFactory.async_memory_cache()
# Use the cache asynchronously
await my_async_cache.set("my_async_key", "my_async_value", ttl=300)
async_value = await my_async_cache.get("my_async_key")
print(f"Retrieved async value: {async_value}")
if __name__ == "__main__":
# asyncio.run() creates an event loop and runs main() to completion
asyncio.run(main())
Usage Examples
1. Synchronous Caching Example
This example demonstrates how to use the cached decorator for a synchronous method. We define a class that inherits from Cacheable, which automatically sets up a cachelib-based in-memory cache.
- Key Function (`key_fn`): The `key_fn` is a crucial part of the decorator. For simple key generation, a `lambda` is a clean and efficient way to define it inline.
- Decorator: The `@cached` decorator handles the rest: it checks the cache for the key, calls the `fetch_user` method if the key isn't found, and stores the result.
import time
from cacheio import cached
from cacheio.mixins import Cacheable
# Define the class that uses caching.
# It inherits from `Cacheable` to get a default in-memory cache.
class UserService(Cacheable):
# The cached decorator uses a lambda to generate a unique cache key.
@cached(key_fn=lambda self, user_id, **kwargs: f"user:{user_id}", ttl=60)
def fetch_user(self, user_id: int, request_id: str) -> dict:
"""Simulates a slow, expensive database call."""
print(f"Fetching user {user_id} from database...")
time.sleep(2) # Simulate a 2-second network delay
return {"id": user_id, "name": f"User_{user_id}", "request": request_id}
# --- Usage ---
user_service = UserService()
# First call: The method runs and its result is cached.
print("First call:")
user_1 = user_service.fetch_user(user_id=1, request_id="req-1")
print(f"Result: {user_1}\n")
# Second call (with same arguments): The cached result is returned instantly.
print("Second call (should be instant):")
user_2 = user_service.fetch_user(user_id=1, request_id="req-1")
print(f"Result: {user_2}\n")
# Third call (with different arguments): The cached result is still returned because the key only depends on `user_id`.
print("Third call (with different request_id, should still be instant):")
user_3 = user_service.fetch_user(user_id=1, request_id="req-2")
print(f"Result: {user_3}")
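Under the hood, a decorator like `@cached` follows the check-call-store flow described above. Here is a simplified, self-contained sketch of that mechanism (a generic illustration, not cacheio's actual implementation; `simple_cached` is a hypothetical name):

```python
import functools

def simple_cached(key_fn, cache):
    """A minimal caching decorator: check the cache, call on a miss,
    store the result for next time."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            key = key_fn(*args, **kwargs)
            if key in cache:              # cache hit: skip the call entirely
                return cache[key]
            result = fn(*args, **kwargs)  # cache miss: run the real function
            cache[key] = result
            return result
        return wrapper
    return decorator

calls = []  # track how many times the underlying function actually runs
store = {}

@simple_cached(key_fn=lambda user_id: f"user:{user_id}", cache=store)
def fetch_user(user_id):
    calls.append(user_id)
    return {"id": user_id}

fetch_user(1)
fetch_user(1)          # second call is served from the cache
print(len(calls))      # the underlying function ran only once
```

Note that, as in the example above, the key function sees the same arguments as the wrapped function, so any argument it ignores (like `request_id`) has no effect on the cache key.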
2. Asynchronous Caching Example
This example mirrors the synchronous one but uses the async_cached decorator and a class that inherits from AsyncCacheable, which automatically sets up an aiocache-based in-memory cache. The core logic remains the same, but the functions and decorators are all async.
- Key Function (`key_fn`): The key generation logic is again a concise `lambda` function.
- Decorator: The `@async_cached` decorator works just like its synchronous counterpart, but it's designed for awaitable functions and asynchronous cache adapters.
import asyncio
from cacheio import async_cached
from cacheio.mixins import AsyncCacheable
# Define the class that uses asynchronous caching.
# It inherits from `AsyncCacheable` for a default in-memory async cache.
class AsyncUserService(AsyncCacheable):
# The async_cached decorator uses a lambda to generate a unique cache key.
@async_cached(key_fn=lambda self, user_id, **kwargs: f"user:{user_id}", ttl=60)
async def fetch_user(self, user_id: int, request_id: str) -> dict:
"""Simulates a slow, expensive asynchronous database call."""
print(f"Fetching user {user_id} from database asynchronously...")
await asyncio.sleep(2) # Simulate a 2-second async delay
return {"id": user_id, "name": f"User_{user_id}", "request": request_id}
# --- Usage ---
async def main():
user_service = AsyncUserService()
# First call: The method runs and its result is cached.
print("First call:")
user_1 = await user_service.fetch_user(user_id=1, request_id="req-1")
print(f"Result: {user_1}\n")
# Second call (with same arguments): The cached result is returned instantly.
print("Second call (should be instant):")
user_2 = await user_service.fetch_user(user_id=1, request_id="req-1")
print(f"Result: {user_2}\n")
# Third call (with different arguments): The cached result is still returned.
print("Third call (with different request_id, should still be instant):")
user_3 = await user_service.fetch_user(user_id=1, request_id="req-2")
print(f"Result: {user_3}")
if __name__ == "__main__":
asyncio.run(main())
Contributing
We welcome contributions! Please feel free to open an issue or submit a pull request on our GitHub repository.
License
cacheio is distributed under the terms of the MIT license. See the LICENSE file for details.