cacheio
A flexible and user-friendly Python caching interface that provides a unified API for both synchronous and asynchronous caching by wrapping two well-established libraries: cachelib for synchronous caching and aiocache for asynchronous caching.
cacheio simplifies caching in Python applications by providing a consistent API for both sync and async use cases — no need to learn two different interfaces or manage separate dependencies manually. It intelligently loads only the backend dependencies you need.
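The "loads only the backend dependencies you need" behavior is typically built on a lazy-import guard. Here is a minimal, self-contained sketch of that general pattern; the `load_backend` helper and its error message are illustrative, not cacheio's actual internals (the stdlib `json` module stands in for a real backend so the snippet runs anywhere):

```python
import importlib

def load_backend(module_name: str, extra: str):
    """Import an optional backend module, with a helpful error if it is missing."""
    try:
        return importlib.import_module(module_name)
    except ImportError as exc:
        raise ImportError(
            f"{module_name!r} is not installed; "
            f'run pip install "cacheio[{extra}]" to enable it.'
        ) from exc

# The dependency is only resolved when a backend is actually requested
backend = load_backend("json", "sync")  # stdlib module used here for demo
print(backend.dumps({"ok": True}))
```

Because the import happens inside the helper, installing the core package alone never fails at import time; you only pay for a backend when you ask for it.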
Overview 🚀
cacheio is not just another caching library. It is a unified abstraction layer that seamlessly bridges the gap between synchronous and asynchronous caching in Python by integrating:

- cachelib for reliable, performant synchronous caching backends, including in-memory caches.
- aiocache for flexible, feature-rich asynchronous caching, with support for multiple backends such as Redis and Memcached.
This design means you can switch between sync and async caching or use both in the same codebase with a shared, consistent API.
Installation
Install cacheio via pip. The library uses optional dependency groups to help you install only what you need.
Basic Installation
Install the core package without any caching backends:
pip install cacheio
Installing with Backends
Choose the optional group(s) based on your use case:
- Synchronous Caching: Use the sync extra to install cachelib and synchronous cache support.

  ```shell
  pip install "cacheio[sync]"
  ```

- Asynchronous Caching: Use the async extra to install aiocache and asynchronous cache support.

  ```shell
  pip install "cacheio[async]"
  ```

- Full Installation: Install both synchronous and asynchronous backends together.

  ```shell
  pip install "cacheio[full]"
  ```
Configuration
cacheio exposes a global configuration object, allowing you to customize default settings like the time-to-live (TTL) for cached entries and cache size thresholds.
You can modify the configuration either by importing config directly, or by using the configure helper, which accepts a callable that mutates the global config.
Using the Global config Object
You can read or update configuration parameters directly:
```python
from cacheio import config

print(config.default_ttl)  # Default TTL in seconds (e.g., 300)

# Update the default TTL to 600 seconds (10 minutes)
config.default_ttl = 600
```
Using the configure Helper
Use the configure function to update configuration settings in a safe and explicit manner:
```python
from cacheio import configure

def update_config(cfg):
    cfg.default_ttl = 600
    cfg.default_threshold = 1000

configure(update_config)
```
This pattern can be handy for centralized configuration setup in your application.
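The configure helper follows a common mutate-the-global-config pattern. Here is a minimal, self-contained sketch of that pattern in plain Python; the Config class and field defaults below are illustrative, not cacheio's actual implementation:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Config:
    default_ttl: int = 300        # seconds
    default_threshold: int = 500  # max entries before eviction

config = Config()

def configure(fn: Callable[[Config], None]) -> None:
    """Apply a caller-supplied mutation to the shared config object."""
    fn(config)

# One-liners work too, via a lambda
configure(lambda cfg: setattr(cfg, "default_ttl", 600))
print(config.default_ttl)  # 600
```

Routing all changes through a single function gives you one obvious place to add validation or logging later, which is the main advantage over scattering direct attribute writes across the codebase.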
Quick Start
Synchronous Caching
Use CacheFactory.memory_cache() to get a synchronous cache adapter backed by cachelib.SimpleCache. The adapter respects your configured TTL and cache size threshold by default.
```python
from cacheio import CacheFactory

# Create a synchronous in-memory cache adapter
my_cache = CacheFactory.memory_cache()

# Set a value with a TTL of 300 seconds (or your configured default)
my_cache.set("my_key", "my_value", ttl=300)

# Retrieve the cached value
value = my_cache.get("my_key")
print(f"Retrieved value: {value}")
```
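TTL-based expiry like the above can be pictured with a tiny dict-plus-timestamp cache. The sketch below shows the general technique (lazy eviction on read); it is a self-contained illustration, not cacheio's or cachelib's internals:

```python
import time

class TTLCache:
    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def set(self, key, value, ttl):
        self._store[key] = (value, time.monotonic() + ttl)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            # Entry has expired; evict lazily on read
            del self._store[key]
            return default
        return value

cache = TTLCache()
cache.set("my_key", "my_value", ttl=0.05)
print(cache.get("my_key"))   # my_value
time.sleep(0.06)
print(cache.get("my_key"))   # None (expired)
```

The point of the sketch: a TTL does not schedule deletion, it records a deadline, and stale entries are simply treated as misses the next time they are read.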
Asynchronous Caching
Use CacheFactory.async_memory_cache() to get an asynchronous cache adapter backed by aiocache.Cache, honoring the TTL from the config.
```python
import asyncio

from cacheio import CacheFactory

async def main():
    # Create an asynchronous in-memory cache adapter
    my_async_cache = CacheFactory.async_memory_cache()

    # Set a value with a TTL of 300 seconds asynchronously
    await my_async_cache.set("my_async_key", "my_async_value", ttl=300)

    # Retrieve the cached value asynchronously
    async_value = await my_async_cache.get("my_async_key")
    print(f"Retrieved async value: {async_value}")

if __name__ == "__main__":
    asyncio.run(main())
```
Usage Examples
1. Synchronous Caching with the cached Decorator
cacheio provides decorators like @cached to simplify memoization of synchronous methods. In this example, we define a class inheriting from Cacheable that sets up a default cachelib-based in-memory cache.
```python
import time

from cacheio import cached
from cacheio.mixins import Cacheable

class UserService(Cacheable):
    @cached(key_fn=lambda self, user_id, **kwargs: f"user:{user_id}", ttl=60)
    def fetch_user(self, user_id: int, request_id: str) -> dict:
        print(f"Fetching user {user_id} from database...")
        time.sleep(2)  # Simulate delay
        return {"id": user_id, "name": f"User_{user_id}", "request": request_id}

user_service = UserService()

print("First call:")
user_1 = user_service.fetch_user(user_id=1, request_id="req-1")
print(f"Result: {user_1}\n")

print("Second call (cached):")
user_2 = user_service.fetch_user(user_id=1, request_id="req-1")
print(f"Result: {user_2}\n")

print("Third call (different request_id, still cached):")
user_3 = user_service.fetch_user(user_id=1, request_id="req-2")
print(f"Result: {user_3}")
```
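Note that because key_fn derives the cache key from user_id alone, a changed request_id never busts the cache. That behavior is ordinary key-based memoization; a minimal sketch of the decorator pattern in plain Python (illustrative, not cacheio's implementation):

```python
import functools

def cached_by_key(key_fn):
    """Memoize a method using a caller-supplied cache-key function."""
    def decorator(fn):
        store = {}

        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            key = key_fn(*args, **kwargs)
            if key not in store:
                store[key] = fn(*args, **kwargs)
            return store[key]

        return wrapper
    return decorator

calls = []

class Service:
    @cached_by_key(lambda self, user_id, **kwargs: f"user:{user_id}")
    def fetch(self, user_id, request_id):
        calls.append(user_id)  # track real (non-cached) invocations
        return {"id": user_id, "request": request_id}

svc = Service()
a = svc.fetch(user_id=1, request_id="req-1")
b = svc.fetch(user_id=1, request_id="req-2")  # same key, served from cache
print(a == b, len(calls))  # True 1
```

Choosing what goes into the key is the whole design decision here: anything left out of key_fn (like request_id above) is invisible to the cache, which is exactly what you want for per-request noise and exactly what you do not want for parameters that change the result.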
2. Asynchronous Caching with the async_cached Decorator
Similarly, @async_cached works for async methods and uses the AsyncCacheable mixin, which sets up an aiocache in-memory backend.
```python
import asyncio

from cacheio import async_cached
from cacheio.mixins import AsyncCacheable

class AsyncUserService(AsyncCacheable):
    @async_cached(key_fn=lambda self, user_id, **kwargs: f"user:{user_id}", ttl=60)
    async def fetch_user(self, user_id: int, request_id: str) -> dict:
        print(f"Fetching user {user_id} asynchronously...")
        await asyncio.sleep(2)  # Simulate async delay
        return {"id": user_id, "name": f"User_{user_id}", "request": request_id}

async def main():
    user_service = AsyncUserService()

    print("First call:")
    user_1 = await user_service.fetch_user(user_id=1, request_id="req-1")
    print(f"Result: {user_1}\n")

    print("Second call (cached):")
    user_2 = await user_service.fetch_user(user_id=1, request_id="req-1")
    print(f"Result: {user_2}\n")

    print("Third call (different request_id, still cached):")
    user_3 = await user_service.fetch_user(user_id=1, request_id="req-2")
    print(f"Result: {user_3}")

if __name__ == "__main__":
    asyncio.run(main())
```
Why cacheio?
- Unified API for sync and async caching.
- Seamless backend integration with two proven libraries: cachelib and aiocache.
- Minimal dependencies: install only what you need.
- Simple decorators and mixins to add caching effortlessly.
- Configurable defaults with a global config object and helper.
- Flexible TTL and backend options via factory methods.
Contributing
Contributions are welcome! Open an issue or submit a pull request on the GitHub repository.
License
cacheio is licensed under the MIT License. See LICENSE for details.