Cacherator
Cacherator is a Python package that provides persistent JSON-based caching for class state and function results. It enables significant performance improvements by caching expensive computations and preserving object state between program executions.
Installation
You can install Cacherator using pip:

```shell
pip install cacherator
```
Features
- Persistent caching of function results
- Customizable Time-To-Live (TTL) for cached data
- Option to clear cache on demand
- JSON-based storage for easy inspection and portability
- Automatic serialization and deserialization of cached data
- Support for instance methods and properties
Core Components
1. JSONCache (Base Class)
The foundation class that enables persistent caching of object state.
```python
from cacherator import JSONCache

class MyClass(JSONCache):
    def __init__(self, data_id=None):
        super().__init__(data_id=data_id)
        # Your initialization code here
```
Constructor Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| `data_id` | `str` | Class name | Unique identifier for the cache file |
| `directory` | `str` | `"json/data"` | Directory for storing cache files |
| `clear_cache` | `bool` | `False` | Whether to clear the existing cache on initialization |
| `ttl` | `timedelta \| int \| float` | `999` (days) | Default time-to-live for cached items |
| `logging` | `bool` | `True` | Whether to enable logging of cache operations |
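Since `ttl` accepts either a `timedelta` or a plain number of days, its normalization presumably looks something like the following sketch (the function name is hypothetical, not part of the cacherator API):

```python
from datetime import timedelta

def normalize_ttl(ttl):
    # A timedelta passes through unchanged; a bare int/float is read as
    # a number of days, matching the table above (e.g. 0.25 = six hours)
    if isinstance(ttl, timedelta):
        return ttl
    return timedelta(days=ttl)
```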
Key Methods
- `json_cache_save()`: Manually save the current state to the cache file
2. Cached Decorator
Decorator for caching results of instance methods.
```python
from cacherator import JSONCache, Cached

class MyClass(JSONCache):
    @Cached(ttl=30, clear_cache=False)
    def expensive_calculation(self, param1, param2):
        # Expensive computation here
        return result
```
Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| `ttl` | `timedelta \| int \| float` | `None` (uses the class `ttl`) | Time-to-live for cached results |
| `clear_cache` | `bool` | `False` | Whether to clear the existing cache for this function |
Usage Patterns
Basic Usage
```python
import time

from cacherator import JSONCache, Cached

class DataProcessor(JSONCache):
    def __init__(self, dataset_id):
        super().__init__(data_id=f"processor_{dataset_id}")
        self.dataset_id = dataset_id

    @Cached()
    def process_data(self, threshold=0.5):
        print("Processing data (expensive operation)...")
        time.sleep(2)  # Simulate expensive computation
        return [i for i in range(10) if i / 10 > threshold]

# First run - will execute and cache
processor = DataProcessor("dataset1")
result1 = processor.process_data(0.3)  # Executes the function

# Second run - will use the cache
processor2 = DataProcessor("dataset1")
result2 = processor2.process_data(0.3)  # Returns the cached result

# Different arguments - new cache entry
result3 = processor2.process_data(0.7)  # Executes the function
```
Cache Clearing
```python
# Clear a specific function's cache
processor = DataProcessor("dataset1")
result = processor.process_data(0.3, clear_cache=True)  # Force recomputation

# Clear the entire cache for an object
processor = DataProcessor("dataset1", clear_cache=True)
```
Custom TTL
```python
from datetime import timedelta

from cacherator import JSONCache, Cached

class WeatherService(JSONCache):
    def __init__(self, location):
        # Cache weather data for 1 day by default
        super().__init__(data_id=f"weather_{location}", ttl=1)
        self.location = location

    # Cache the forecast for only 6 hours
    @Cached(ttl=0.25)  # 0.25 days = 6 hours
    def get_forecast(self):
        # API call to the weather service
        pass

    # Cache historical data for 30 days
    @Cached(ttl=30)
    def get_historical_data(self, start_date, end_date):
        # API call to the weather service
        pass
```
State Persistence
```python
from cacherator import JSONCache

class GameState(JSONCache):
    def __init__(self, game_id):
        super().__init__(data_id=f"game_{game_id}")
        # Default values for new games
        if not hasattr(self, "score"):
            self.score = 0
        if not hasattr(self, "level"):
            self.level = 1

    def increase_score(self, points):
        self.score += points
        self.json_cache_save()  # Explicitly save state

    def level_up(self):
        self.level += 1
        # No explicit save needed; state is saved on garbage collection
```
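The load-on-init, save-on-demand pattern above can be sketched with the standard library alone. The class below mimics the described behavior; the name and layout are illustrative, not cacherator's actual implementation:

```python
import json
import os
import tempfile

class PersistentState:
    """Minimal sketch of JSON-backed state persistence (hypothetical
    class, not the real JSONCache)."""

    def __init__(self, path):
        self._path = path
        # Restore any previously saved state before applying defaults
        if os.path.exists(path):
            with open(path) as f:
                self.__dict__.update(json.load(f))
        if not hasattr(self, "score"):
            self.score = 0

    def save(self):
        # Private attributes (leading underscore) are not persisted
        state = {k: v for k, v in vars(self).items() if not k.startswith("_")}
        with open(self._path, "w") as f:
            json.dump(state, f)

# Round trip: state written by one instance is restored by the next
path = os.path.join(tempfile.mkdtemp(), "game.json")
first = PersistentState(path)
first.score = 42
first.save()
second = PersistentState(path)
```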
Custom Directory
```python
import os

from cacherator import JSONCache

class UserProfile(JSONCache):
    def __init__(self, user_id):
        cache_dir = os.path.join("data", "users", user_id[:2])
        super().__init__(
            data_id=user_id,
            directory=cache_dir,
        )
```
Excluding Variables from Cache
```python
from cacherator import JSONCache

class AnalysisEngine(JSONCache):
    def __init__(self, project_id):
        self._excluded_cache_vars = ["temp_data", "sensitive_info"]
        super().__init__(data_id=project_id)
        self.project_id = project_id
        self.results = {}
        self.temp_data = []  # Will not be cached due to exclusion
        self.sensitive_info = {}  # Will not be cached due to exclusion
```
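A sketch of how such an exclusion list might be honored during serialization (the helper name is hypothetical; the real filtering logic lives inside cacherator):

```python
def cacheable_state(obj):
    # Collect the attributes that would be written to the cache file,
    # skipping anything listed in _excluded_cache_vars and private names
    excluded = set(getattr(obj, "_excluded_cache_vars", []))
    return {k: v for k, v in vars(obj).items()
            if k not in excluded and not k.startswith("_")}

class Engine:
    def __init__(self):
        self._excluded_cache_vars = ["temp_data"]
        self.results = {"r": 1}
        self.temp_data = [1, 2, 3]

state = cacheable_state(Engine())
```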
Best Practices
When to Use Cacherator
- DO use for expensive computations that are called repeatedly with the same parameters
- DO use for preserving application state between runs
- DO use for reducing API calls or database queries
- DO use when results can be serialized to JSON
When Not to Use Cacherator
- DON'T use for functions with non-deterministic results (e.g., random generators)
- DON'T use for time-sensitive operations where fresh data is critical
- DON'T use for functions with non-serializable results
- DON'T use for very simple or fast operations where caching overhead exceeds benefits
Performance Considerations
- Set appropriate TTL values based on data freshness requirements
- Be aware of disk I/O overhead for frequent cache saves
- Consider excluding large or frequently changing attributes with `_excluded_cache_vars`
- Use dedicated cache directories for better organization and performance
Error Handling
Cacherator gracefully handles common errors:
- Missing cache files (creates new cache)
- Permission errors (logs error and continues)
- JSON parsing errors (logs error and continues)
- Non-serializable objects (excludes from cache)
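A load path consistent with that behavior can be sketched like this (a stdlib illustration, not cacherator's actual code):

```python
import json

def load_cache_file(path):
    # Fall back to an empty cache on the failure modes listed above:
    # a missing file, unreadable permissions, or corrupt JSON
    try:
        with open(path) as f:
            return json.load(f)
    except (FileNotFoundError, PermissionError, json.JSONDecodeError):
        return {}
```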
Common Issues and Solutions
Issue: Cache Not Being Saved
Possible causes:
- Object is not being garbage collected
- Errors during serialization
Solutions:
- Explicitly call `json_cache_save()` at key points
- Check for non-serializable attributes and exclude them with `_excluded_cache_vars`
Issue: Cache Not Being Used
Possible causes:
- Function arguments differ slightly (e.g., floats vs integers)
- TTL has expired
- `clear_cache=True` is being used
Solutions:
- Standardize argument types before passing to cached functions
- Increase TTL if appropriate
- Remove the `clear_cache=True` argument, or pass it only conditionally
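One way to see why `1` and `1.0` can miss each other's cache entries: if the key is derived from the arguments' textual representation (an assumption for illustration; cacherator's real key scheme may differ), the two differ, and coercing arguments first makes them collide again:

```python
def cache_key(*args, **kwargs):
    # Hypothetical key derivation: distinct reprs -> distinct cache entries
    return repr((args, tuple(sorted(kwargs.items()))))

# An int and a float produce different keys...
assert cache_key(1) != cache_key(1.0)
# ...so standardizing the type before the call restores cache hits
assert cache_key(float(1)) == cache_key(1.0)
```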
Issue: Large Cache Files
Possible causes:
- Caching large data structures
- Many function calls with different parameters
Solutions:
- Use `_excluded_cache_vars` for large attributes
- Create separate cache instances for different data sets
Security Considerations
- Sensitive Data: Avoid caching sensitive information like passwords or API keys
  - Either exclude them with `_excluded_cache_vars`
  - Or encrypt them before storing
- File Permissions: Cache files are stored as regular files
  - Ensure proper file permissions on cache directories
  - Consider using more secure storage for sensitive applications
- TTL for Sensitive Operations: Use shorter TTLs for operations with security implications, such as:
  - Authentication tokens
  - User permissions
  - Security settings
Compatibility Notes
Cacherator is compatible with:
- Python 3.7+
- All major operating systems (Windows, macOS, Linux)
- Common serializable Python data types (dict, list, str, int, float, bool, etc.)
- datetime objects (via DateTimeEncoder)
- Most standard library classes that are JSON-serializable
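The `DateTimeEncoder` mentioned above presumably extends `json.JSONEncoder` along these lines (a sketch; the shipped class may differ in detail):

```python
import json
from datetime import datetime

class DateTimeEncoder(json.JSONEncoder):
    # Serialize datetimes as ISO-8601 strings; defer everything else
    # to the base encoder
    def default(self, o):
        if isinstance(o, datetime):
            return o.isoformat()
        return super().default(o)

encoded = json.dumps({"when": datetime(2024, 1, 2, 3, 4, 5)},
                     cls=DateTimeEncoder)
```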
License
This project is licensed under the MIT License.
Dependencies
- python-slugify
- logorator