Cacherator

Persistent JSON caching for Python with async support - Cache function results and object state effortlessly.

Python 3.7+ · License: MIT

Overview

Cacherator is a Python library that provides persistent JSON-based caching for class state and function results. It enables developers to cache expensive operations with minimal configuration, supporting both synchronous and asynchronous functions.

Key Features

  • Zero-configuration caching - Simple inheritance and decorator pattern
  • Async/await support - Native support for asynchronous functions
  • Persistent storage - Cache survives program restarts
  • TTL (Time-To-Live) - Automatic cache expiration
  • Selective caching - Fine-grained control over what gets cached
  • Cache management - Built-in methods for inspection and clearing
  • Flexible logging - Global and per-instance control

Installation

pip install cacherator

Quick Start

Basic Function Caching

from cacherator import JSONCache, Cached
import time

class Calculator(JSONCache):
    def __init__(self):
        super().__init__(data_id="calc")
    
    @Cached()
    def expensive_calculation(self, x, y):
        time.sleep(2)  # Simulate expensive operation
        return x ** y

calc = Calculator()
result = calc.expensive_calculation(2, 10)  # Takes 2 seconds
result = calc.expensive_calculation(2, 10)  # Instant!

Async Function Caching

class APIClient(JSONCache):
    @Cached(ttl=1)  # Cache for 1 day
    async def fetch_user(self, user_id):
        # Expensive API call
        response = await api.get(f"/users/{user_id}")
        return response.json()

client = APIClient()
user = await client.fetch_user(123)  # API call
user = await client.fetch_user(123)  # Cached!

State Persistence

class GameState(JSONCache):
    def __init__(self, game_id):
        super().__init__(data_id=f"game_{game_id}")
        if not hasattr(self, "score"):
            self.score = 0
            self.level = 1
    
    def add_points(self, points):
        self.score += points
        self.json_cache_save()

# Session 1
game = GameState("player1")
game.add_points(100)

# Session 2 (after restart)
game = GameState("player1")
print(game.score)  # 100 - persisted!

Advanced Usage

Custom TTL Configuration

class WeatherService(JSONCache):
    @Cached(ttl=0.25)  # 6 hours (0.25 days)
    def get_forecast(self, city):
        return fetch_weather(city)
    
    @Cached(ttl=30)  # 30 days
    def get_historical(self, city, year):
        return fetch_historical(city, year)

Excluding Variables from Cache

class DataProcessor(JSONCache):
    def __init__(self, **kwargs):
        # Declare exclusions before super().__init__() so they take effect;
        # forward kwargs (data_id, logging, ...) to JSONCache
        self._excluded_cache_vars = ["temp_data", "api_key"]
        super().__init__(**kwargs)
        self.results = {}
        self.temp_data = []  # Won't be cached
        self.api_key = "secret"  # Won't be cached

Cache Management

processor = DataProcessor()

# Get cache statistics
stats = processor.json_cache_stats()
print(stats)
# {'total_entries': 5, 'functions': {'process': 3, 'analyze': 2}}

# Clear specific function cache
processor.json_cache_clear("process")

# Clear all cache
processor.json_cache_clear()

Logging Control

# Disable logging globally
from cacherator import JSONCache
JSONCache.set_logging(False)

# Or per instance
processor = DataProcessor(logging=False)

Configuration

JSONCache Constructor

JSONCache(
    data_id="unique_id",      # Unique identifier (default: class name)
    directory="cache",         # Cache directory (default: "data/cache")
    clear_cache=False,         # Clear existing cache on init
    ttl=999,                   # Default TTL in days
    logging=True               # Enable logging
)
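As an example, a subclass that keeps its cache under a project-specific directory with a one-week default TTL, using only the parameters listed above (`ReportBuilder` is a hypothetical class):

```python
from cacherator import JSONCache

class ReportBuilder(JSONCache):
    def __init__(self):
        super().__init__(
            data_id="reports",          # names the cache for this instance
            directory="reports/cache",  # overrides the default "data/cache"
            ttl=7,                      # default TTL for cached values, in days
            logging=False,              # silence cache log messages
        )
```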

@Cached Decorator

@Cached(
    ttl=7,                     # Time-to-live in days (default: class ttl)
    clear_cache=False          # Clear cache for this function
)

Use Cases

API Client with Caching

class GitHubClient(JSONCache):
    def __init__(self):
        super().__init__(data_id="github_client", ttl=1)
    
    @Cached(ttl=0.5)  # 12 hours
    async def get_user(self, username):
        async with aiohttp.ClientSession() as session:
            async with session.get(f"https://api.github.com/users/{username}") as resp:
                return await resp.json()
    
    @Cached(ttl=7)  # 1 week
    async def get_repos(self, username):
        async with aiohttp.ClientSession() as session:
            async with session.get(f"https://api.github.com/users/{username}/repos") as resp:
                return await resp.json()

Database Query Caching

class UserRepository(JSONCache):
    def __init__(self):
        super().__init__(data_id="user_repo", ttl=0.1)  # 2.4 hours
    
    @Cached()
    def get_user_by_id(self, user_id):
        return db.query("SELECT * FROM users WHERE id = ?", user_id)
    
    @Cached(ttl=1)
    def get_user_stats(self, user_id):
        return db.query("SELECT COUNT(*) FROM posts WHERE user_id = ?", user_id)

Machine Learning Model Predictions

class ModelPredictor(JSONCache):
    def __init__(self):
        # Exclude the model itself: it is not JSON-serializable state
        self._excluded_cache_vars = ["model"]
        super().__init__(data_id="ml_predictor")
        self.model = load_model()
    
    @Cached(ttl=30)
    def predict(self, features_hash, features):
        # Cache predictions by feature hash
        return self.model.predict(features)

Best Practices

Recommended Use Cases

  • Expensive API calls and network requests
  • Database queries with relatively static data
  • Heavy computational operations
  • Machine learning model predictions
  • Data transformations and aggregations

When to Use TTL

  • Set short TTL (minutes to hours) for frequently changing data
  • Set long TTL (days to weeks) for stable reference data
  • Consider data freshness requirements for your application
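Because `ttl` is expressed in days, sub-day lifetimes are fractional values. A small helper (hypothetical, not part of Cacherator) keeps those conversions readable:

```python
def minutes(n: float) -> float:
    """Convert minutes to the day-based unit used by ttl."""
    return n / (24 * 60)

def hours(n: float) -> float:
    """Convert hours to the day-based unit used by ttl."""
    return n / 24

# hours(6) == 0.25, the same value as the @Cached(ttl=0.25) example above
```

With these, `@Cached(ttl=hours(6))` reads more clearly than a bare `0.25`.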

What Not to Cache

  • Non-deterministic functions (random number generation, timestamps)
  • Very fast operations (overhead exceeds benefit)
  • Non-JSON-serializable objects without custom handling
  • Real-time data without appropriate TTL configuration
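The first of these points is easy to see with a toy result cache (a plain dict standing in for Cacherator): the first return value is frozen and replayed forever.

```python
import random

_cache = {}

def cached_roll(sides):
    # Minimal result caching: compute once, replay thereafter.
    if sides not in _cache:
        _cache[sides] = random.randint(1, sides)
    return _cache[sides]

first = cached_roll(6)
# Every later call with the same argument returns the same "random" number.
assert all(cached_roll(6) == first for _ in range(10))
```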

Performance

Cacherator introduces minimal overhead:

  • Cache hit: ~0.1ms
  • Cache miss: Function execution time + ~1ms
  • Disk I/O: Non-blocking, asynchronous operations
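These figures are easy to sanity-check for your own workload by timing a cold call against a warm one; the sketch below uses `functools.lru_cache` as an in-memory stand-in for Cacherator's persistent cache:

```python
import time
from functools import lru_cache

@lru_cache(maxsize=None)
def slow_square(x):
    time.sleep(0.2)  # simulate expensive work
    return x * x

t0 = time.perf_counter()
slow_square(12)                     # cache miss: pays the full 0.2 s
miss = time.perf_counter() - t0

t0 = time.perf_counter()
slow_square(12)                     # cache hit: in-memory lookup
hit = time.perf_counter() - t0

print(f"miss: {miss:.3f}s  hit: {hit:.6f}s")
```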

Performance Improvements

  • API calls (100ms - 5s) reduced to ~0.1ms
  • Database queries (10ms - 1s) reduced to ~0.1ms
  • Heavy computations (1s+) reduced to ~0.1ms

Compatibility

  • Python: 3.7 and above
  • Async: Full support for async/await syntax
  • Operating Systems: Windows, macOS, Linux
  • Data Types: All JSON-serializable types plus datetime objects

Troubleshooting

Cache Not Persisting

# Explicitly save cache
obj.json_cache_save()

# Check for serialization errors
obj._excluded_cache_vars = ["problematic_attr"]

Cache Not Being Used

# Verify TTL hasn't expired
obj = MyClass(ttl=30)  # Increase TTL

# Ensure arguments are identical (type matters)
obj.func(1, 2)    # Different from
obj.func(1.0, 2)  # (int vs float)
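This type sensitivity follows from how call arguments are turned into cache keys. Assuming a repr-style key (an illustration, not Cacherator's documented scheme), `1` and `1.0` simply produce different keys:

```python
def cache_key(*args, **kwargs):
    # Hypothetical repr-based key, for illustration only.
    return repr((args, tuple(sorted(kwargs.items()))))

assert cache_key(1, 2) != cache_key(1.0, 2)        # int vs. float differ
assert cache_key(x=1, y=2) == cache_key(y=2, x=1)  # kwarg order does not
```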

Large Cache Files

# Exclude large attributes
self._excluded_cache_vars = ["large_data"]

# Use separate cache instances
processor1 = DataProcessor(data_id="dataset1")
processor2 = DataProcessor(data_id="dataset2")

Contributing

Contributions are welcome. Please see CONTRIBUTING.md for guidelines.

License

MIT License - see LICENSE file for details.

Developed by Arved Klöhn
