
A flexible caching decorator for Flask that supports Replit, Redis, and in-memory storage.


Cachecade

Cachecade is a flexible caching package for Flask applications that supports multiple backends with a prioritized fallback mechanism. Out of the box, it supports:

  • Replit Key–Value Store
  • Redis
  • In-Memory Caching

Features

  • Prioritized Storage Engines: By default, Cachecade tries ['replit', 'redis', 'memory'] in order and uses the first backend that is available.
  • TTL Support: Cache entries have a time-to-live (TTL) after which they are considered stale.
  • Decorator-Based Caching: Easily cache function results by using the provided decorator.
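The prioritized fallback can be pictured with a small sketch (hypothetical, not Cachecade's actual internals): walk the priority list and use the first engine that is available.

```python
# Hypothetical sketch of prioritized fallback: pick the first
# available engine from a priority list. Not Cachecade's real code.
def pick_backend(priority, available):
    """Return the first engine in `priority` that is in `available`."""
    for engine in priority:
        if engine in available:
            return engine
    raise RuntimeError("no cache backend available")

# With Redis reachable but no Replit DB, Redis wins:
print(pick_backend(['replit', 'redis', 'memory'], {'redis', 'memory'}))  # redis
```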

Installation

Clone this repository and install it with pip:

pip install .

Usage

Basic Usage

Initialize the cache at the start of your application:

from flask import Flask
from cachecade import init_cache, cachecaded

app = Flask(__name__)

# Initialize with default settings (tries Replit DB, then Redis, then memory)
init_cache()

@app.route('/data')
@cachecaded(ttl=60)  # Cache results for 60 seconds
def get_data():
    # Your expensive data retrieval operation here
    return {"result": "some data"}
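To make ttl=60 concrete, here is a minimal sketch of TTL-based staleness checking (an illustration, not Cachecade's implementation): each entry carries an expiry timestamp, and a lookup past that timestamp is treated as a miss.

```python
import time

# Minimal TTL cache sketch (illustrative only, not Cachecade's internals).
class TTLCache:
    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def set(self, key, value, ttl):
        self._store[key] = (value, time.monotonic() + ttl)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() > expires_at:
            del self._store[key]  # stale: evict and report a miss
            return None
        return value

cache = TTLCache()
cache.set("data", {"result": "some data"}, ttl=60)
print(cache.get("data"))  # fresh entry: {'result': 'some data'}
```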

Custom Storage Configuration

Specify the order of storage engines to try:

# Prefer Redis, fall back to memory (skip Replit)
init_cache(storage_engines=['redis', 'memory'])

# Use only in-memory caching
init_cache(storage_engines=['memory'])

Using Cache Prefix

Adding a prefix to your cache keys helps with namespacing and prevents collisions:

# All cache keys will be prefixed with "myapp:"
init_cache(prefix="myapp")

@app.route('/user/<user_id>')
@cachecaded(ttl=300)  # Cache for 5 minutes
def get_user(user_id):
    # The actual cache key will include the "myapp:" prefix
    return {"user_id": user_id, "name": "Example User"}
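One common way such prefixed keys are built (shown here as an assumption, since Cachecade's exact key format is not documented above) is to combine the prefix, the function name, and a hash of the arguments:

```python
import hashlib

# Hypothetical key scheme: "<prefix>:<function>:<args-hash>".
# Cachecade's actual key format may differ; this only illustrates
# why a prefix prevents collisions between apps sharing one store.
def make_key(prefix, func_name, *args, **kwargs):
    payload = repr((args, sorted(kwargs.items()))).encode()
    digest = hashlib.sha256(payload).hexdigest()[:12]
    return f"{prefix}:{func_name}:{digest}"

print(make_key("myapp", "get_user", "42"))
```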

Using with Flask Blueprints

Cachecade works seamlessly with Flask blueprints:

from flask import Flask, Blueprint
from cachecade import init_cache, cachecaded

app = Flask(__name__)
# Initialize cache with a prefix for this specific app
init_cache(prefix="myservice")

# Create a blueprint for API endpoints
api = Blueprint('api', __name__, url_prefix='/api')

@api.route('/users')
@cachecaded(ttl=120)  # Cache for 2 minutes
def get_users():
    # This function's results will be cached
    return {"users": ["Alice", "Bob", "Charlie"]}

@api.route('/products')
@cachecaded(ttl=300)  # Cache for 5 minutes
def get_products():
    # This function's results will also be cached
    return {"products": ["Product A", "Product B"]}

# Register the blueprint with the app
app.register_blueprint(api)

if __name__ == '__main__':
    app.run(debug=True)

Blueprints in Separate Files

For larger applications, it's common to organize blueprints in separate files or modules. Here's how to use Cachecade in this scenario:

Project Structure

myapp/
  ├── __init__.py         # Main application factory
  ├── blueprints/
  │   ├── __init__.py
  │   ├── users.py        # Users blueprint
  │   └── products.py     # Products blueprint
  └── app.py              # Application entry point

Main Application (myapp/__init__.py)

from flask import Flask
from cachecade import init_cache

def create_app():
    app = Flask(__name__)
    
    # Initialize cache with a prefix for the entire app
    init_cache(prefix="myapp")
    
    # Register blueprints
    from myapp.blueprints.users import users_bp
    from myapp.blueprints.products import products_bp
    
    app.register_blueprint(users_bp)
    app.register_blueprint(products_bp)
    
    return app

Users Blueprint (myapp/blueprints/users.py)

from flask import Blueprint, jsonify
from cachecade import cachecaded

users_bp = Blueprint('users', __name__, url_prefix='/users')

@users_bp.route('/')
@cachecaded(ttl=300)  # Cache for 5 minutes
def get_users():
    # Expensive database query simulation
    users = [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]
    return jsonify(users)

@users_bp.route('/<int:user_id>')
@cachecaded(ttl=180)  # Cache for 3 minutes
def get_user(user_id):
    # The prefix is applied from the init_cache() call in the main app
    return jsonify({"id": user_id, "name": f"User {user_id}"})

Products Blueprint (myapp/blueprints/products.py)

from flask import Blueprint, jsonify
from cachecade import cachecaded

products_bp = Blueprint('products', __name__, url_prefix='/products')

@products_bp.route('/')
@cachecaded(ttl=600)  # Cache for 10 minutes
def get_products():
    # Expensive database query simulation
    products = [{"id": 1, "name": "Product A"}, {"id": 2, "name": "Product B"}]
    return jsonify(products)

Application Entry Point (myapp/app.py)

from myapp import create_app

app = create_app()

if __name__ == '__main__':
    app.run(debug=True)

In this structure:

  1. The cache is initialized once in the application factory
  2. The cache prefix is defined globally for the entire application
  3. Each blueprint is in its own file, but they all use the same cache instance
  4. The cachecaded decorator works seamlessly across all blueprints

Advanced Example with Environment Configuration

import os
from flask import Flask
from cachecade import init_cache, cachecaded

app = Flask(__name__)

# Set up cache based on environment
if os.environ.get('ENVIRONMENT') == 'production':
    # In production, try Redis first, then fall back to memory
    init_cache(storage_engines=['redis', 'memory'], prefix="prod")
else:
    # In development, just use memory caching
    init_cache(storage_engines=['memory'], prefix="dev")

@app.route('/data')
@cachecaded(ttl=60)
def get_data():
    return {"status": "success"}

Note on Redis Configuration

When using Redis as a caching backend, make sure to set the REDIS_URL environment variable:

export REDIS_URL="redis://localhost:6379/0"

Or set it from your application code before calling init_cache():

import os
os.environ['REDIS_URL'] = "redis://localhost:6379/0"
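A backend that honors REDIS_URL typically parses it into host, port, and database number; the standard library can do this without a Redis client (a sketch, not Cachecade's code):

```python
from urllib.parse import urlparse

# Parse a REDIS_URL value into connection parameters (sketch;
# Cachecade's own Redis handling may differ).
def parse_redis_url(url):
    parts = urlparse(url)
    return {
        "host": parts.hostname or "localhost",
        "port": parts.port or 6379,
        "db": int(parts.path.lstrip("/") or 0),
    }

print(parse_redis_url("redis://localhost:6379/0"))
# {'host': 'localhost', 'port': 6379, 'db': 0}
```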
