
A Redis cache decorator for crewAI tasks

This project has been archived by its maintainers. No new releases are expected.

Project description

crewai-cache-hook


A Redis-based cache decorator for crewAI tasks.

Project Status

  • Version: 0.1.1
  • Status: Beta
  • Maintained: Yes

Features

  • Add cache capability to tasks via a decorator
  • Checks Redis cache before task execution, returns cached result if hit
  • Automatically writes task result to Redis cache after execution
  • Supports custom cache key, expiration time, and Redis connection parameters
  • Robust error handling and fallback mechanisms
  • Multiple serialization options (pickle, JSON)

Prerequisites

Redis Dependency

This library requires Redis as a backend for caching. Make sure you have Redis installed and running.

Installing Redis

  • macOS: brew install redis
  • Ubuntu/Debian: sudo apt-get install redis-server
  • Windows: download from the official Redis website

Starting Redis Server

  • macOS: brew services start redis
  • Ubuntu/Debian: sudo systemctl start redis
  • Manual: redis-server

Installation

# Install via pip
pip install crewai-cache-hook

# Ensure Redis is installed and running

Configuration

Basic Usage

from cache_hook import cache_hook
from crewai.project import task  # crewAI's @task decorator

@task
@cache_hook(expire=600)  # Cache for 10 minutes
def my_task(self):
    ...

Advanced Configuration

from cache_hook import CacheHook
import logging
from crewai.project import task  # crewAI's @task decorator

# Configure logging
logging.basicConfig(level=logging.INFO)

# Custom Redis configuration
custom_cache = CacheHook(
    host='redis.example.com',     # Redis host
    port=6379,                    # Redis port
    db=1,                         # Redis database
    password='secret',            # Optional password
    connect_timeout=5,            # Connection timeout
    max_retries=2,                # Connection retry attempts
    serializer='json'             # Serialization method
)
custom_cache_hook = custom_cache.cache_hook

@task
@custom_cache_hook(expire=600)
def my_task(self):
    ...

Advanced Features

Custom Cache Key Generator

from cache_hook import cache_hook
from crewai.project import task  # crewAI's @task decorator

def custom_key_generator(func, args, kwargs):
    """
    Create a custom cache key based on specific logic
    
    Args:
        func: The function being cached
        args: Positional arguments
        kwargs: Keyword arguments
    
    Returns:
        str: Custom cache key
    """
    # Example: Include specific arguments in cache key
    key_parts = [
        func.__name__,
        str(kwargs.get('destination', '')),
        str(kwargs.get('start_date', ''))
    ]
    return ':'.join(key_parts)

@task
@cache_hook(expire=600, cache_key_func=custom_key_generator)
def travel_analysis_task(self, destination, start_date):
    ...

Force Refresh

@task
@cache_hook(expire=600, force_refresh=True)
def always_fresh_task(self):
    # This task will always execute and update cache
    ...

Serialization Options

# Pickle serialization (default)
cache_hook(serializer='pickle')

# JSON serialization (for more compatibility)
cache_hook(serializer='json')
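The practical difference between the two options: pickle round-trips almost any Python object but is Python-only (and unsafe to load from untrusted sources), while JSON is limited to basic types but readable from any language or tool. A quick stdlib comparison:

```python
import json
import pickle

result = {"destination": "Kyoto", "score": 0.92}

# pickle: binary, Python-only, handles most Python objects
assert pickle.loads(pickle.dumps(result)) == result

# JSON: text, cross-language, basic types only
assert json.loads(json.dumps(result)) == result

# Objects like sets survive pickle but not JSON:
pickle.dumps({1, 2, 3})      # fine
# json.dumps({1, 2, 3})      # would raise TypeError
```

If other services need to read the cached values out of Redis, prefer `serializer='json'`; otherwise the pickle default is the more permissive choice.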

Parameters

  • expire: Cache expiration time (seconds)
  • cache_key_func: Custom function to generate cache key (optional)
  • force_refresh: Force task execution and cache update
  • host/port/db/password: Redis connection parameters
  • serializer: Serialization method ('pickle' or 'json')
  • connect_timeout: Connection timeout in seconds
  • max_retries: Number of connection retry attempts
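When `cache_key_func` is omitted, the key is derived from the decorated function and its arguments. The exact scheme is internal to the library; a plausible, purely hypothetical derivation looks like this:

```python
import hashlib

def default_cache_key(func_name, args, kwargs):
    """Hypothetical default: hash the repr of the call signature.

    Sorting kwargs makes the key independent of keyword order.
    """
    raw = f"{func_name}:{args!r}:{sorted(kwargs.items())!r}"
    digest = hashlib.sha256(raw.encode()).hexdigest()[:16]
    return f"cache_hook:{func_name}:{digest}"

key = default_cache_key("travel_analysis_task", (), {"destination": "Kyoto"})
# Identical calls map to the same key; different arguments to different keys.
```

The important property, whatever the real scheme, is that the key is deterministic for identical calls; if your arguments are not stable under `repr()` (e.g. objects without a meaningful `__repr__`), supply a `cache_key_func` as shown above.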

Notes

  • Uses configurable serialization (pickle or JSON)
  • Ensure task results are serializable
  • Suitable for crewAI tasks
  • Caches the entire Task result
  • Requires an active Redis server

Troubleshooting

  • Connection Errors: Check Redis server status, network, and configuration
  • Serialization Errors: Ensure task results are serializable
  • Performance Issues: Monitor cache size and hit/miss rates
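For the serialization errors above, a small pre-check (a stdlib-only sketch, not part of this library) can tell you whether a task result will survive the configured serializer before you enable caching:

```python
import json
import pickle

def is_serializable(obj, serializer="pickle"):
    """Return True if obj round-trips through the given serializer."""
    try:
        if serializer == "json":
            json.dumps(obj)
        else:
            pickle.dumps(obj)
        return True
    except (TypeError, ValueError, AttributeError, pickle.PicklingError):
        return False

is_serializable({"ok": [1, 2]}, "json")   # True
is_serializable({1, 2, 3}, "json")        # False: sets are not JSON types
is_serializable({1, 2, 3}, "pickle")      # True
```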



Download files

Download the file for your platform.

Source Distribution

crewai_cache_hook-0.1.1.tar.gz (7.4 kB)

Built Distribution


crewai_cache_hook-0.1.1-py3-none-any.whl (6.3 kB)

File details

Details for the file crewai_cache_hook-0.1.1.tar.gz.

File metadata

  • Download URL: crewai_cache_hook-0.1.1.tar.gz
  • Size: 7.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for crewai_cache_hook-0.1.1.tar.gz:

  • SHA256: d740eb08e50a8427c0da98bceab4fa71a57640d76178805ed87402d27491b750
  • MD5: 6fd6582fcf0a58854644f096f883d003
  • BLAKE2b-256: c4be4eff05e7beeef6de56bb9fa062af33bf7011ff7316bdb169e847a9ee088b


File details

Details for the file crewai_cache_hook-0.1.1-py3-none-any.whl.

File hashes

Hashes for crewai_cache_hook-0.1.1-py3-none-any.whl:

  • SHA256: 3665789d4aa4010b0e8b857337043c876099bb6eab7d2e14ac8f6901286fd423
  • MD5: c7e30dd50a3bbf43ee3ebd0df4079685
  • BLAKE2b-256: ee1f407e7b3a9bee5b984dd3f208ecb23ae4cd3f324145d37587a78225c0fbb3

