
High-performance caching library with async support

Project description

🚀 PyFCach - Blazing Fast Python Caching Library

Apache 2.0 License Python 3.8+ Performance

PyFCach is a high-performance, feature-rich caching library for Python that delivers enterprise-grade caching solutions with incredible speed and flexibility.

✨ Features

🎯 Multiple Eviction Strategies

  • LRU (Least Recently Used)
  • MRU (Most Recently Used)
  • LFU (Least Frequently Used)
  • TTL (Time-to-Live)
  • ARC (Adaptive Replacement Cache)
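The strategies above differ only in which entry gets evicted when the cache is full. As a rough illustration of the LRU policy alone, here is a stdlib-only sketch (the `TinyLRU` class is hypothetical and is not pyfcach's implementation):

```python
from collections import OrderedDict

class TinyLRU:
    """Minimal LRU cache: evicts the least recently used key when full."""

    def __init__(self, maxsize: int):
        self.maxsize = maxsize
        self._data: OrderedDict = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def set(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.maxsize:
            self._data.popitem(last=False)  # drop least recently used

lru = TinyLRU(maxsize=2)
lru.set("a", 1)
lru.set("b", 2)
lru.get("a")         # "a" is now most recently used
lru.set("c", 3)      # evicts "b", the least recently used key
print(lru.get("b"))  # None
print(lru.get("a"))  # 1
```

MRU inverts the eviction order, LFU evicts by access count, and ARC adapts between recency and frequency at runtime.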

⚡ Performance Optimizations

  • Sharded locking for maximum concurrency
  • Memory-optimized storage with compression
  • Async/await support
  • Zero-copy serialization with msgspec
  • XXHash for ultra-fast key generation
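Sharded locking means splitting the key space across several independently locked shards, so threads touching different shards never contend for the same lock. A minimal stdlib sketch of the idea (the `ShardedDict` class is illustrative, not pyfcach's actual data structure):

```python
import threading

class ShardedDict:
    """Illustrative sharded store: each shard has its own lock, so writers
    to different shards never block each other."""

    def __init__(self, num_shards: int = 8):
        self._shards = [{} for _ in range(num_shards)]
        self._locks = [threading.Lock() for _ in range(num_shards)]

    def _index(self, key) -> int:
        # Route each key to a fixed shard
        return hash(key) % len(self._shards)

    def set(self, key, value):
        i = self._index(key)
        with self._locks[i]:
            self._shards[i][key] = value

    def get(self, key, default=None):
        i = self._index(key)
        with self._locks[i]:
            return self._shards[i].get(key, default)

store = ShardedDict()
store.set("k", 42)
print(store.get("k"))  # 42
```

With a single global lock, every operation serializes; with N shards, up to N threads can proceed in parallel as long as their keys hash to different shards.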

🔧 Advanced Features

  • Memory usage tracking and automatic eviction
  • Built-in performance profiling
  • Compression for large values
  • Thread-safe and async-ready
  • Decorator-based caching
  • Cached properties with TTL

🚀 Quick Start

Installation

pip install pyfcach

Basic Usage

from pyfcach import cache, CacheStrategy, get_global_stats

@cache(maxsize=1000, strategy=CacheStrategy.LRU)
def expensive_function(x: int, y: int) -> int:
    return x * y + x // y

# The result is cached automatically!
result1 = expensive_function(10, 5)
result2 = expensive_function(10, 5)  # Returns cached result

# Get cache statistics
print(expensive_function.cache_info())

Async Support

from pyfcach import async_cache, CacheStrategy
import asyncio

@async_cache(maxsize=500, strategy=CacheStrategy.TTL, ttl=60)
async def fetch_data(url: str) -> dict:
    # Simulate API call
    await asyncio.sleep(1)
    return {"data": "result"}

async def main():
    data = await fetch_data("https://api.example.com/data")
    # Subsequent calls within 60 seconds return cached data
    return data

asyncio.run(main())

Advanced Configuration

from pyfcach import HighPerformanceCache, CacheStrategy

# Create a custom cache instance
cache = HighPerformanceCache(
    maxsize=10000,
    strategy=CacheStrategy.ARC,
    memory_limit=1024 * 1024 * 100,  # 100MB
    compress=True,
    enable_profiling=True
)

cache.set("key", {"complex": "object"}, ttl=3600)
value = cache.get("key")

Cached Properties

from pyfcach import cached_property, AsyncCachedProperty

class DataProcessor:
    @cached_property(ttl=300)  # Cache for 5 minutes
    def processed_data(self):
        # Expensive computation
        return self._heavy_computation()
    
    @AsyncCachedProperty(ttl=60)
    async def async_data(self):
        # Async operation
        return await self._fetch_remote_data()
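For readers curious how a TTL-bounded cached property can work under the hood, here is a stdlib-only descriptor sketch (the `ttl_cached_property` name and logic are illustrative assumptions, not pyfcach internals):

```python
import time

class ttl_cached_property:
    """Illustrative descriptor: caches a property's value per instance and
    recomputes it after `ttl` seconds have elapsed."""

    def __init__(self, ttl: float):
        self.ttl = ttl

    def __call__(self, func):
        self.func = func
        return self

    def __set_name__(self, owner, name):
        self.attr = f"_ttl_cache_{name}"

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        cached = getattr(obj, self.attr, None)  # (value, timestamp) or None
        now = time.monotonic()
        if cached is None or now - cached[1] > self.ttl:
            cached = (self.func(obj), now)
            setattr(obj, self.attr, cached)
        return cached[0]

class Report:
    calls = 0

    @ttl_cached_property(ttl=60)
    def total(self):
        Report.calls += 1  # count real computations
        return 41 + 1

r = Report()
print(r.total)       # computed: 42
print(r.total)       # served from cache: 42
print(Report.calls)  # 1 — the body ran only once
```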

📊 Performance

PyFCach is built for speed:

  • Microsecond-level operations with optimized data structures
  • Sharded locking eliminates contention
  • Memory-efficient storage with automatic compression
  • Zero-dependency core with optional optimizations

🛠️ Configuration

Cache Strategies

| Strategy | Best For               | Features                     |
|----------|------------------------|------------------------------|
| LRU      | General purpose        | Predictable, good hit rates  |
| LFU      | Frequency-based access | Optimizes for popular items  |
| TTL      | Time-sensitive data    | Automatic expiration         |
| ARC      | Adaptive workloads     | Self-tuning, best of LRU/LFU |
| MRU      | Special cases          | Certain access patterns      |

Memory Management

from pyfcach import HighPerformanceCache, get_global_stats

# Automatic memory management
cache = HighPerformanceCache(
    memory_limit=1024 * 1024 * 50,  # 50MB limit
    compress_threshold=1024,  # Compress values >1KB
)

# Global memory tracking
stats = get_global_stats()
print(f"Total memory used: {stats['total_memory_mb']:.2f} MB")
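The `compress_threshold` idea — serialize every value, but only compress payloads above a size cutoff — can be sketched with the stdlib alone (the `pack`/`unpack` helpers are hypothetical, not pyfcach's API):

```python
import pickle
import zlib

COMPRESS_THRESHOLD = 1024  # compress serialized values larger than 1 KB

def pack(value) -> bytes:
    """Serialize a value, compressing only payloads above the threshold.
    A one-byte tag records whether the payload is compressed."""
    raw = pickle.dumps(value)
    if len(raw) > COMPRESS_THRESHOLD:
        return b"z" + zlib.compress(raw)
    return b"r" + raw

def unpack(blob: bytes):
    raw = zlib.decompress(blob[1:]) if blob[:1] == b"z" else blob[1:]
    return pickle.loads(raw)

big = "x" * 10_000                   # well over the threshold
stored = pack(big)
assert len(stored) < 10_000          # compressed form is much smaller
assert unpack(stored) == big
assert unpack(pack({"small": 1})) == {"small": 1}  # small values skip zlib
```

The threshold matters because compressing tiny values costs CPU without saving meaningful memory; only large values repay the effort.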

🔍 Monitoring & Profiling

from pyfcach import cache, get_global_stats

# Enable profiling
@cache(enable_profiling=True)
def profiled_function():
    pass

# Get performance insights
print(profiled_function.cache_profile())

# Global statistics
global_stats = get_global_stats()
print(f"Global hit rate: {global_stats['global_hit_rate']:.2%}")

📈 Benchmarks

PyFCach outperforms traditional caching solutions:

  • 3-5x faster than functools.lru_cache
  • 2-3x faster than popular caching libraries
  • Sub-millisecond operation times
  • Linear scaling with core count
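Benchmark results depend heavily on hardware and workload, so measure on your own machine. A minimal `timeit` harness you could adapt (shown here against `functools.lru_cache` only; the figures above are the project's own claims, not reproduced here):

```python
import timeit
from functools import lru_cache

def work(n: int) -> int:
    # A deliberately non-trivial computation to cache
    return sum(i * i for i in range(n))

cached_work = lru_cache(maxsize=None)(work)
cached_work(10_000)  # warm the cache so timed calls are pure hits

t_uncached = timeit.timeit(lambda: work(10_000), number=200)
t_cached = timeit.timeit(lambda: cached_work(10_000), number=200)

print(f"uncached: {t_uncached:.4f}s  cached: {t_cached:.4f}s")
```

Swap `cached_work` for a pyfcach-decorated function to compare the two caches on identical inputs.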

🔧 Advanced Usage

Custom Cache Instances

from pyfcach import TTLCache, OptimizedLFUCache, ARCCache

# TTL Cache with background cleanup
ttl_cache = TTLCache(maxsize=1000, default_ttl=3600, cleanup_interval=30)

# LFU Cache for frequency-based access
lfu_cache = OptimizedLFUCache(maxsize=5000)

# ARC Cache for adaptive workloads
arc_cache = ARCCache(maxsize=10000)

Manual Cache Management

from pyfcach import HighPerformanceCache

cache = HighPerformanceCache(maxsize=100)

# Basic operations
cache.set("key", "value", ttl=60)
value = cache.get("key")
deleted = cache.delete("key")
cache.clear()

# Bulk operations
for i in range(100):
    cache.set(f"key_{i}", f"value_{i}")

# Information and stats
info = cache.info()
print(f"Hit rate: {info.hits / (info.hits + info.misses):.2%}")

🤝 Contributing

We love contributions! Please see our Contributing Guide for details.

📄 License

This project is licensed under the Apache 2.0 License - see the LICENSE file for details.

🏆 Credits

PyFCach is created and maintained by Sarix.

Project details


Download files


Source Distribution

pyfcach-0.3.15.tar.gz (11.1 kB)

Uploaded Source

Built Distribution


pyfcach-0.3.15-py3-none-any.whl (10.0 kB)

Uploaded Python 3

File details

Details for the file pyfcach-0.3.15.tar.gz.

File metadata

  • Download URL: pyfcach-0.3.15.tar.gz
  • Upload date:
  • Size: 11.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.12

File hashes

Hashes for pyfcach-0.3.15.tar.gz
Algorithm Hash digest
SHA256 19edcc63132ebca2491bb1b6bd08c8332f9e1ab0afb1b48c864efbd9fd3a60e0
MD5 9f0ada799fd5f7d52971dd92c78a01cb
BLAKE2b-256 d69bc0af4a2abfe4bcc9616224fe0b478c7170de9b2580f4165ef87fe1e9f6bf


File details

Details for the file pyfcach-0.3.15-py3-none-any.whl.

File metadata

  • Download URL: pyfcach-0.3.15-py3-none-any.whl
  • Upload date:
  • Size: 10.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.12

File hashes

Hashes for pyfcach-0.3.15-py3-none-any.whl
Algorithm Hash digest
SHA256 b25368ce092d0b5c76ef4e2517cf6080f1ebc1e0c45341cbe3dcb2d7d03a1b9d
MD5 2bf15c73443e8f9036181e6849824499
BLAKE2b-256 75eeb59fabb75ff704e3b43dc2516d46de4a1b08c89b79a273c9892496dde4d5

