
Distributed locks and caching at hand

Project description



Documentation: https://py-cachify.readthedocs.io/latest/

Source Code: https://github.com/EzyGang/py-cachify

FastAPI Integration Guide: Repo


Py-Cachify is a robust library tailored for developers looking to enhance their Python applications with elegant caching and locking mechanisms. Whether you're building synchronous or asynchronous applications, Py-Cachify has you covered! It acts as a thin, backend-agnostic wrapper over your favorite cache client, letting you focus on business logic instead of juggling low-level get/set calls.

Key Features:

  • Flexible Caching: Effortlessly cache your function results, dramatically reducing execution time for expensive computations and I/O-bound tasks. Utilize customizable keys and time-to-live (TTL) parameters.

  • Distributed Locks: Ensure safe concurrent operation of functions with distributed locks. Prevent race conditions and manage shared resources effectively across both sync and async contexts.

  • Backend Agnostic: Easily integrate with different cache backends. Choose between in-memory, Redis, DragonflyDB, or any custom backend that adheres to the provided client interfaces.

  • Decorators for Ease: Use intuitive decorators like @cached() and @lock() to wrap your functions, maintain clean code, and benefit from automatic cache management.

  • Type Safety & Documentation: Fully type-annotated for enhanced IDE support and readability, featuring comprehensive documentation and examples to guide you through various use cases.

  • Production Ready: With 100% test coverage and usage in multiple commercial projects, Py-Cachify is trusted for production environments, ensuring reliability and stability for your applications.


Installation

$ pip install py-cachify

---> 100%
Successfully installed py-cachify

How to use

You can read more in-depth tutorials here.

First, to start working with the library, you will have to initialize it by using the provided init_cachify function:

from py_cachify import init_cachify

init_cachify()

This call:

  • Configures the global client used by the top-level decorators: cached, lock, and once.
  • Returns a Cachify instance, but you don't have to use it if you only work with the global decorators.
  • Uses an in-memory cache by default (both for sync and async usage).

If you want to use Redis:

from py_cachify import init_cachify
from redis.asyncio import from_url as async_from_url
from redis import from_url


# Example: configure global cachify with Redis for both sync and async flows
init_cachify(
    sync_client=from_url(redis_url),
    async_client=async_from_url(redis_url),
)

Normally you won't need both clients, since an application usually runs in a single mode (sync or async); pass only sync_client or only async_client to match your usage.

Once initialized, you can use everything the library provides right away, without worrying about managing the cache yourself.

❗ If you have not called init_cachify with is_global=True at least once, using the global decorators (cached, lock, once) will raise a CachifyInitError at runtime.

You can also create dedicated instances without touching the global client:

from py_cachify import init_cachify

# Global initialization for the top-level decorators
init_cachify()

# Local instance that does NOT touch the global client
local_cache = init_cachify(is_global=False, prefix='LOCAL-')

@local_cache.cached(key='local-{x}')
def compute_local(x: int) -> int:
    return x * 2

Basic examples

Caching

Caching with the @cached decorator, using the flexibility of a dynamic key:

from py_cachify import cached


# Cache the result of the following function under a dynamic key
@cached(key='sum_two-{a}-{b}')
async def sum_two(a: int, b: int) -> int:
    # Print so we can see which arguments triggered an actual call
    print(f'Called with {a} {b}')
    return a + b


# Reset the cache for the call with arguments a=1, b=2
await sum_two.reset(a=1, b=2)
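
Note how the dynamic key ties calls and resets together: the placeholders in the template are filled from the call's arguments, so the call and the matching reset resolve to the same cache entry. A plain str.format illustration of that resolution (the template string is taken from the example above):

```python
# The key template from the example above; placeholders are filled
# from the function's call arguments.
template = 'sum_two-{a}-{b}'

# sum_two(a=1, b=2) and sum_two.reset(a=1, b=2) both resolve to the
# same concrete key, so the reset targets the right entry.
key = template.format(a=1, b=2)
assert key == 'sum_two-1-2'
```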

Multi-layer Usage

It is possible to layer caches by stacking cached decorators (for example, a global cache inside a local instance cache).

from py_cachify import cached, init_cachify

# Global initialization for the top-level decorators
init_cachify()

# Local instance with a shorter TTL that wraps the global one
local = init_cachify(is_global=False, prefix='LOCAL-')

@local.cached(key='local-expensive-{x}', ttl=5)
@cached(key='expensive-{x}', ttl=60)
def expensive(x: int) -> int:
    return x * 10
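
Conceptually, the outer decorator is consulted first, so a hit in the short-lived local layer never reaches the global one. A plain-dict sketch of that lookup order (hypothetical names, not the library's internals):

```python
# Two cache layers: a short-lived local one in front of a shared global one.
local_layer: dict = {}
global_layer: dict = {}

def layered_get(key: str, compute):
    # The outer (local) layer is checked first...
    if key in local_layer:
        return local_layer[key]
    # ...then the inner (global) layer...
    if key in global_layer:
        value = global_layer[key]
        local_layer[key] = value  # repopulate the faster layer
        return value
    # ...and only a miss in both runs the real computation.
    value = compute()
    global_layer[key] = value
    local_layer[key] = value
    return value
```

In the stacked-decorator example above, this is why a short local TTL in front of a longer global TTL can absorb repeated reads without re-running the function.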

Read more about @cached here.

Locking

Locking through context manager:

from py_cachify import lock


async_lock = lock('resource_key')
# Use it within an asynchronous context
async with async_lock:
    # Your critical section here
    print('Critical section code')

# Check if it's locked
await async_lock.is_alocked()

# Forcefully release
await async_lock.arelease()

# Use it within a synchronous context
with lock('resource_key'):
    # Your critical section here
    print('Critical section code')
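
Under the hood, a distributed lock boils down to an atomic "set this key only if it is absent" operation against the shared backend (Redis exposes this as SET ... NX). A hypothetical single-process sketch of that idea, not py-cachify's actual code:

```python
import time

# Stand-in for the shared backend; with Redis the check-and-set below
# would be a single atomic SET key value NX call.
_backend: dict = {}

def acquire(key: str) -> bool:
    if key in _backend:
        return False  # another worker already holds the lock
    _backend[key] = time.monotonic()  # record when the lock was taken
    return True

def release(key: str) -> None:
    # Deleting the key lets the next acquire() succeed.
    _backend.pop(key, None)
```

The lock() context manager and decorator wrap this acquire/release cycle for you, including the nowait and timeout behavior shown next.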

Locking via decorator:

from py_cachify import lock

@lock(key='critical_function_lock-{arg}', nowait=False, timeout=10)
async def critical_function(arg: int) -> None:
    # critical code
    ...


# Check if it's locked for arg=5
await critical_function.is_locked(arg=5)

# Forcefully release for arg=5
await critical_function.release(arg=5)

Read more about lock here.

For a more detailed tutorial visit Tutorial or full API reference.

Contributing

If you'd like to contribute, please first discuss the changes using Issues, and then don't hesitate to shoot a PR which will be reviewed shortly.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

py_cachify-3.0.1.tar.gz (15.3 kB view details)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

py_cachify-3.0.1-py3-none-any.whl (21.2 kB view details)

Uploaded Python 3

File details

Details for the file py_cachify-3.0.1.tar.gz.

File metadata

  • Download URL: py_cachify-3.0.1.tar.gz
  • Upload date:
  • Size: 15.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for py_cachify-3.0.1.tar.gz
  • SHA256: b3c6f8695a322589824b1a8482e753c3a18382a205a6d853d221676ab520628c
  • MD5: c71274a44c726aca58ffdf411352e6e7
  • BLAKE2b-256: defba40851f932cddee23e475b1c968b6337ed5bbf2d97a2f07df5866a7bdee3

See more details on using hashes here.

Provenance

The following attestation bundles were made for py_cachify-3.0.1.tar.gz:

Publisher: build-and-publish.yml on EzyGang/py-cachify

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file py_cachify-3.0.1-py3-none-any.whl.

File metadata

  • Download URL: py_cachify-3.0.1-py3-none-any.whl
  • Upload date:
  • Size: 21.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for py_cachify-3.0.1-py3-none-any.whl
  • SHA256: 6eba5797d0e57b1c54bfcbdf45b4054afabd9b3c8d8838bf8e42a78d4893f70e
  • MD5: 108dc18fe6ab14bcfd4348b1491fafd2
  • BLAKE2b-256: 48b192e8cdb4a9860c6a8e0d5a7d96ab268566e4072fb872959b1f6c578bf857

See more details on using hashes here.

Provenance

The following attestation bundles were made for py_cachify-3.0.1-py3-none-any.whl:

Publisher: build-and-publish.yml on EzyGang/py-cachify

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
