
Project description

pythonic-cache is a Pythonic caching library that has everything you need out of the box and nothing extra.

Documentation in Russian

Installation

pip install pythonic-cache
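
The Redis-backed examples below also import the redis package; if your environment does not already have it (assuming pythonic-cache does not install it for you), add it as well:

pip install redis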

Basic Usage

from datetime import timedelta

from pythonic_cache import CacheClient
from pythonic_cache.storage.memory import MemoryCacheStorage

client = CacheClient(MemoryCacheStorage())


@client.cache(timedelta(minutes=5))
def calculate_sum(a: int, b: int) -> int:
    print("Calculating sum...")
    return a + b


def main() -> None:
    # First function call, the result will be computed and cached
    result1 = calculate_sum(5, 3)
    print("Result 1:", result1)

    # Subsequent function call with the same arguments, the result will be fetched from the cache
    result2 = calculate_sum(5, 3)
    print("Result 2:", result2)

    # Function call with different arguments, the result will be computed again and cached
    result3 = calculate_sum(10, 20)
    print("Result 3:", result3)


if __name__ == "__main__":
    main()

Besides CacheClient, there is AsyncCacheClient, which works the same way but with coroutines. Its initializer accepts objects of the base class AsyncCacheStorage; out of the box there are two subclasses: AsyncMemoryCacheStorage and AsyncRedisCacheStorage.
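
A minimal sketch of the in-memory asynchronous variant, assuming AsyncCacheClient exposes the same cache(expires) decorator used in the Redis-backed example below (slow_square and demo are illustrative names, not part of the library):

import asyncio
from datetime import timedelta

from pythonic_cache import AsyncCacheClient
from pythonic_cache.storage.memory import AsyncMemoryCacheStorage

# Asynchronous cache client backed by in-process memory storage
memory_client = AsyncCacheClient(AsyncMemoryCacheStorage())


@memory_client.cache(timedelta(minutes=5))
async def slow_square(x: int) -> int:
    await asyncio.sleep(1)  # simulate expensive work
    return x * x


async def demo() -> None:
    print(await slow_square(4))  # computed and cached
    print(await slow_square(4))  # served from the cache


if __name__ == "__main__":
    asyncio.run(demo())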

Example Usage

import asyncio
from datetime import timedelta

from redis.asyncio import Redis
from pythonic_cache import AsyncCacheClient
from pythonic_cache.storage.redis import AsyncRedisCacheStorage

# Connecting to Redis
redis = Redis(host='localhost', port=6379, db=0)

# Creating a cache client with Redis as the asynchronous storage
async_redis_storage = AsyncRedisCacheStorage(redis)

# Creating an asynchronous cache client
async_client = AsyncCacheClient(async_redis_storage)


# Asynchronous function, whose results will be cached
@async_client.cache(timedelta(minutes=5))
async def async_factorial(x: int) -> int:
    if x <= 1:
        return 1

    return await async_factorial(x - 1) * x


async def main() -> None:
    # First call to the asynchronous function, the result will be computed and cached
    result1 = await async_factorial(5)
    print("Result 1:", result1)

    # Subsequent call to the asynchronous function with the same arguments, the result will be fetched from the cache
    result2 = await async_factorial(5)
    print("Result 2:", result2)

    # Call to the asynchronous function with different arguments, the result will be computed again and cached
    result3 = await async_factorial(10)
    print("Result 3:", result3)


if __name__ == "__main__":
    # Running the asynchronous main function
    asyncio.run(main())

The main feature of the library is its integration layer, which provides helpers for plugging pythonic-cache into external frameworks.

Example of integration with FastAPI

from datetime import timedelta
from fastapi import FastAPI

from pythonic_cache.integrations.fastapi import cache, setup_cache
from pythonic_cache.storage.memory import AsyncMemoryCacheStorage

app = FastAPI()

# Setting up cache for FastAPI application
setup_cache(app, AsyncMemoryCacheStorage())

@app.get("/sum/")
@cache(expires=timedelta(hours=1))
async def sum_endpoint(a: int, b: int) -> int:
    # The result will be cached for 1 hour
    return a + b
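
A quick way to check the cached endpoint is to call it twice with the same query parameters; per the cache decorator above, the second response should be served from the cache for the configured hour. The snippet below is an illustrative sketch using FastAPI's TestClient, not part of the library:

from fastapi.testclient import TestClient

# Running the client as a context manager triggers the app's startup/shutdown
# events, which setup_cache may rely on.
with TestClient(app) as test_client:
    # First request: the handler runs and the result is cached
    print(test_client.get("/sum/", params={"a": 2, "b": 3}).json())  # 5

    # Second identical request: expected to be answered from the cache
    print(test_client.get("/sum/", params={"a": 2, "b": 3}).json())  # 5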

License

See the LICENSE file for license rights and limitations (MIT).

Issues

If you encounter any issues with the project or have suggestions for improvement, please feel free to report them by creating an issue on the GitHub repository. We welcome your feedback!

Pull Requests

We welcome community contributions! If you'd like to contribute to this project, please follow these steps:

  1. Fork the repository.
  2. Create a new branch for your feature or bug fix.
  3. Make your changes and ensure they are properly tested.
  4. Open a pull request (PR) against the develop branch of the original repository.
  5. Provide a detailed description of your changes in the PR description.

We appreciate your contributions!

Contact

If you have any questions, suggestions, or feedback regarding this project, feel free to contact the author:

Stay connected!

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pythonic_cache-0.3.0.tar.gz (8.6 kB)

Uploaded Source

Built Distribution

pythonic_cache-0.3.0-py3-none-any.whl (8.8 kB)

Uploaded Python 3

File details

Details for the file pythonic_cache-0.3.0.tar.gz.

File metadata

  • Download URL: pythonic_cache-0.3.0.tar.gz
  • Upload date:
  • Size: 8.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.2

File hashes

Hashes for pythonic_cache-0.3.0.tar.gz

  • SHA256: f1e83dd9684ce7a904639c5708f30751199eef8a05b4fae42d3062be29c1e2ee
  • MD5: 0a4afe7ab5f2469d53399035061a8be3
  • BLAKE2b-256: 68fd0e913128060aa9444070d821326cc925e6e44023eca8549283a1887cd786

See more details on using hashes here.

File details

Details for the file pythonic_cache-0.3.0-py3-none-any.whl.

File metadata

File hashes

Hashes for pythonic_cache-0.3.0-py3-none-any.whl

  • SHA256: 44a18b693a30fa1d98c92eb2ace100f21bf4d0902def394d5f9f8862fe0c62e1
  • MD5: 89b96957da053bd1320ba6586cd4918c
  • BLAKE2b-256: 8bfe7515cf31a7cfd9056fe6f27b5fea255c8bbb574f03b4e7a73807b20ec6eb

See more details on using hashes here.
