AioCache DynamoDB Backend using Aiobotocore

aiocache-dynamodb

aiocache-dynamodb is an asynchronous cache backend for DynamoDB, built on top of aiobotocore and aiocache. It provides a fully asynchronous interface for caching data in DynamoDB (with optional S3 support for large values), enabling efficient and scalable caching in Python applications.

See the documentation for full details, and the aiobotocore documentation for more information on the underlying client.

Features

  • Fully asynchronous operations using aiobotocore.
  • TTL support for expiring cache items. DynamoDB's native TTL may take up to 48 hours to delete expired items, so expiry is also verified by the library on read.
  • Batch operations for efficient multi-key handling (see the sketch after this list).
  • Customizable key, value, and TTL column names.
  • S3 extension for storing large values that exceed DynamoDB's 400 KB item size limit.
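
The batch operations follow aiocache's standard interface. A minimal sketch, assuming the backend exposes aiocache's multi_set/multi_get as the feature list suggests (the table name and keys are placeholders):

import asyncio
from aiocache_dynamodb import DynamoDBCache

async def main():
    cache = DynamoDBCache(table_name="my-cache-table", region_name="us-east-1")

    # Write several key/value pairs in one batched call, all with a 60 second TTL
    await cache.multi_set([("user:1", "alice"), ("user:2", "bob")], ttl=60)

    # Read them back in a single batched call
    values = await cache.multi_get(["user:1", "user:2"])
    print(values)  # Expected output: ['alice', 'bob']

    await cache.close()

asyncio.run(main())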

Installation

Install the package using pip:

pip install aiocache-dynamodb

Usage

import asyncio
from aiocache_dynamodb import DynamoDBCache

async def main():
    cache = DynamoDBCache(
        table_name="my-cache-table",
        endpoint_url="http://localhost:4566",  # For local development
        aws_access_key_id="your-access-key",
        aws_secret_access_key="your-secret-key",
        region_name="us-east-1",
    )

    # Set a value with a TTL of 60 seconds
    await cache.set("my_key", "my_value", ttl=60)

    # Get the value
    value = await cache.get("my_key")
    print(value)  # Output: my_value

    # Delete the value
    await cache.delete("my_key")

    # Check if the key exists
    exists = await cache.exists("my_key")
    print(exists)  # Output: False

    # Close the cache
    await cache.close()

asyncio.run(main())
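
The default StringSerializer stores plain strings; other aiocache serializers can be passed via the serializer parameter. A minimal sketch, assuming aiocache's JsonSerializer works unchanged with this backend:

import asyncio
from aiocache.serializers import JsonSerializer
from aiocache_dynamodb import DynamoDBCache

async def main():
    cache = DynamoDBCache(
        table_name="my-cache-table",
        region_name="us-east-1",
        serializer=JsonSerializer(),  # serialize values to/from JSON strings
    )

    # Dicts and lists round-trip through JSON
    await cache.set("config", {"theme": "dark", "retries": 3}, ttl=120)
    value = await cache.get("config")
    print(value)  # Expected output: {'theme': 'dark', 'retries': 3}

    await cache.close()

asyncio.run(main())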

To use the S3 extension feature:

import asyncio
from aiocache_dynamodb import DynamoDBCache

async def main():
    cache = DynamoDBCache(
        table_name="my-cache-table",
        bucket_name="this-is-my-bucket",
        region_name="us-east-1",
    )

    large_value = "x" * 1024 * 400  # 400KB
    # Set a value with a TTL of 60 seconds.
    # Note: the TTL does not delete the corresponding object in S3;
    # use a lifecycle policy on the bucket instead (see the sketch below).
    await cache.set("my_key", large_value, ttl=60)

    # Get the value
    value = await cache.get("my_key")
    print(value)  # Output: the original 400 KB string

    # Delete the value (both on dynamodb + S3)
    await cache.delete("my_key")

    # Close the cache
    await cache.close()

asyncio.run(main())
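
Since the TTL never touches S3, expired large values would otherwise accumulate in the bucket. A minimal sketch of a lifecycle rule that expires objects one day after creation, using aiobotocore directly (the bucket name matches the example above; the one-day expiry is an arbitrary placeholder):

import asyncio
from aiobotocore.session import get_session

async def configure_lifecycle():
    session = get_session()
    async with session.create_client("s3", region_name="us-east-1") as s3:
        # Expire every object in the bucket one day after it is created
        await s3.put_bucket_lifecycle_configuration(
            Bucket="this-is-my-bucket",
            LifecycleConfiguration={
                "Rules": [
                    {
                        "ID": "expire-cache-objects",
                        "Filter": {"Prefix": ""},  # apply to the whole bucket
                        "Status": "Enabled",
                        "Expiration": {"Days": 1},
                    }
                ]
            },
        )

asyncio.run(configure_lifecycle())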

Configuration

The DynamoDBCache class supports the following parameters:

  • serializer: Serializer to use for serializing and deserializing values (default: aiocache.serializers.StringSerializer).
  • plugins: List of plugins to use (default: []).
  • namespace: Namespace to use for the cache (default: "").
  • timeout: Timeout for cache operations, in seconds (default: 5).
  • table_name: Name of the DynamoDB table to use for caching.
  • bucket_name: Name of the S3 bucket to use for large object storage (default: None).
  • endpoint_url: Endpoint URL for DynamoDB (useful for LocalStack) (default: None).
  • region_name: AWS region (default: "us-east-1").
  • aws_access_key_id: AWS access key ID (default: None).
  • aws_secret_access_key: AWS secret access key (default: None).
  • key_column: Column name for the cache key (default: "cache_key").
  • value_column: Column name for the cache value (default: "cache_value").
  • ttl_column: Column name for the TTL (default: "ttl").
  • s3_key_column: Column name for the S3 key, only used if bucket_name is provided (default: "s3_key").
  • s3_client: Aiobotocore S3 client to use for large object storage; if not provided, one is created lazily on first use or on __aenter__ (default: None).
  • dynamodb_client: Aiobotocore DynamoDB client to use; if not provided, one is created lazily on first use or on __aenter__ (default: None).
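
The backend expects an existing table whose partition key matches key_column. A minimal sketch that creates a table compatible with the defaults above and enables DynamoDB TTL on the ttl attribute, using aiobotocore (the exact schema requirements are an assumption based on the default column names listed):

import asyncio
from aiobotocore.session import get_session

async def create_cache_table():
    session = get_session()
    async with session.create_client("dynamodb", region_name="us-east-1") as dynamodb:
        # String partition key named after the default key_column ("cache_key")
        await dynamodb.create_table(
            TableName="my-cache-table",
            AttributeDefinitions=[{"AttributeName": "cache_key", "AttributeType": "S"}],
            KeySchema=[{"AttributeName": "cache_key", "KeyType": "HASH"}],
            BillingMode="PAY_PER_REQUEST",
        )
        await dynamodb.get_waiter("table_exists").wait(TableName="my-cache-table")

        # Let DynamoDB expire items based on the default ttl_column ("ttl")
        await dynamodb.update_time_to_live(
            TableName="my-cache-table",
            TimeToLiveSpecification={"AttributeName": "ttl", "Enabled": True},
        )

asyncio.run(create_cache_table())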

Local Development

We use make to handle the commands for the project. You can see the available commands by running this in the root directory:

make

Setup

To set up the project, run the following command:

make dev

This will install the required dependencies for the project using uv + pip.

Linting

We use pre-commit for local linting; it is included in the dev dependencies. We use ruff for linting and formatting, and pyright for static type checking. To install the pre-commit hooks, run the following command:

pre-commit install

If you for some reason hate pre-commit, you can run the following command to lint the code:

make check

Testing

To run tests, you can use the following command:

make test

In the background, this will set up LocalStack to replicate the AWS services and run the tests. It will also generate the coverage badge.
