
AioCache DynamoDB Backend using Aiobotocore


aiocache-dynamodb


aiocache-dynamodb is an asynchronous cache backend for DynamoDB, built on top of aiobotocore and aiocache. It provides a fully asynchronous interface for caching data in DynamoDB (with an optional S3 extension for large values), enabling efficient, scalable caching in Python applications.

See the project documentation for full details, and the aiobotocore project for more information on the underlying client.

Features

  • Fully asynchronous operations using aiobotocore.
  • TTL support for expiring cache items (DynamoDB may take up to 48 hours after expiration to actually delete an item, so expired items are double-checked on read).
  • Batch operations for efficient multi-key handling.
  • Customizable key, value, and TTL column names.
  • S3 extension for storing large objects (values over DynamoDB's 400 KB item limit).
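DynamoDB's TTL sweeper may take up to 48 hours after expiration to remove an item, which is why expired items are double-checked. The idea behind that check can be sketched as follows (`is_expired` is a hypothetical illustration, not this package's actual code):

```python
import time

def is_expired(item, ttl_column="ttl", now=None):
    """True once the item's TTL epoch has passed, even if DynamoDB's
    sweeper (which can lag up to 48 hours) has not deleted it yet."""
    now = time.time() if now is None else now
    ttl = item.get(ttl_column)
    return ttl is not None and float(ttl) <= now

item = {"cache_key": "my_key", "cache_value": "my_value", "ttl": 1_000}
assert is_expired(item, now=2_000)    # TTL passed: treat the read as a miss
assert not is_expired(item, now=500)  # still fresh
```

Items that fail this check are treated as cache misses even if DynamoDB still returns them.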

Installation

Install the package using pip:

pip install aiocache-dynamodb

Usage

import asyncio
from aiocache_dynamodb import DynamoDBCache

async def main():
    cache = DynamoDBCache(
        table_name="my-cache-table",
        endpoint_url="http://localhost:4566",  # For local development
        aws_access_key_id="your-access-key",
        aws_secret_access_key="your-secret-key",
        region_name="us-east-1",
    )

    # Set a value with a TTL of 60 seconds
    await cache.set("my_key", "my_value", ttl=60)

    # Get the value
    value = await cache.get("my_key")
    print(value)  # Output: my_value

    # Delete the value
    await cache.delete("my_key")

    # Check if the key exists
    exists = await cache.exists("my_key")
    print(exists)  # Output: False

    # Close the cache
    await cache.close()

asyncio.run(main())
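The example above connects to an existing table. Assuming you provision it yourself, the table needs a partition key matching key_column and, for expiry, TTL enabled on ttl_column. A sketch of the required shape using the default column names (the helper is hypothetical; the parameter layout follows the standard DynamoDB create_table API):

```python
def cache_table_params(table_name, key_column="cache_key"):
    """Build create_table parameters: one string partition key, on-demand billing."""
    return {
        "TableName": table_name,
        "AttributeDefinitions": [
            {"AttributeName": key_column, "AttributeType": "S"},
        ],
        "KeySchema": [
            {"AttributeName": key_column, "KeyType": "HASH"},
        ],
        "BillingMode": "PAY_PER_REQUEST",
    }

# Once the table is ACTIVE, enable TTL on the ttl column with any
# DynamoDB client (boto3 or aiobotocore) so DynamoDB can expire items:
# client.update_time_to_live(
#     TableName="my-cache-table",
#     TimeToLiveSpecification={"Enabled": True, "AttributeName": "ttl"},
# )
```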

To use the S3 extension:

import asyncio
from aiocache_dynamodb import DynamoDBCache

async def main():
    cache = DynamoDBCache(
        table_name="my-cache-table",
        bucket_name="this-is-my-bucket",
        region_name="us-east-1",
    )

    large_value = "x" * 1024 * 400  # 400KB
    # Set a value with a TTL of 60 seconds
    # Deletion of item on S3 is not managed by the TTL
    # Please use lifecycle policies on the bucket
    await cache.set("my_key", large_value, ttl=60)

    # Get the value
    value = await cache.get("my_key")
    print(value)  # Output: the original 400KB string

    # Delete the value (both on dynamodb + S3)
    await cache.delete("my_key")

    # Close the cache
    await cache.close()

asyncio.run(main())
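Because the TTL does not clean up the S3 objects, a bucket lifecycle rule can expire them instead, as the comment above suggests. A sketch of such a rule (the helper and rule ID are hypothetical; the dict shape follows the standard S3 put_bucket_lifecycle_configuration API):

```python
def bucket_lifecycle_rule(days=2):
    """A single lifecycle rule expiring every object after `days` days."""
    return {
        "Rules": [
            {
                "ID": "expire-cache-objects",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply to the whole bucket
                "Expiration": {"Days": days},
            }
        ]
    }

# Applied with any S3 client, e.g.:
# s3.put_bucket_lifecycle_configuration(
#     Bucket="this-is-my-bucket",
#     LifecycleConfiguration=bucket_lifecycle_rule(),
# )
```

Pick an expiration comfortably longer than your longest cache TTL so objects are never removed while their DynamoDB entry is still live.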

Configuration

The DynamoDBCache class supports the following parameters:

  • serializer: Serializer to use for serializing and deserializing values (default: aiocache.serializers.StringSerializer).
  • plugins: List of plugins to use (default: []).
  • namespace: Namespace to use for the cache (default: "").
  • timeout: Timeout for cache operations (default: 5).
  • table_name: Name of the DynamoDB table to use for caching.
  • bucket_name: Name of the S3 bucket to use for large object storage (default: None).
  • endpoint_url: Endpoint URL for DynamoDB, useful for LocalStack (default: None).
  • region_name: AWS region (default: "us-east-1").
  • aws_access_key_id: AWS access key ID (default: None).
  • aws_secret_access_key: AWS secret access key (default: None).
  • key_column: Column name for the cache key (default: "cache_key").
  • value_column: Column name for the cache value (default: "cache_value").
  • ttl_column: Column name for the TTL (default: "ttl").
  • s3_key_column: Column name for the S3 key, only used if bucket_name is provided (default: "s3_key").
  • s3_client: Aiobotocore S3 client to use for large object storage; if not provided, one is created lazily on first call or on __aenter__ (default: None).
  • dynamodb_client: Aiobotocore DynamoDB client to use; if not provided, one is created lazily on first call or on __aenter__ (default: None).
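The namespace parameter comes from the aiocache base class and is prepended to stored keys, so multiple logical caches can share one table. A purely illustrative sketch of that prefixing convention (this helper is hypothetical; the real logic lives in aiocache's build_key):

```python
def build_namespaced_key(key, namespace=""):
    """Prefix the key with the cache namespace (empty namespace: key unchanged)."""
    return f"{namespace}{key}"

assert build_namespaced_key("my_key") == "my_key"
assert build_namespaced_key("my_key", namespace="app:") == "app:my_key"
```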

Local Development

We use make to manage the project's commands. You can see the available commands by running this in the root directory:

make

Setup

To set up the project, run the following command:

make dev

This will install the required dependencies for the project using uv + pip.

Linting

We use pre-commit for local linting; it is included in the dev dependencies. We use ruff for linting and formatting, and pyright for static type checking. To install the pre-commit hooks, run:

pre-commit install

If you for some reason hate pre-commit, you can run the following command to lint the code:

make check

Testing

To run tests, you can use the following command:

make test

Under the hood, this will set up LocalStack to replicate the AWS services and run the tests. It will also generate the coverage badge.
