
Python rate limiters backed by Redis


Python Redis Limiters

A library which regulates traffic with respect to concurrency or time. It implements sync and async context managers for a semaphore and a token bucket implementation.

The rate limiters are distributed, using Redis, and leverage Lua scripts to improve performance and simplify the code. Lua scripts run on Redis, making each implementation fully atomic while also reducing the number of round-trips required.

Both standalone Redis instances and clusters are supported. We currently only support Python 3.11, but can add support for older versions if needed.

NOTE:

This project was initially forked from redis-rate-limiters, which was mainly created by Sondre Lillebø Gundersen.

The old project is no longer maintained and only supported Pydantic v1. I plan to add more functionality and to maintain this fork going forward. It is published under py-redis-limiters.

So far I have:

  • migrated to Pydantic v2
  • migrated from poetry to uv
  • migrated from just to mise-en-place
  • changed the pre-commit & build process a bit (e.g., removed black/isort in favor of ruff)
  • tidied up a few types and added types to the tests
  • added a few more tests (with more planned)
  • added default values to the rate limits

Note: The README is currently outdated; I will update it later. For now, check the releases page.

Installation

pip install py-redis-limiters

Usage

Semaphore

The semaphore classes are useful when you have concurrency restrictions; e.g., say you're allowed 5 active requests at a time for a given API token.

Beware that the client will block until the semaphore is acquired or the max_sleep limit is exceeded. If the max_sleep limit is exceeded, a MaxSleepExceededError is raised. Setting max_sleep to 0.0 makes the client sleep indefinitely; this is also the default value.
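As a rough illustration of that blocking behaviour, here is a plain-Python sketch of an acquire loop with a max_sleep cut-off. The names `acquire_with_max_sleep` and `try_acquire` are illustrative, not part of this library's API; it runs without Redis:

```python
import time


class MaxSleepExceededError(Exception):
    """Stand-in mirroring the library's error, for illustration only."""


def acquire_with_max_sleep(try_acquire, max_sleep: float, interval: float = 0.05) -> None:
    # max_sleep == 0.0 means "wait indefinitely", mirroring the library's default.
    start = time.monotonic()
    while not try_acquire():
        if max_sleep > 0.0 and time.monotonic() - start >= max_sleep:
            raise MaxSleepExceededError(f"Waited more than {max_sleep}s to acquire")
        time.sleep(interval)


# A slot that never frees up trips the cut-off:
try:
    acquire_with_max_sleep(lambda: False, max_sleep=0.2)
except MaxSleepExceededError as exc:
    print(exc)
```

The real limiters do the waiting against shared state in Redis; this sketch only shows the timeout semantics.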

Here's how you might use the async version:

import asyncio

from httpx import AsyncClient
from redis.asyncio import Redis

from limiters import AsyncSemaphore

# Every property besides name has a default like below
limiter = AsyncSemaphore(
    name="foo",    # name of the resource you are limiting traffic for
    capacity=5,    # allow 5 concurrent requests
    max_sleep=30,  # raise an error if it takes longer than 30 seconds to acquire the semaphore
    expiry=30,     # set expiry on the semaphore keys in Redis to prevent deadlocks
    connection=Redis.from_url("redis://localhost:6379"),
)

async def get_foo():
    async with AsyncClient() as client:
        async with limiter:
            await client.get(...)


async def main():
    await asyncio.gather(*(get_foo() for _ in range(100)))

and here is how you might use the sync version:

import requests
from redis import Redis

from limiters import SyncSemaphore


limiter = SyncSemaphore(
    name="foo",
    capacity=5,
    max_sleep=30,
    expiry=30,
    connection=Redis.from_url("redis://localhost:6379"),
)

def main():
    with limiter:
        requests.get(...)

Token bucket

The TokenBucket classes are useful if you're working with time-based rate limits. Say you're allowed 100 requests per minute for a given API token.

If the max_sleep limit is exceeded, a MaxSleepExceededError is raised. Setting max_sleep to 0.0 makes the client sleep indefinitely; this is also the default value.
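For a limit like 100 requests per minute, more than one parameterisation gives the same sustained rate, since the rate works out to refill_amount / refill_frequency tokens per second. A quick back-of-the-envelope check (plain Python, no Redis needed; the function name and numbers are illustrative):

```python
def sustained_rate_per_minute(refill_amount: float, refill_frequency: float) -> float:
    """Tokens added per minute once the bucket has been drained."""
    return refill_amount * 60 / refill_frequency


# Bursty: refill the whole bucket once per minute.
bursty = sustained_rate_per_minute(refill_amount=100, refill_frequency=60)

# Smooth: trickle in one token every 0.6 seconds.
smooth = sustained_rate_per_minute(refill_amount=1, refill_frequency=0.6)

# Both settings sustain ~100 requests per minute; they differ only in burstiness.
```

Which shape you want depends on the upstream API: some tolerate bursts up to the bucket capacity, others prefer evenly spaced requests.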

Here's how you might use the async version:

import asyncio

from httpx import AsyncClient
from redis.asyncio import Redis

from limiters import AsyncTokenBucket

# Every property besides name has a default like below
limiter = AsyncTokenBucket(
    name="foo",          # name of the resource you are limiting traffic for
    capacity=5,          # hold up to 5 tokens
    refill_frequency=1,  # add tokens every second
    refill_amount=1,     # add 1 token when refilling
    max_sleep=0,         # raise an error if acquiring a token takes more than this many seconds; 0 sleeps indefinitely
    connection=Redis.from_url("redis://localhost:6379"),
)

async def get_foo():
    async with AsyncClient() as client:
        async with limiter:
            await client.get(...)


async def main():
    await asyncio.gather(*(get_foo() for _ in range(100)))

and here is how you might use the sync version:

import requests
from redis import Redis

from limiters import SyncTokenBucket


limiter = SyncTokenBucket(
    name="foo",
    capacity=5,
    refill_frequency=1,
    refill_amount=1,
    max_sleep=0,
    connection=Redis.from_url("redis://localhost:6379"),
)

def main():
    with limiter:
        requests.get(...)
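To build intuition for the parameters above, here is a rough, non-authoritative sketch of the refill arithmetic a token bucket performs. The actual library does this atomically in a Lua script on Redis; the function name here is illustrative:

```python
def tokens_after(elapsed: float, capacity: int, refill_amount: int,
                 refill_frequency: float, tokens_at_start: int = 0) -> int:
    """Tokens available after `elapsed` seconds, capped at the bucket capacity."""
    refills = int(elapsed // refill_frequency)  # completed refill intervals
    return min(capacity, tokens_at_start + refills * refill_amount)


# With capacity=5, refill_amount=1, refill_frequency=1 (the example above):
print(tokens_after(3, capacity=5, refill_amount=1, refill_frequency=1))   # 3
print(tokens_after(60, capacity=5, refill_amount=1, refill_frequency=1))  # 5 (capped)
```

The cap is what bounds bursts: even after a long idle period, at most `capacity` requests can fire back-to-back before the limiter starts sleeping.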

Using them as a decorator

We don't ship decorators in the package, but if you would like to limit the rate at which a whole function is run, you can create your own, like this:

from limiters import AsyncSemaphore


# Define a decorator function
def limit(name, capacity):
    def middle(f):
        async def inner(*args, **kwargs):
            async with AsyncSemaphore(name=name, capacity=capacity):
                return await f(*args, **kwargs)
        return inner
    return middle


# Then pass the relevant limiter arguments like this
@limit(name="foo", capacity=5)
async def fetch_foo(id: UUID) -> Foo: ...
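The same pattern works for the sync limiters. The sketch below is generic over any context manager, so it runs without Redis; `limit_with` and `fake_limiter` are illustrative names, not part of the library. In practice the factory could be, e.g., `lambda: SyncSemaphore(name="foo", capacity=5)`:

```python
import functools
from contextlib import contextmanager


def limit_with(cm_factory):
    """Run each call to the wrapped function inside a fresh context manager."""
    def middle(f):
        @functools.wraps(f)  # preserve the wrapped function's name and docstring
        def inner(*args, **kwargs):
            with cm_factory():
                return f(*args, **kwargs)
        return inner
    return middle


# Demo with a stand-in limiter that just records acquires and releases:
events = []

@contextmanager
def fake_limiter():
    events.append("acquire")
    yield
    events.append("release")

@limit_with(fake_limiter)
def fetch_foo(x: int) -> int:
    return x * 2

print(fetch_foo(21))  # 42
print(events)         # ['acquire', 'release']
```

Using `functools.wraps` keeps the decorated function's metadata intact, which matters for debugging and for tools that introspect function names.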

Contributing

Contributions are very welcome. Here's how to get started:

  • Clone the repo
  • Install uv
  • Run pre-commit install to set up pre-commit
  • Install just and run `just setup`. If you prefer not to install just, take a look at the justfile and run the commands yourself.
  • Make your code changes, with tests
  • Commit your changes and open a PR

Publishing a new version

To publish a new version:

  • Update the package version in the pyproject.toml
  • Open GitHub releases
  • Press "Draft a new release"
  • Set a tag matching the new version (for example, v0.1.0)
  • Set the title matching the tag
  • Add some release notes, explaining what has changed
  • Publish

Once the release is published, our publish workflow should be triggered to push the new version to PyPI.
