
Python Rate-Limiter using the Leaky-Bucket Algorithm Family

Project description

PyrateLimiter

A request rate limiter using the Leaky-Bucket algorithm



Introduction

This module can be used to apply rate limits to API requests. The user defines a window duration and the maximum number of function calls allowed within that interval. To hold the Bucket's state, you can use MemoryListBucket or MemoryQueueBucket as the internal bucket. To use PyrateLimiter with Redis, redis-py must be installed. It is also possible to use your own Bucket implementation by extending AbstractBucket from pyrate_limiter.core. A quick usage sketch follows the module list below.

Available modules

from pyrate_limiter import (
	BucketFullException,
	Duration,
	RequestRate,
	Limiter,
	MemoryListBucket,
	MemoryQueueBucket,
)
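
As a quick orientation before the strategy walkthrough below, here is a minimal sketch wiring these classes together. The 10 calls/minute limit and the handle_call helper are illustrative assumptions, not part of the library.

from pyrate_limiter import (
	BucketFullException,
	Duration,
	Limiter,
	MemoryQueueBucket,
	RequestRate,
)

# Illustrative limit: at most 10 calls per minute for each identity.
rate = RequestRate(10, Duration.MINUTE)
limiter = Limiter(rate, bucket_class=MemoryQueueBucket)

def handle_call(identity: str) -> bool:
	try:
		# Raises BucketFullException once this identity's limit is exhausted.
		limiter.try_acquire(identity)
	except BucketFullException as err:
		print('Throttled:', err.meta_info)
		return False
	return True  # proceed with the actual work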

Strategies

Subscription strategies

For the throttling logic of typical subscription-based business models, we usually see strategies similar to the following.

Some commercial/free APIs (LinkedIn, GitHub, etc.) enforce limits such as:
- 500 requests/hour, and
- 1000 requests/day, and
- maximum 10,000 requests/month
  • The RequestRate class is designed to describe these strategies. For the limits above, we define a rate limiter as follows:
hourly_rate = RequestRate(500, Duration.HOUR) # maximum 500 requests/hour
daily_rate = RequestRate(1000, Duration.DAY) # maximum 1000 requests/day
monthly_rate = RequestRate(10000, Duration.MONTH) # and so on

limiter = Limiter(hourly_rate, daily_rate, monthly_rate, *other_rates, bucket_class=MemoryListBucket) # default is MemoryQueueBucket

# usage
identity = user_id # or ip-address, or maybe both
limiter.try_acquire(identity)

The logic is mostly self-explanatory; just note that a superior rate limit must come after the inferior ones, i.e. the 1000 req/day limit must be declared after the hourly rate limit, and the daily limit must be larger than the hourly limit.

  • bucket_class is the type of bucket that holds requests. It can be an in-memory data structure such as a Python List (MemoryListBucket) or a Queue (MemoryQueueBucket).

  • For microservices or decentralized platforms, multiple rate limiters may share a single store for request-rate history, e.g. Redis. This lib provides a ready-to-use RedisBucket for such cases and requires redis-py as its peer dependency. The usage difference is that, when using Redis, a naming prefix must be provided so the keys stay distinct for each item's identity.

from redis import ConnectionPool

redis_pool = ConnectionPool.from_url('redis://localhost:6379')

rate = RequestRate(3, 5 * Duration.SECOND)

bucket_kwargs = {
	"redis_pool": redis_pool,
	"bucket_name": "my-ultimate-bucket-prefix"
}

# so each item's bucket will have a key named
# my-ultimate-bucket-prefix__item-identity

limiter = Limiter(rate, bucket_class=RedisBucket, bucket_kwargs=bucket_kwargs)
item = 'vutran_item'
limiter.try_acquire(item)

BucketFullException

If the Bucket is full, a BucketFullException will be raised, carrying meta-info about the identity it received, the rate that was exceeded, and the remaining time until the next request can be processed.

from time import sleep

rate = RequestRate(3, 5 * Duration.SECOND)
limiter = Limiter(rate)
item = 'vutran'

has_raised = False
try:
	for _ in range(4):
		limiter.try_acquire(item)
		sleep(1)
except BucketFullException as err:
	has_raised = True
	assert str(err)
	# Bucket for vutran with Rate 3/5 is already full
	assert isinstance(err.meta_info, dict)
	# {'error': 'Bucket for vutran with Rate 3/5 is already full', 'identity': 'vutran', 'rate': '3/5', 'remaining_time': 2}

  • *A RequestRate may need to reset on a fixed schedule, e.g. on the first day of every month.

Spam-protection strategies

  • Sometimes we need a rate limiter to protect our API from spamming/DDoS attacks. Typical strategies look like the following (a sketch of both is shown after the list):
1. No more than 100 requests/minute, or
2. 100 requests per minute, and no more than 300 requests per hour
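
Both variants map directly onto the RequestRate/Limiter API shown earlier. Here is a minimal sketch; the limits and the example IP address are illustrative.

from pyrate_limiter import Duration, Limiter, RequestRate

# Strategy 1: a single limit of 100 requests per minute.
minute_rate = RequestRate(100, Duration.MINUTE)
simple_limiter = Limiter(minute_rate)

# Strategy 2: 100 requests/minute AND no more than 300 requests/hour.
# As noted earlier, the smaller window must be declared first.
hourly_rate = RequestRate(300, Duration.HOUR)
layered_limiter = Limiter(minute_rate, hourly_rate)

# Identify callers by IP address (or any other identity).
layered_limiter.try_acquire('203.0.113.7')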

Throttling handling

When the number of incoming requests goes beyond the limit, we can do one of the following (both options are sketched after the list):

1. Raise an HTTP 429 error, or
2. Keep the incoming requests, wait, then slowly process them one by one.
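
Here is a hedged sketch of both options. Flask is used purely for illustration and is not part of PyrateLimiter; the route, the per-IP identity, and the rate numbers are assumptions. The documented meta_info['remaining_time'] serves as a Retry-After hint in option 1 and as the wait interval in option 2.

from time import sleep

from flask import Flask, jsonify, request
from pyrate_limiter import BucketFullException, Duration, Limiter, RequestRate

app = Flask(__name__)
limiter = Limiter(RequestRate(100, Duration.MINUTE))

# Option 1: reject over-limit requests with HTTP 429.
@app.route('/api/resource')
def get_resource():
	identity = request.remote_addr  # assumption: rate-limit per client IP
	try:
		limiter.try_acquire(identity)
	except BucketFullException as err:
		retry_after = err.meta_info.get('remaining_time', 1)
		return jsonify(err.meta_info), 429, {'Retry-After': str(retry_after)}
	return jsonify({'status': 'ok'})

# Option 2: keep the request and wait until a slot frees up.
def process_slowly(identity: str) -> None:
	while True:
		try:
			limiter.try_acquire(identity)
			break  # a slot was acquired; go ahead and process
		except BucketFullException as err:
			sleep(err.meta_info.get('remaining_time', 1))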

More complex scenario

https://www.keycdn.com/support/rate-limiting#types-of-rate-limits

  • *Sometimes we may need to apply specific rate-limiting strategies based on schedule, region, or other metrics. This requires the ability to switch strategies instantly without re-deploying the whole service (a rough approximation with the current API is sketched below).
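
This is marked as planned work (see the Notes below). A rough approximation with the current API is to keep one Limiter per region in a registry that is rebuilt whenever the configuration changes; the region keys and config layout here are assumptions for illustration.

from pyrate_limiter import Duration, Limiter, MemoryQueueBucket, RequestRate

# Hypothetical per-region configuration: {region: [(limit, interval), ...]}
REGION_RATES = {
	'eu': [(100, Duration.MINUTE), (1000, Duration.HOUR)],
	'us': [(300, Duration.MINUTE)],
}

def build_limiters(config):
	"""Rebuild one Limiter per region from a plain config mapping."""
	return {
		region: Limiter(
			*(RequestRate(limit, interval) for limit, interval in rates),
			bucket_class=MemoryQueueBucket,
		)
		for region, rates in config.items()
	}

limiters = build_limiters(REGION_RATES)

def throttle(region, identity):
	# Raises BucketFullException when this region's limits are exceeded.
	limiters[region].try_acquire(identity)

# When the configuration changes (e.g. pushed from a config service),
# rebuild the registry in place instead of redeploying:
# limiters = build_limiters(new_config)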

Notes

To-do items marked with (*) are planned for the v3 release.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pyrate-limiter-2.1.0.tar.gz (7.9 kB)

Uploaded Source

Built Distribution

pyrate_limiter-2.1.0-py3-none-any.whl (8.2 kB)

Uploaded Python 3

File details

Details for the file pyrate-limiter-2.1.0.tar.gz.

File metadata

  • Download URL: pyrate-limiter-2.1.0.tar.gz
  • Upload date:
  • Size: 7.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.0.10 CPython/3.7.5 Darwin/20.3.0

File hashes

Hashes for pyrate-limiter-2.1.0.tar.gz
  • SHA256: 2555fbefc475e4886e041d200cd9602a587aba57be6e497f3d62460dcd056024
  • MD5: 67ce2ad6ad861726b243b68573f1fc10
  • BLAKE2b-256: d9ec303783ce97626078ccb963bb9a36e218e58ac8ddbd36df4c7e95dc639fea

See more details on using hashes here.

File details

Details for the file pyrate_limiter-2.1.0-py3-none-any.whl.

File metadata

  • Download URL: pyrate_limiter-2.1.0-py3-none-any.whl
  • Upload date:
  • Size: 8.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.0.10 CPython/3.7.5 Darwin/20.3.0

File hashes

Hashes for pyrate_limiter-2.1.0-py3-none-any.whl
  • SHA256: b6908b0ca8c718668fdb402e9a6562a3a7098a27f2222888a20fa835b819fd2f
  • MD5: 0c7d3b0032a27466f15eaa8c2bcca45c
  • BLAKE2b-256: 3c83ab26843ea4600fecb9a989699ab3fd4f1bf5988ad36c7cdc9f07c14d3e1f

See more details on using hashes here.
