Project description

Redis GT

Global throttling with Redis.

installation

pip install redis-gt
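
To confirm the install and that a local Redis is reachable, a one-liner like the following can help (the host and port are assumptions; adjust them to your setup):

python -c "import redis_gt, redis; redis.StrictRedis.from_url('redis://127.0.0.1:6379').ping(); print('ok')"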

usage

example: making a request through a throttle

import redis_gt
import requests

redis_gt.Defaults.redis_url = 'redis://127.0.0.1:6379'

def get_members(**kwargs):
    # Allow at most 20 concurrent calls across all processes, coordinated via Redis.
    throttle = redis_gt.Throttle('backend_api_call', 20)
    # requests needs a full URL; example.com is a placeholder.
    return throttle.run(requests.get, 'https://example.com/api/members/', **kwargs)
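
A call site then looks like any ordinary request; each call holds one of the 20 slots for the duration of the HTTP call (the timeout below is just an illustrative keyword argument):

response = get_members(timeout=5)
print(response.status_code)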

example: the same with asyncio

import asyncio
import redis_gt
import requests

redis_gt.Defaults.redis_url = 'redis://127.0.0.1:6379'

async def get_members(**kwargs):
    def _get():
        # requests needs a full URL; example.com is a placeholder.
        return requests.get('https://example.com/api/members/', **kwargs)
    loop = asyncio.get_event_loop()
    throttle = redis_gt.AsyncThrottle('backend_api_call', 20)
    # Run the blocking HTTP call in the default thread pool executor.
    return await throttle.run(loop.run_in_executor(None, _get))
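
Driving the async variant only needs a running event loop; on Python 3.7+ that can be as simple as the following (a usage sketch, nothing redis_gt-specific):

async def main():
    response = await get_members(timeout=5)
    print(response.status_code)

asyncio.run(main())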

throttling

import redis_gt
import time
from threading import Thread

redis_gt.Defaults.redis_url = 'redis://127.0.0.1:6379'

def do_something():
    time.sleep(1.0)

def do_something_with_throttle():
    # Allow at most 10 in parallel; the connection comes from Defaults.redis_url.
    throttle = redis_gt.Throttle('something', 10)
    throttle.run(do_something)

# 100 tasks of 1 second each.
threads = [Thread(target=do_something_with_throttle) for _ in range(100)]
for t in threads:
    t.start()

# Takes about 10 seconds (100 tasks / 10 in parallel).
for t in threads:
    t.join()
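
For context, a global limit like this can be built on a Redis sorted set: each worker registers itself, and only the first N entries are allowed to proceed. The sketch below illustrates that general idea with plain redis-py; run_limited, the 60-second staleness window, and the retry interval are illustrative choices, not necessarily how redis_gt implements it internally.

import time
import uuid
import redis

r = redis.StrictRedis.from_url('redis://127.0.0.1:6379')

def run_limited(name, limit, func, *args, **kwargs):
    """Run func while holding one of `limit` global slots named `name`."""
    member = uuid.uuid4().hex
    try:
        while True:
            now = time.time()
            # Drop entries older than 60 seconds (e.g. from crashed workers).
            r.zremrangebyscore(name, 0, now - 60)
            # Register this worker, scored by the current time.
            r.zadd(name, {member: now})
            # Only the first `limit` registrants may run; others back off and retry.
            if r.zrank(name, member) < limit:
                return func(*args, **kwargs)
            r.zrem(name, member)
            time.sleep(0.1)
    finally:
        # Release the slot whether func succeeded or raised.
        r.zrem(name, member)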

throttling with asyncio

import asyncio

import redis_gt
from redis_gt import AsyncThrottle

redis_gt.Defaults.redis_url = 'redis://127.0.0.1:6379'

async def do_something():
    await asyncio.sleep(1.0)

# Allow at most 10 in parallel; the connection comes from Defaults.redis_url.
throttle = AsyncThrottle('something', 10)

# 100 tasks of 1 second each.
tasks = [throttle.run(do_something()) for _ in range(100)]
loop = asyncio.get_event_loop()

# Takes about 10 seconds (100 tasks / 10 in parallel).
loop.run_until_complete(asyncio.gather(*tasks))
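
On Python 3.7 and later, the same run can be written without touching the event loop directly (equivalent to the snippet above):

async def main():
    await asyncio.gather(*(throttle.run(do_something()) for _ in range(100)))

asyncio.run(main())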

decorator

import asyncio
import time
from threading import Thread

import redis
from redis_gt.decorators import throttle

r = redis.StrictRedis.from_url('redis://127.0.0.1:6379')

# for a sync function
@throttle('something', 10, redis=r)
def do_something():
    time.sleep(1.0)

# 100 tasks of 1 second each.
threads = [Thread(target=do_something) for _ in range(100)]
for t in threads:
    t.start()

# Takes about 10 seconds (100 tasks / 10 in parallel).
for t in threads:
    t.join()

# for an async function
@throttle('something', 10, redis=r)
async def do_something_async():
    await asyncio.sleep(1.0)

# 100 tasks of 1 second each.
tasks = [do_something_async() for _ in range(100)]
loop = asyncio.get_event_loop()

# Takes about 10 seconds (100 tasks / 10 in parallel).
loop.run_until_complete(asyncio.gather(*tasks))
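
The decorated functions stay ordinary callables; a single direct call simply waits for one of the 10 slots before running, as the timing comments above suggest:

# Sync: blocks until a slot is free, then sleeps for 1 second.
do_something()

# Async: same behaviour, but awaitable.
asyncio.run(do_something_async())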

Download files

Download the file for your platform.

Source Distribution

redis_gt-1.2.0.tar.gz (3.5 kB)

Uploaded Source

Built Distribution

redis_gt-1.2.0-py3-none-any.whl (5.9 kB)

Uploaded Python 3

File details

Details for the file redis_gt-1.2.0.tar.gz.

File metadata

  • Download URL: redis_gt-1.2.0.tar.gz
  • Upload date:
  • Size: 3.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.4.2 requests/2.20.1 setuptools/40.5.0 requests-toolbelt/0.8.0 tqdm/4.28.1 CPython/3.7.0

File hashes

Hashes for redis_gt-1.2.0.tar.gz
Algorithm Hash digest
SHA256 15a05af278fcf110ca09a865b6679be966be6952f9f12e4fafed30883406f759
MD5 fb5c104c81271ea226a827da104da902
BLAKE2b-256 63c0a8c3277e57c6cb4592f70879195dd047ec9b6f7e664d7401ad65b1bc8a73


File details

Details for the file redis_gt-1.2.0-py3-none-any.whl.

File metadata

  • Download URL: redis_gt-1.2.0-py3-none-any.whl
  • Upload date:
  • Size: 5.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.4.2 requests/2.20.1 setuptools/40.5.0 requests-toolbelt/0.8.0 tqdm/4.28.1 CPython/3.7.0

File hashes

Hashes for redis_gt-1.2.0-py3-none-any.whl
Algorithm Hash digest
SHA256 cd10ffb922e48870c90c26b9c59057ff8110c4132d51eac7ecdd13224fbd9cae
MD5 8dfa6391ab64b65f81cf49bc92346069
BLAKE2b-256 1e417d9175d88ac021793b9d2b5dd819299464d78f540f3fd74277b23186fe4d

