
Reducer for similar simultaneous coroutines

Project description


About Async-Reduce

async_reduce(coroutine) aggregates all similar simultaneous ready-to-run coroutines and reduces them to a single running coroutine. The other aggregated coroutines receive the result from that single coroutine.

It can boost application performance under highly concurrent execution of similar asynchronous operations and reduce the load on inner systems.

Quick example

from async_reduce import async_reduce


async def fetch_user_data(user_id: int) -> dict:
    """" Get user data from inner service """
    url = 'http://inner-service/user/{}'.format(user_id)

    return await http.get(url, timeout=10).json()


@web_server.router(r'/users/(\d+)')
async def handler_user_detail(request, user_id: int):
    """ Handler for get detail information about user """

    # all simultaneous requests fetching user data for `user_id` will be
    # reduced to a single request
    user_data = await async_reduce(
        fetch_user_data(user_id)
    )

    # sometimes ``async_reduce`` cannot detect similar coroutines and
    # you should pass the special argument `ident` to identify them manually
    user_statistics = await async_reduce(
        DataBase.query('user_statistics').where(id=user_id).fetch_one(),
        ident='db_user_statistics:{}'.format(user_id)
    )

    return Response(...)

In this example, without async_reduce, if a client performs N simultaneous requests like GET http://web_server/users/42, web_server performs N requests to inner-service and N queries to the database. In total: N simultaneous requests emit 2 * N requests to inner systems.

With async_reduce, if a client performs N simultaneous requests, web_server performs one request to inner-service and one query to the database. In total: N simultaneous requests emit only 2 requests to inner systems.
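As a rough self-check (not taken from the library's docs), the sketch below counts how many times the underlying coroutine actually runs when ten callers await it concurrently; fake_fetch_user_data and the module-level counter are invented for the demo and assume async_reduce is installed:

import asyncio

from async_reduce import async_reduce

CALLS = 0  # counts real executions of the underlying coroutine


async def fake_fetch_user_data(user_id: int) -> dict:
    """ Illustrative stand-in for a slow call to an inner service """
    global CALLS
    CALLS += 1
    await asyncio.sleep(0.1)
    return {'id': user_id, 'name': 'user-{}'.format(user_id)}


async def main() -> None:
    # ten simultaneous awaits for the same user should be reduced to a
    # single execution of fake_fetch_user_data
    results = await asyncio.gather(
        *(async_reduce(fake_fetch_user_data(42)) for _ in range(10))
    )
    assert all(result == results[0] for result in results)
    print('real executions:', CALLS)  # expected: 1


asyncio.run(main())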

See other real examples.

Similar coroutines determination

async_reduce(coroutine) tries to detect similar coroutines by hashing the local variables bound at the call. It does not work correctly if:

  • one of the arguments is not hashable
  • the coroutine function is a method of a class with its own state (like an ORM model)
  • the coroutine function closes over an unhashable variable

You can disable auto-detection by passing a custom key via the ident argument.
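For example (an illustrative sketch; repository and Response are made-up placeholders in the spirit of the quick example above), a call through a stateful object or with an unhashable argument can be keyed explicitly:

from async_reduce import async_reduce


async def handler_user_orders(request, user_id: int):
    # `repository` is a hypothetical stateful object and the filter dict is
    # unhashable, so auto-detection may not group these calls; the explicit
    # `ident` makes the grouping deterministic
    orders = await async_reduce(
        repository.fetch_orders({'user_id': user_id, 'status': 'active'}),
        ident='user_orders:{}'.format(user_id)
    )

    return Response(...)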

Use as a decorator

The library also provides the special decorator @async_reduceable(), for example:

from async_reduce import async_reduceable


@async_reduceable()
async def fetch_user_data(user_id: int) -> dict:
    """" Get user data from inner service """
    url = 'http://inner-servicce/user/{}'.format(user_id)

    return await http.get(url, timeout=10).json()


@web_server.router(r'/users/(\d+)')
async def handler_user_detail(request, user_id: int):
    """ Handler for get detail information about user """
    return await fetch_user_data(user_id)

Hooks

The library supports hooks. Bundled hooks:

  • DebugHooks - prints a message for every triggered hook
  • StatisticsOverallHooks - overall statistics on the use of async_reduce
  • StatisticsDetailHooks - like StatisticsOverallHooks, but with detailed statistics for every coroutine processed by async_reduce

Example:

from async_reduce import AsyncReducer
from async_reduce.hooks import DebugHooks

# define custom async_reduce with hooks
async_reduce = AsyncReducer(hooks=DebugHooks())


async def handler_user_detail(request, user_id: int):
    user_data = await async_reduce(fetch_user_data(user_id))

See a more detailed example in examples/example_hooks.py.

You can write custom hooks by inheriting from BaseHooks.
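A custom hooks class might look roughly like the sketch below; the on_* method names are hypothetical placeholders (consult BaseHooks for the actual hook points), and importing BaseHooks from async_reduce.hooks is an assumption based on where DebugHooks lives:

import logging

from async_reduce import AsyncReducer
from async_reduce.hooks import BaseHooks

logger = logging.getLogger('async_reduce')


class LoggingHooks(BaseHooks):
    """ Illustrative hooks; the method names below are assumed, not verified """

    def on_apply_for(self, coro, ident) -> None:
        # hypothetical hook point: a coroutine entered the reducer
        logger.debug('reduce apply: %s', ident)

    def on_exception_for(self, coro, ident, exception) -> None:
        # hypothetical hook point: the executed coroutine raised
        logger.warning('reduce error for %s: %r', ident, exception)


async_reduce = AsyncReducer(hooks=LoggingHooks())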

Caveats

  • If the single executed coroutine raises an exception, all aggregated coroutines get the same exception too.

  • If the single executed coroutine gets stuck, all aggregated coroutines get stuck too. Limit the execution time of the coroutine and optionally add retries to avoid this (see the sketch after this list).

  • Be careful when returning a mutable value from a coroutine, because the single value will be shared among all aggregated coroutines. Prefer returning immutable values.
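For the second caveat, one way to bound the waiting time (a sketch using only standard asyncio, with an arbitrary 10-second limit and the fetch_user_data coroutine from the quick example; retries are left to the caller) is to wrap the reduced awaitable in asyncio.wait_for:

import asyncio

from async_reduce import async_reduce


async def fetch_user_data_bounded(user_id: int) -> dict:
    # every waiter times out after 10 seconds instead of hanging forever
    # if the single shared coroutine gets stuck
    return await asyncio.wait_for(
        async_reduce(fetch_user_data(user_id)),
        timeout=10,
    )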

Development

See DEVELOPMENT.md.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

async_reduce-1.3.tar.gz (13.0 kB)

Uploaded Source

Built Distribution

async_reduce-1.3-py3-none-any.whl (12.2 kB)

Uploaded Python 3

File details

Details for the file async_reduce-1.3.tar.gz.

File metadata

  • Download URL: async_reduce-1.3.tar.gz
  • Upload date:
  • Size: 13.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.12.0

File hashes

Hashes for async_reduce-1.3.tar.gz
Algorithm Hash digest
SHA256 9f2811d1e13f333d4f7f6801e3993ffe1b6082503e272e28189677431729d3bb
MD5 24f3f40320f606e9925b4b3fcd3e1a85
BLAKE2b-256 39c63b183a249e2085f3c0c6ffe71711933b28879b72de2f5fe08be6ed290cdd

See more details on using hashes here.

File details

Details for the file async_reduce-1.3-py3-none-any.whl.

File metadata

  • Download URL: async_reduce-1.3-py3-none-any.whl
  • Upload date:
  • Size: 12.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.12.0

File hashes

Hashes for async_reduce-1.3-py3-none-any.whl
Algorithm Hash digest
SHA256 c1fc6d5a9e494c0e3b72ba5fa65b7963c86f76d3fb35ee3d2691a69435ec0521
MD5 e2075d07f28086c585d40a17488d1d8d
BLAKE2b-256 98a9bebbb862a7ed384d6ba4faaf4ac8a0ac777b471f1efb3d2882b7487dd684

See more details on using hashes here.
