many_requests
Dead easy interface for executing many HTTP requests asynchronously. It has been tested in the wild with over 10 million requests. Errors are handled automatically and failed requests are retried.
Built on top of Trio and asks. The interface is heavily inspired by Requests and joblib.
Also provides helper functions for executing embarrassingly parallel async coroutines.
To install:
pip install many-requests
Example Usage
Execute 10 GET requests for example.org:
from many_requests import ManyRequests
responses = ManyRequests(n_workers=5, n_connections=5)(
    method='GET',
    url=['https://example.org' for i in range(10)])
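Each element of responses corresponds to one request. A minimal sketch of inspecting the results, assuming each element is an asks Response object exposing status_code and text (failed requests may be represented differently):
# Sketch only: assumes each element of `responses` behaves like an asks
# Response with `status_code` and `text` attributes.
for response in responses:
    if response.status_code == 200:
        print(len(response.text))  # size of the returned body
    else:
        print('Request failed with status', response.status_code)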
Query the Hacker News API for 10 items and parse the JSON output:
responses = ManyRequests(n_workers=5, n_connections=5, json=True)(
    method='GET',
    url=[f'https://hacker-news.firebaseio.com/v0/item/{i}.json?print=pretty' for i in range(10)])
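With json=True each result is the parsed JSON body rather than a raw response object. A minimal sketch of reading it, assuming each existing Hacker News item parses to a dict (non-existent items may parse to None):
# Sketch only: assumes parsed JSON items; missing items are skipped.
for item in responses:
    if item is not None:
        print(item.get('id'), item.get('type'), item.get('by'))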
To use basic authentication with all requests:
from asks import BasicAuth
username = 'user'
password = 'pw'
responses = ManyRequests(n_workers=5, n_connections=5)(
    method='GET',
    url=['https://example.org' for i in range(10)],
    auth=BasicAuth((username, password)))
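Other request arguments can be passed in the same way. A minimal sketch with a shared headers dict, assuming ManyRequests accepts a headers keyword and forwards it to every request (the User-Agent value is purely illustrative):
# Sketch only: the `headers` keyword and its value are assumptions.
responses = ManyRequests(n_workers=5, n_connections=5)(
    method='GET',
    url=['https://example.org' for i in range(10)],
    headers={'User-Agent': 'many-requests-example'})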
To execute embarrassingly parallel async coroutines, for example 10 trio.sleep calls:
from many_requests import EasyAsync, delayed
import trio
outputs = EasyAsync(n_workers=4)(delayed(trio.sleep)(i) for i in range(10))
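The same pattern works for any coroutine function, not just trio.sleep. A minimal sketch using a hypothetical async helper:
# Sketch only: `wait_then_square` is a made-up coroutine used to show
# wrapping your own async function with delayed().
async def wait_then_square(x):
    await trio.sleep(0.1)  # simulate some async work
    return x * x
outputs = EasyAsync(n_workers=4)(delayed(wait_then_square)(i) for i in range(10))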