Mucri
Quickly fetch a lot of pages/APIs using Python asyncio.
Installation
Requires Python 3.6+.
pip install mucri
Usage
fetch_pages
takes two arguments:
links: a list of links to fetch (example below)
concurrency: how many requests to send at a time (default 20)
from mucri import fetch_pages
# links can be a single string or a dict with specific instructions
links = [
    "http://meain.github.io",
    {"url": "http://somelink"},
    {
        "url": "http://fakelink",
        "action": "get",  # get | post
        "data": {},
        "headers": {},
        "resp_type": "text",  # text | json | image
    },
]

results = fetch_pages(links)  # fetches all of them asynchronously
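Under the hood, concurrency-limited fetching like this is typically built on asyncio with a semaphore. Below is a minimal, self-contained sketch of that pattern; it is not mucri's actual implementation, and the `fake_fetch` coroutine merely stands in for a real HTTP request:

```python
import asyncio

async def fake_fetch(url):
    # Stand-in for a real HTTP request; just echoes the URL.
    await asyncio.sleep(0.01)
    return f"response from {url}"

async def fetch_all(links, concurrency=20):
    # The semaphore caps how many requests run at once.
    sem = asyncio.Semaphore(concurrency)

    async def bounded(url):
        async with sem:
            return await fake_fetch(url)

    # Results come back in the same order as the input links.
    return await asyncio.gather(*(bounded(u) for u in links))

links = [f"http://example.com/{i}" for i in range(5)]
# Note: asyncio.run requires Python 3.7+; on 3.6 use
# asyncio.get_event_loop().run_until_complete(...) instead.
results = asyncio.run(fetch_all(links, concurrency=2))
print(results[0])  # response from http://example.com/0
```

With `concurrency=2`, at most two of the five "requests" are in flight at any moment, while `asyncio.gather` preserves the input order in the results.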