Async wrapper for requests / aiohttp, plus some Python crawler toolkits, letting synchronous code enjoy the performance of asynchronous programming. Read more: https://github.com/ClericPy/torequests.
torequests
Briefly speaking: a requests / aiohttp wrapper for asynchronous-programming beginners, designed to cut down on boilerplate code.
To install:
pip install torequests -U
requirements:
| requests
| futures # python2
| aiohttp >= 3.0.5 # python3
| uvloop # python3
optional:
| jsonpath_rw_ext
| lxml
| objectpath
| psutil
| fuzzywuzzy
| python-Levenshtein
| pyperclip
Features
Inspired by tomorrow, to make async coding brief & smooth, compatible with win32 / Python 2 & 3.
- convert any function into async mode with concurrent.futures
- wrap the requests module in futures
- simplify aiohttp, making it requests-like
- some crawler toolkits
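To illustrate the first feature, here is a minimal sketch of how a thread-pool decorator can be built on concurrent.futures. This is an illustration only, not the actual torequests implementation (the name threads_sketch is made up for this example):

```python
# A minimal sketch of a @threads-style decorator built on concurrent.futures.
# Illustrative only; the real torequests implementation differs.
import time
from concurrent.futures import ThreadPoolExecutor
from functools import wraps

def threads_sketch(n):
    """Decorate a function so every call is submitted to a shared pool of n workers."""
    pool = ThreadPoolExecutor(n)

    def wrapper(func):
        @wraps(func)
        def inner(*args, **kwargs):
            # Returns a Future immediately instead of blocking.
            return pool.submit(func, *args, **kwargs)
        return inner
    return wrapper

@threads_sketch(2)
def slow_add(a, b):
    time.sleep(0.1)
    return a + b

future = slow_add(1, 2)  # returns instantly with a Future
print(future.result())   # blocks until the worker finishes -> 3
```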
Getting started
1. Async, threads - make functions asynchronous
from torequests import threads, Async
import time

@threads(5)
def test1(n):
    time.sleep(n)
    return 'test1 ok'

def test2(n):
    time.sleep(n)
    return 'test2 ok'

start = int(time.time())
# here async_test2 behaves the same way as test1
async_test2 = Async(test2)
future = test1(1)
# the future runs in a non-blocking thread pool
print(future, ', %s s passed' % (int(time.time() - start)))
# accessing future.x blocks the main thread and returns future.result()
print(future.x, ', %s s passed' % (int(time.time() - start)))

# output:
# <NewFuture at 0x34b1d30 state=running> , 0 s passed
# test1 ok , 1 s passed
2. tPool - thread pool for async-requests
from torequests.main import tPool
from torequests.logs import print_info

req = tPool()
test_url = 'http://p.3.cn'
ss = [
    req.get(
        test_url,
        retry=2,
        callback=lambda x: (len(x.content), print_info(len(x.content))))
    for i in range(3)
]
# req.x waits for all requests; or use [i.x for i in ss]
req.x
ss = [i.cx for i in ss]
print_info(ss)

# output:
# [2019-04-01 00:19:07] temp_code.py(10): 612
# [2019-04-01 00:19:07] temp_code.py(10): 612
# [2019-04-01 00:19:07] temp_code.py(10): 612
# [2019-04-01 00:19:07] temp_code.py(16): [(612, None), (612, None), (612, None)]
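The callback= pattern above can be approximated with plain concurrent.futures via add_done_callback. This is a stdlib analogue for illustration, not the torequests API (fetch and on_done are made-up names):

```python
# A rough stdlib analogue of the callback= pattern, using concurrent.futures.
# Illustrative only; not the torequests API.
from concurrent.futures import ThreadPoolExecutor

pool = ThreadPoolExecutor(3)
results = []

def fetch(n):
    return n * 2  # stand-in for an HTTP request

def on_done(future):
    # Runs as soon as the future finishes.
    results.append(future.result())

futures = [pool.submit(fetch, i) for i in range(3)]
for f in futures:
    f.add_done_callback(on_done)
pool.shutdown(wait=True)  # roughly what req.x does: wait for everything
print(sorted(results))    # [0, 2, 4]
```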
2.1 Testing the performance (win32 + Python 3.7).
from torequests import tPool
import time

start_time = time.time()
trequests = tPool()
list1 = [
    trequests.get('http://127.0.0.1:5000/test/%s' % num) for num in range(5000)
]
# If a request fails, i.x returns False by default;
# this can be changed with the fail_return argument.
list2 = [i.x.text if i.x else 'fail' for i in list1]
end_time = time.time()
print(list2[:5], '\n5000 requests time cost: %s s' % (end_time - start_time))

# output:
# ['test ok 0', 'test ok 1', 'test ok 2', 'test ok 3', 'test ok 4']
# 5000 requests time cost: 5.906721591949463 s
3. Requests - aiohttp wrapper
from torequests.dummy import Requests
from torequests.logs import print_info

trequests = Requests(frequencies={'p.3.cn': (2, 2)})
ss = [
    trequests.get(
        'http://p.3.cn', retry=1, timeout=5,
        callback=lambda x: (len(x.content), print_info(trequests.frequencies)))
    for i in range(4)
]
trequests.x
ss = [i.cx for i in ss]
print_info(ss)

# output:
# [2019-04-01 00:16:35] temp_code.py(7): {'p.3.cn': Frequency(sem=<1/2>, interval=2)}
# [2019-04-01 00:16:35] temp_code.py(7): {'p.3.cn': Frequency(sem=<0/2>, interval=2)}
# [2019-04-01 00:16:37] temp_code.py(7): {'p.3.cn': Frequency(sem=<2/2>, interval=2)}
# [2019-04-01 00:16:37] temp_code.py(7): {'p.3.cn': Frequency(sem=<2/2>, interval=2)}
# [2019-04-01 00:16:37] temp_code.py(12): [<NewResponse [200]>, <NewResponse [200]>, <NewResponse [200]>, <NewResponse [200]>]
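The frequencies={'p.3.cn': (2, 2)} setting above limits requests per host: at most 2 concurrent requests, each slot held for at least 2 seconds. A hedged sketch of that idea with an asyncio.Semaphore (not the actual torequests internals; limited_fetch is a made-up name):

```python
# A sketch of per-host frequency control: at most `sem` concurrent calls,
# each slot held for at least `interval` seconds. Not the torequests internals.
import asyncio
import time

async def limited_fetch(url, sem, interval):
    async with sem:
        started = time.monotonic()
        result = 'fetched %s' % url        # stand-in for the real request
        elapsed = time.monotonic() - started
        if elapsed < interval:
            # Hold the semaphore slot until the interval has passed.
            await asyncio.sleep(interval - elapsed)
        return result

async def main():
    sem = asyncio.Semaphore(2)             # like Frequency(sem=<2/2>, interval=2)
    tasks = [limited_fetch('http://example.com/%s' % i, sem, 0.1)
             for i in range(4)]
    return await asyncio.gather(*tasks)

results = asyncio.run(main())
print(len(results))  # 4
```

With a semaphore of 2 and a 0.1 s interval, the four calls finish in roughly two batches, mirroring the sem=<0/2> to sem=<2/2> transitions in the log above.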
3.1 On win32 + Python 3.7, 5000 requests took about 3.9 s; this may be much faster with uvloop.
from torequests.dummy import Requests
import time

start_time = time.time()
trequests = Requests()
list1 = [
    trequests.get('http://127.0.0.1:5000/test/%s' % num) for num in range(5000)
]
# If a request fails, i.x returns False by default;
# this can be changed with the fail_return argument.
list2 = [i.x.text if i.x else 'fail' for i in list1]
end_time = time.time()
print(list2[:5], '\n5000 requests time cost: %s s' % (end_time - start_time))

# output:
# win32, without uvloop:
# ['test ok 0', 'test ok 1', 'test ok 2', 'test ok 3', 'test ok 4']
# 5000 requests time cost: 3.909820079803467 s
3.2 Using torequests.dummy.Requests in an async environment.
import asyncio

from responder import API
from torequests.dummy import Requests

loop = asyncio.get_event_loop()
api = API()

@api.route('/')
async def index(req, resp):
    # awaits the request; yields the response or a FailureException
    r = await api.req.get('http://p.3.cn', timeout=(1, 1))
    print(r)
    if r:
        # a good response has a status_code between 200 and 299
        resp.text = 'ok' if 'Welcome to nginx!' in r.text else 'bad'
    else:
        resp.text = 'fail'

if __name__ == "__main__":
    api.req = Requests(loop=loop)
    api.run(port=5000, loop=loop)
3.3 mock server source code
from gevent.monkey import patch_all
patch_all()
import bottle

app = bottle.Bottle()

@app.get('/test/<num>')
def test(num):
    return 'test ok %s' % num

app.run(server='gevent', port=5000)
4. utils: some useful crawler toolkits
ClipboardWatcher: watch the clipboard for changes.
Counts: a counter incremented on every call.
Null: returns itself when called and is always falsy.
Regex: regex mapper for string -> regex -> object.
Saver: simple object-persistence toolkit using pickle/json.
Timer: timing tool.
UA: common User-Agent strings for crawlers.
curlparse: translate a curl string into a dict of request arguments.
md5: str(obj) -> md5 string.
print_mem: show process memory usage with psutil, for convenience only.
ptime: %Y-%m-%d %H:%M:%S -> timestamp.
ttime: timestamp -> %Y-%m-%d %H:%M:%S.
slice_by_size: slice a sequence into chunks of a given size, returned as a generator.
slice_into_pieces: slice a sequence into n pieces, returned as a generator.
timeago: show a number of seconds in human-readable form.
unique: deduplicate a sequence.
find_one: JavaScript-like regex matching that returns one string by index (like [0], [1]).
...
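As a taste of what these helpers do, here is a stdlib sketch of two of them. The signatures are guesses from the descriptions above; the real torequests.utils functions may differ:

```python
# Illustrative stdlib versions of the slice_by_size / unique helpers described
# above; the actual torequests.utils signatures may differ.
def slice_by_size(seq, size):
    """Yield consecutive chunks of `seq`, each at most `size` items long."""
    for start in range(0, len(seq), size):
        yield seq[start:start + size]

def unique(seq):
    """Deduplicate while preserving first-seen order."""
    seen = set()
    return [x for x in seq if not (x in seen or seen.add(x))]

print(list(slice_by_size([1, 2, 3, 4, 5], 2)))  # [[1, 2], [3, 4], [5]]
print(unique([3, 1, 3, 2, 1]))                  # [3, 1, 2]
```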
Documentation
License
Benchmarks
to be continued......