Distributed script crawler framework.

Project description

# Simply connecting redis on the worker side gives the library the power
# to execute distributed scripts.
# Every function call on the sender side is piped into redis as a task,
# and workers pull tasks out of the pipe and execute them.
# Multiple tasks can run simultaneously: each run is assigned a taskid,
# and concurrent tasks keep their configuration spaces separate by taskid.
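
The sender/worker flow described above can be sketched with an in-process queue standing in for redis. This is only an illustration of the pattern (serialize a call to JSON, push it into a pipe, pull and execute on the worker side); the names `send`, `work`, and `square` are made up here and are not part of the vredis API:

```python
import json
import queue

# In-memory stand-in for the redis list that carries serialized calls.
task_pipe = queue.Queue()

def send(func_name, *args):
    # Sender side: wrap the call as JSON and push it into the pipe.
    task_pipe.put(json.dumps({"func": func_name, "args": args}))

def work(registry):
    # Worker side: pull one message from the pipe and execute it.
    msg = json.loads(task_pipe.get())
    return registry[msg["func"]](*msg["args"])

def square(x):
    return x * x

send("square", 7)
result = work({"square": square})
print(result)  # 49
```

In the real library the pipe is a redis connection shared by sender and workers, so `send` and `work` can run in different processes on different machines.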


# In Python:
import vredis
s = vredis.Worker.from_settings(host='xx.xx.xx.xx', port=6666, password='vredis')

# In a shell:
C:\Users\Administrator>vredis worker -ho xx.xx.xx.xx -po 6666 -pa vredis -db 0
# Any parameter left unset falls back to its default:
#   host     localhost
#   port     6379
#   password None
#   db       0


from vredis import pipe

pipe.DEBUG = True # True/False. When True, the worker prints output to the worker console.

# Very low code intrusion: execution works even without a decorator,
# completely barrier-free.
# A function sent through the pipe becomes a send function: calling it
# pushes the call into the task pipeline instead of running it locally.
def some(i):
    import time, random
    rd = random.randint(1, 2)
    print('use func:{}, rd time:{}'.format(i, rd))
    return 123
    # The return value is wrapped in JSON and stored in redis.

@pipe.table('mytable') # if no table is set, "default" is used as the tablename
def some2(i):
    print('use func2:{}'.format(i))
    return 333, 444
    # If the return value is a generator, list, or tuple,
    # each item is iterated out, wrapped in JSON, and stored
    # in the data-collection space named by the tablename
    # (the default tablename space is "default").

for i in range(100):
    some(i) # when the first task is sent, a taskid is assigned and logged as an info line


from vredis import pipe

# The second parameter of from_table is the tablename; the default tablename is "default".
for i in pipe.from_table(taskid=26):
    print(i)
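
Conceptually, results live in per-task tables keyed by taskid and tablename, and `from_table` iterates one of them. The sketch below mimics that storage with a plain dict; `store` and `read_table` are illustrative names, not vredis internals:

```python
import json
from collections import defaultdict

# In-memory stand-in for the per-task result tables (the real store is redis).
tables = defaultdict(list)  # keyed by (taskid, tablename)

def store(taskid, tablename, value):
    # Worker side: each result is wrapped in JSON under its task/table key.
    tables[(taskid, tablename)].append(json.dumps(value))

def read_table(taskid, tablename="default"):
    # Reader side: yield each stored result, decoded from JSON.
    for raw in tables[(taskid, tablename)]:
        yield json.loads(raw)

store(26, "default", 123)
store(26, "mytable", [333, 444])
print(list(read_table(26)))             # [123]
print(list(read_table(26, "mytable")))  # [[333, 444]]
```

Keeping results in separate (taskid, tablename) spaces is what lets concurrent tasks write at the same time without mixing their data.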
