Scrapy extension to control spiders using JSON-RPC
scrapy-jsonrpc is an extension to control a running Scrapy web crawler via JSON-RPC. The service provides access to the main Crawler object via the JSON-RPC 2.0 protocol.
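To use the extension it must be enabled in your project settings. A minimal sketch, assuming the extension lives at `scrapy_jsonrpc.webservice.WebService` (verify the module path against the version you have installed):

```python
# settings.py -- enable the JSON-RPC web service extension.
# The module path below is an assumption; check it against your
# installed scrapy-jsonrpc package.
EXTENSIONS = {
    "scrapy_jsonrpc.webservice.WebService": 500,  # 500 is an ordinary extension order
}
```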
The endpoint for accessing the crawler object is:
A command-line tool is provided to illustrate how to build a client. You can find it in example-client.py. It supports a few basic commands, such as listing the running spiders.
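A client is essentially a program that POSTs JSON-RPC 2.0 request bodies to the service and reads back the result. The sketch below shows the general shape; the method name in the usage comment is hypothetical, and the URL must be adjusted to your configured host and port:

```python
import json
from urllib.request import Request, urlopen


def build_jsonrpc_request(method, params=None, request_id=1):
    """Serialize a JSON-RPC 2.0 request body to bytes."""
    payload = {
        "jsonrpc": "2.0",
        "method": method,
        "params": params or [],
        "id": request_id,
    }
    return json.dumps(payload).encode("utf-8")


def jsonrpc_call(url, method, params=None):
    """POST a JSON-RPC 2.0 request and return the parsed result."""
    req = Request(
        url,
        data=build_jsonrpc_request(method, params),
        headers={"Content-Type": "application/json"},
    )
    with urlopen(req) as resp:
        reply = json.loads(resp.read().decode("utf-8"))
    if reply.get("error") is not None:
        # JSON-RPC 2.0 errors come back in the "error" member
        raise RuntimeError(reply["error"])
    return reply["result"]


# Hypothetical usage -- the endpoint path and method name are assumptions:
# spiders = jsonrpc_call("http://localhost:6080/crawler", "list_spiders")
```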
These are the settings that control the web service behaviour:
A boolean that specifies whether the web service will be enabled (provided its extension is also enabled).
A file to use for logging HTTP requests made to the web service. If unset, the log is sent to the standard Scrapy log.
Default: [6080, 7030]
The port range to use for the web service. If set to None or 0, a dynamically assigned port is used.
The interface the web service should listen on.
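Putting the settings above together, a project might configure the service as follows. The `JSONRPC_*` setting names are assumptions (they mirror the options described above; confirm them against the package documentation for your installed version):

```python
# settings.py -- a sketch of the web-service settings described above.
# The JSONRPC_* names are assumed, not confirmed from this document.
JSONRPC_ENABLED = True        # turn the web service on
JSONRPC_LOGFILE = None        # None -> HTTP requests go to the standard Scrapy log
JSONRPC_PORT = [6080, 7030]   # port range; None or 0 uses a dynamically assigned port
JSONRPC_HOST = "127.0.0.1"    # interface the web service listens on
```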
| Filename, size | File type | Python version | Upload date | Hashes |
| --- | --- | --- | --- | --- |
| scrapy_jsonrpc-0.3.0-py2-none-any.whl (6.9 kB) | Wheel | 2.7 | | |
| scrapy-jsonrpc-0.3.0.tar.gz (5.2 kB) | Source | None | | |