Scrapy extension to control spiders using JSON-RPC
scrapy-jsonrpc is an extension to control a running Scrapy web crawler via JSON-RPC. The service provides access to the main Crawler object via the JSON-RPC 2.0 protocol.
The crawler object is exposed at a JSON-RPC endpoint served on the host and port configured by the settings below.
A command-line tool is provided to illustrate how to build a client; you can find it in example-client.py. It supports a few basic commands, such as listing the running spiders.
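As a rough sketch of what such a client does under the hood, the snippet below builds and sends a JSON-RPC 2.0 request over HTTP. The endpoint URL and method name in the usage comment are placeholders, not taken from the extension's documentation; only the JSON-RPC 2.0 envelope itself is standard.

```python
import json
import urllib.request


def build_request(method, params=None, req_id=1):
    """Build a JSON-RPC 2.0 request object (per the JSON-RPC 2.0 spec)."""
    return {
        "jsonrpc": "2.0",
        "method": method,
        "params": params if params is not None else [],
        "id": req_id,
    }


def jsonrpc_call(url, method, params=None, req_id=1):
    """POST a JSON-RPC 2.0 request to `url` and return the parsed result."""
    data = json.dumps(build_request(method, params, req_id)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.loads(resp.read().decode("utf-8"))
    # A JSON-RPC 2.0 response carries either a "result" or an "error" member.
    if "error" in reply:
        raise RuntimeError(reply["error"])
    return reply["result"]


# Hypothetical usage against a running service (URL and method are placeholders):
# spiders = jsonrpc_call("http://localhost:6080/...", "list_running_spiders")
```

The example-client.py shipped with the package covers the same ground with a small command-line wrapper around calls like this.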
These are the settings that control the web service behaviour:
- A boolean which specifies whether the web service will be enabled (provided its extension is also enabled).
- A file to use for logging HTTP requests made to the web service. If unset, the log is sent to the standard Scrapy log.
- The port range to use for the web service. If set to None or 0, a dynamically assigned port is used. Default: [6080, 7030]
- The interface the web service should listen on.
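Putting those settings together, enabling the service from a project's settings.py might look like the sketch below. The setting names and the extension path are assumptions based on common Scrapy conventions (the original names are not shown above), so verify them against the extension's own documentation.

```python
# settings.py -- setting names and extension path are assumptions, verify
# against the extension's documentation before use.

# Enable the extension itself (dotted path is an assumption).
EXTENSIONS = {
    "scrapy_jsonrpc.webservice.WebService": 500,
}

# Turn the web service on (the boolean setting described above).
JSONRPC_ENABLED = True

# File for logging HTTP requests; unset/None means the standard Scrapy log.
JSONRPC_LOGFILE = None

# Port range for the web service; None or 0 means a dynamically assigned port.
JSONRPC_PORT = [6080, 7030]

# Interface the web service should listen on.
JSONRPC_HOST = "127.0.0.1"
```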
| File Name | Size | Python Version | File Type | Upload Date |
| --- | --- | --- | --- | --- |
| scrapy_jsonrpc-0.3.0-py2-none-any.whl | 6.9 kB | 2.7 | Wheel | Apr 13, 2015 |
| scrapy-jsonrpc-0.3.0.tar.gz | 5.2 kB | – | Source | Apr 13, 2015 |