Crawlab SDK for Python
The SDK for Python contains two parts:
- CLI Tool
- Utility Tools
CLI Tool
The CLI Tool is mainly designed for those who are more comfortable using command line tools to interact with Crawlab.
The installation of the CLI Tool is simple:
pip install crawlab-sdk
Then, you can use the crawlab command in your terminal to interact with Crawlab.
Check the built-in help below, or refer to the official documentation (in Chinese).
crawlab --help
Utility Tools
The utility tools provide helper methods that make it easier to integrate your spiders with Crawlab, e.g. saving results.
Below are the integration methods for Scrapy and for general Python spiders.
⚠️ Note: make sure you have already installed crawlab-sdk using pip.
Scrapy Integration
In settings.py of your Scrapy project, find the ITEM_PIPELINES variable (a dict) and add the entry below.
ITEM_PIPELINES = {
'crawlab.pipelines.CrawlabMongoPipeline': 888,
}
Then, start the Scrapy spider. After it finishes, you should be able to see the scraped results on the Task Detail -> Result page.
General Python Spider Integration
Add the content below to your spider files to save results.
# import result saving method
from crawlab import save_item
# this is a result record, must be dict type
result = {'name': 'crawlab'}
# call result saving method
save_item(result)
Then, start the spider. After it finishes, you should be able to see the scraped results on the Task Detail -> Result page.
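To make the pattern concrete, here is a slightly fuller sketch of a general Python spider. The fetch/parse step is a placeholder (represented by a plain list of titles); save_item is the only crawlab SDK call, and the ImportError fallback simply lets the sketch run outside a Crawlab task.

```python
try:
    from crawlab import save_item
except ImportError:
    # fallback so the sketch also runs outside a Crawlab task:
    # print the record instead of saving it
    def save_item(item):
        print(item)


def scrape_titles(documents):
    """Build one result dict per document and save it to Crawlab.

    `documents` stands in for data your spider would normally fetch
    and parse (e.g. with requests + a HTML parser).
    """
    results = []
    for i, title in enumerate(documents):
        result = {"index": i, "title": title}  # each record must be a dict
        save_item(result)
        results.append(result)
    return results


if __name__ == "__main__":
    scrape_titles(["Hello", "World"])
```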