
Crawlab SDK for Python

Chinese | English

The Crawlab SDK for Python consists of two parts:

  1. CLI Tool
  2. Utility Tools

CLI Tool

The CLI Tool is designed for users who prefer interacting with Crawlab from the command line.

Installing the CLI Tool is simple:

pip install crawlab-sdk

Then you can use the crawlab command in your terminal to interact with Crawlab.

Check the help output below, or refer to the official documentation (in Chinese).

crawlab --help
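
If you prefer to keep the SDK isolated from your system Python, here is a minimal sketch of installing it into a virtual environment and confirming that the crawlab CLI is on your PATH (the environment name venv is arbitrary, and the available sub-commands depend on your SDK version):

# create and activate a virtual environment
python3 -m venv venv
source venv/bin/activate

# install the SDK and list the CLI's sub-commands
pip install crawlab-sdk
crawlab --help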

Utility Tools

The utility tools provide helper methods that make it easier to integrate your spiders with Crawlab, e.g. saving results.

Below are the integration methods for Scrapy spiders and general Python spiders.

⚠️ Note: make sure you have already installed crawlab-sdk with pip.

Scrapy Integration

In settings.py of your Scrapy project, find the variable named ITEM_PIPELINES (a dict) and add the entry below. The value 888 is simply the pipeline's priority; pipelines with lower values run earlier.

ITEM_PIPELINES = {
    'crawlab.pipelines.CrawlabMongoPipeline': 888,
}

Then start the Scrapy spider. After it finishes, you should be able to see the scraped results in Task Detail -> Result.
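
For context, here is a minimal sketch of a Scrapy spider whose items would flow through this pipeline; the spider name, start URL, and CSS selectors are illustrative, and any spider that yields dict-like items works the same way:

import scrapy

class QuotesSpider(scrapy.Spider):
    name = 'quotes'
    start_urls = ['https://quotes.toscrape.com']

    def parse(self, response):
        for quote in response.css('div.quote'):
            # each yielded item passes through ITEM_PIPELINES,
            # where CrawlabMongoPipeline saves it as a task result
            yield {
                'text': quote.css('span.text::text').get(),
                'author': quote.css('small.author::text').get(),
            }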

General Python Spider Integration

Add the content below to your spider files to save results.

# import result saving method
from crawlab import save_item

# this is a result record, must be dict type
result = {'name': 'crawlab'}

# call result saving method
save_item(result)

Then start the spider. After it finishes, you should be able to see the scraped results in Task Detail -> Result.
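
For a spider that scrapes many records, call save_item once per record as it is produced. Below is a minimal sketch; fetch_records is a hypothetical placeholder for your own scraping logic:

from crawlab import save_item

def fetch_records():
    # hypothetical placeholder for your own scraping logic;
    # each record must be a dict
    yield {'name': 'crawlab'}
    yield {'name': 'crawlab-sdk'}

for record in fetch_records():
    # each call saves one result record for the current task
    save_item(record)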
