
Python SDK for Crawlab

Project description

Crawlab SDK for Python

Chinese | English

The SDK for Python contains two parts:

  1. CLI Tool
  2. Utility Tools

CLI Tool

The CLI Tool is designed for users who prefer to interact with Crawlab from the command line.

The installation of the CLI Tool is simple:

pip install crawlab-sdk

Then you can use the crawlab command in your terminal to interact with Crawlab.

Check the help output below, or refer to the official documentation (in Chinese).

crawlab --help

Utility Tools

The utility tools provide helper methods that make it easier to integrate your spiders with Crawlab, e.g. saving results.

Below are the methods for integrating Scrapy and general Python spiders with Crawlab.

⚠️ Note: make sure you have already installed crawlab-sdk using pip.

Scrapy Integration

In settings.py of your Scrapy project, find the variable named ITEM_PIPELINES (a dict) and add the entry below.

ITEM_PIPELINES = {
    'crawlab.pipelines.CrawlabMongoPipeline': 888,
}
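
With this pipeline registered, the spider itself needs no Crawlab-specific code: every item it yields passes through ITEM_PIPELINES and is saved by CrawlabMongoPipeline. A minimal sketch for illustration (the spider name, start URL, and CSS selectors are placeholders, not part of the SDK):

# example_spider.py - minimal Scrapy spider sketch; name, URL and selectors are placeholders
import scrapy

class ExampleSpider(scrapy.Spider):
    name = 'example'
    start_urls = ['https://quotes.toscrape.com']

    def parse(self, response):
        # each yielded dict goes through ITEM_PIPELINES,
        # so CrawlabMongoPipeline saves it as a result record
        for quote in response.css('div.quote'):
            yield {
                'text': quote.css('span.text::text').get(),
                'author': quote.css('small.author::text').get(),
            }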

Then, start the Scrapy spider. After it finishes, you should be able to see the scraped results in Task Detail -> Result.

General Python Spider Integration

Add the content below to your spider files to save results.

# import result saving method
from crawlab import save_item

# this is a result record, must be dict type
result = {'name': 'crawlab'}

# call result saving method
save_item(result)

Then, start the spider. After it finishes, you should be able to see the scraped results in Task Detail -> Result.
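
For context, here is a minimal sketch of a complete spider script built around save_item. Only save_item comes from the SDK; requests, the URL, and the parsing logic are assumptions for illustration:

# minimal sketch of a general Python spider; only save_item is part of the SDK
import requests

from crawlab import save_item


def crawl():
    # hypothetical JSON endpoint - replace with your own target and parsing
    resp = requests.get('https://quotes.toscrape.com/api/quotes?page=1')
    for quote in resp.json()['quotes']:
        # each result record must be a dict
        save_item({
            'text': quote['text'],
            'author': quote['author']['name'],
        })


if __name__ == '__main__':
    crawl()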

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

crawlab-sdk-0.6.b20211213-2123.tar.gz (19.2 kB)

Uploaded Source

Built Distribution

crawlab_sdk-0.6b20211213.post2123-py3-none-any.whl (43.7 kB)

File details

Details for the file crawlab-sdk-0.6.b20211213-2123.tar.gz.

File metadata

  • Download URL: crawlab-sdk-0.6.b20211213-2123.tar.gz
  • Upload date:
  • Size: 19.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.7.1 importlib_metadata/4.8.2 pkginfo/1.8.2 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.10.0

File hashes

Hashes for crawlab-sdk-0.6.b20211213-2123.tar.gz
  • SHA256: 5d757f3f7e4ed6b546614784042898306a4057b2b660af7f5119a6b3688fc4eb
  • MD5: ae12e001118b1e8f82961c49e201cb3e
  • BLAKE2b-256: 267f97def0dfdf283ba428d627a03ba1b52eecf3c364f5d699dc696490ee4642
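
To check a downloaded file against a published digest, one option is Python's standard hashlib module. A minimal sketch using the SHA256 value above (the local file path is a placeholder):

# verify a downloaded archive against a published SHA256 digest
import hashlib

expected = '5d757f3f7e4ed6b546614784042898306a4057b2b660af7f5119a6b3688fc4eb'

# path is a placeholder - point it at the file you actually downloaded
with open('crawlab-sdk-0.6.b20211213-2123.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print('OK' if digest == expected else 'MISMATCH')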


File details

Details for the file crawlab_sdk-0.6b20211213.post2123-py3-none-any.whl.

File metadata

  • Download URL: crawlab_sdk-0.6b20211213.post2123-py3-none-any.whl
  • Upload date:
  • Size: 43.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.7.1 importlib_metadata/4.8.2 pkginfo/1.8.2 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.10.0

File hashes

Hashes for crawlab_sdk-0.6b20211213.post2123-py3-none-any.whl
  • SHA256: a1cef5cefd73554068bb2faec1e2566c33d641b4ab07d7c236bbc68db325fd31
  • MD5: aa72c977f2e22aeea415789ef138549d
  • BLAKE2b-256: 5d541c7aa79dadc7d154eff5ae9337ab9452286c989571c6a973a70013076051

