Crawler integration with INSPIRE-HEP.

Project description

Crawler integration with INSPIRE-HEP using scrapy project HEPCrawl.

This module allows scheduling crawler jobs on a Scrapyd instance serving a Scrapy project; in this case the default Scrapy project is HEPCrawl.
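The scheduling step can be sketched against Scrapyd's documented schedule.json endpoint. The spider name and argument below are illustrative assumptions, not values taken from this project:

```python
# Minimal sketch: scheduling a HEPCrawl spider on a Scrapyd instance.
# The endpoint and form parameters follow Scrapyd's schedule.json API;
# the spider name ("arXiv") and its argument are hypothetical examples.
from urllib.parse import urlencode

SCRAPYD_URL = "http://localhost:6800"  # assumed default Scrapyd address


def build_schedule_request(project, spider, **spider_args):
    """Return the URL and urlencoded form body for Scrapyd's schedule.json."""
    payload = {"project": project, "spider": spider}
    payload.update(spider_args)  # extra kwargs become spider arguments
    return f"{SCRAPYD_URL}/schedule.json", urlencode(payload)


url, body = build_schedule_request("hepcrawl", "arXiv", sets="physics:hep-th")
# POSTing `body` to `url` (e.g. with requests.post) would enqueue the job.
```

In practice the module sends such requests for you; the sketch only shows what a Scrapyd job submission looks like on the wire.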

It integrates directly with the invenio-workflows module to create a workflow for every record harvested by the crawler.
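The per-record integration can be sketched as a loop that hands each harvested record to a workflow starter. All names here (the workflow name and the stand-in starter function) are hypothetical; the real module delegates to invenio-workflows:

```python
# Illustrative sketch only: one workflow is started per harvested record.
# `start_workflow` stands in for the invenio-workflows entry point; the
# workflow name "article_workflow" is an assumption for demonstration.
def start_workflows_for_results(records, start_workflow):
    """Start one workflow per harvested record; return the workflow ids."""
    ids = []
    for record in records:
        ids.append(start_workflow("article_workflow", record))
    return ids


# Stub standing in for the real workflow engine, for demonstration only.
def fake_start(name, data):
    return f"{name}:{data['control_number']}"


ids = start_workflows_for_results(
    [{"control_number": 1}, {"control_number": 2}], fake_start
)
# → ['article_workflow:1', 'article_workflow:2']
```

Injecting the starter function keeps the sketch self-contained while mirroring the one-workflow-per-record behaviour described above.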

This module is meant to be used only with the INSPIRE-HEP overlay. Use at your own risk.

Full documentation is hosted here:

See also documentation of HEPCrawl:

Project details

Download files


Filename: inspire-crawler-3.0.3.tar.gz
Size: 35.4 kB
File type: Source
Python version: None
