Scrapy extension for database ingestion with job/spider tracking

Scrapy Item Ingest

A tiny, straightforward add-on for Scrapy that saves your items, requests, and logs to PostgreSQL. No boilerplate, no ceremony.

Install

pip install scrapy-item-ingest

Minimal setup (settings.py)

ITEM_PIPELINES = {
    'scrapy_item_ingest.DbInsertPipeline': 300,
}

EXTENSIONS = {
    'scrapy_item_ingest.LoggingExtension': 500,
}

# Pick ONE of the two database config styles:
DB_URL = "postgresql://user:password@localhost:5432/database"
# Or use discrete fields (avoids URL encoding):
# DB_HOST = "localhost"
# DB_PORT = 5432
# DB_USER = "user"
# DB_PASSWORD = "password"
# DB_NAME = "database"

# Optional
CREATE_TABLES = True     # auto-create tables on first run (default True)
JOB_ID = 1               # or omit; spider name will be used

Run your spider:

scrapy crawl your_spider
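
No spider changes are needed; any yielded item is ingested. A minimal illustrative spider (the site and field names here are hypothetical, not part of the package):

import scrapy

class QuotesSpider(scrapy.Spider):
    name = "quotes"
    start_urls = ["https://quotes.toscrape.com"]

    def parse(self, response):
        for quote in response.css("div.quote"):
            # Each yielded item flows through DbInsertPipeline
            # and is inserted into PostgreSQL.
            yield {
                "text": quote.css("span.text::text").get(),
                "author": quote.css("small.author::text").get(),
            }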

Troubleshooting

  • Password has special characters like @ or $?
    • In a URL, percent-encode them: @ -> %40, $ -> %24.
    • Example: postgresql://user:PAK%40swat1%24@localhost:5432/db
    • Or use the discrete DB_* fields (no encoding needed).
    • To encode programmatically, see the snippet below.
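
If you prefer to build the URL programmatically, Python's standard library handles the encoding (a minimal sketch; the credentials are placeholders):

from urllib.parse import quote

password = "PAK@swat1$"                 # raw password with special characters
encoded = quote(password, safe="")      # -> "PAK%40swat1%24"
DB_URL = f"postgresql://user:{encoded}@localhost:5432/db"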

Useful settings (optional)

  • LOG_DB_LEVEL (default: DEBUG) — minimum level stored in DB
  • LOG_DB_CAPTURE_LEVEL — capture level for Scrapy loggers routed to DB (does not affect console)
  • LOG_DB_LOGGERS — allowed logger prefixes (the spider's name and 'scrapy' are always included)
  • LOG_DB_EXCLUDE_LOGGERS (default: ['scrapy.core.scraper'])
  • LOG_DB_EXCLUDE_PATTERNS (default: ['Scraped from <'])
  • CREATE_TABLES (default: True) — create job_items, job_requests, job_logs on startup
  • ITEMS_TABLE, REQUESTS_TABLE, LOGS_TABLE — override table names
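
All of these go in settings.py alongside the pipeline and extension entries. An illustrative override (the values shown are examples, not defaults):

LOG_DB_LEVEL = "INFO"                             # store INFO and above instead of DEBUG
LOG_DB_EXCLUDE_LOGGERS = ["scrapy.core.scraper"]  # drop noisy loggers
ITEMS_TABLE = "crawl_items"                       # custom table names
REQUESTS_TABLE = "crawl_requests"
LOGS_TABLE = "crawl_logs"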

License

MIT License. See LICENSE.
