
Scrapy middleware to add extra "magic" fields to items

Project Description

This is a Scrapy spider middleware to add extra fields to items, based on the configuration settings MAGIC_FIELDS and MAGIC_FIELDS_OVERRIDE.


Install scrapy-magicfields using pip:

$ pip install scrapy-magicfields


  1. Add MagicFieldsMiddleware by including it in SPIDER_MIDDLEWARES in your settings file:

        SPIDER_MIDDLEWARES = {
            'scrapy_magicfields.MagicFieldsMiddleware': 100,
        }

     Here, priority 100 is just an example. Set its value depending on the other middlewares you may already have enabled.

  2. Enable the middleware by setting MAGIC_FIELDS (and, optionally, MAGIC_FIELDS_OVERRIDE) in your settings file.


Both settings MAGIC_FIELDS and MAGIC_FIELDS_OVERRIDE are dicts:

  • the keys are the destination field names,
  • their values are strings which accept magic variables, identified by a leading $ (dollar sign), which will be substituted with a corresponding value at runtime.

Some magic variables also accept arguments, which are specified after the magic name using a : (colon) as separator.

You can set project-global magics with MAGIC_FIELDS, and tune them for a specific spider using MAGIC_FIELDS_OVERRIDE.
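As a rough sketch of this layering (the setting names come from this document; the merge behaviour shown here with a plain dict update is an assumption, not the middleware's documented implementation):

```python
# Project-wide magic fields (settings.py)
MAGIC_FIELDS = {
    "timestamp": "item scraped at $time",
    "spider": "$spider:name",
}

# Hypothetical per-spider tweak
MAGIC_FIELDS_OVERRIDE = {
    "timestamp": "$unixtime",
}

# The effective configuration is presumably the project-wide dict
# with the per-spider entries layered on top:
effective = {**MAGIC_FIELDS, **MAGIC_FIELDS_OVERRIDE}
```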

If there is more than one argument, they must be separated by a , (comma). So the generic magic format is:

$<magic name>[:arg1,arg2,...]
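A hypothetical parse of this format (an illustration only, not the library's actual parser) might look like:

```python
def parse_magic(expr):
    """Split a magic expression like '$env:SCRAPY_JOB' into (name, args)."""
    body = expr.lstrip("$")          # drop the leading dollar sign
    if ":" in body:
        name, argstr = body.split(":", 1)
        return name, argstr.split(",")   # comma-separated argument list
    return body, []                  # no arguments given
```

For example, parse_magic("$time") yields ("time", []) and parse_magic("$env:SCRAPY_JOB") yields ("env", ["SCRAPY_JOB"]).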

Supported magic variables

  • $time — the UTC timestamp at which the item was scraped, in the format '%Y-%m-%d %H:%M:%S'.
  • $unixtime — the unixtime (number of seconds since the Epoch, i.e. time.time()) at which the item was scraped.
  • $isotime — the UTC timestamp at which the item was scraped, in the format '%Y-%m-%dT%H:%M:%S'.
  • $spider — must be followed by an argument, which is the name of an attribute of the spider (like an argument passed to it).
  • $env — the value of an environment variable. It accepts the name of the variable as its argument.
  • $jobid — the job id (a shortcut for $env:SCRAPY_JOB).
  • $jobtime — the UTC timestamp at which the job started, in the format '%Y-%m-%d %H:%M:%S'.
  • $response — access to some response properties:
      • $response:url — the url from which the item was extracted,
      • $response:status — the response HTTP status,
      • $response:headers — the response HTTP headers.
  • $setting — the value of the given Scrapy setting. It accepts one argument: the name of the setting.
  • $field — copies the value of one field to another. Its argument is the source field. Effects are unpredictable if you use as source a field that is itself filled by a magic field.
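For instance, the environment-variable magic resolves against the process environment; a standalone sketch of the equivalent lookup (the variable name here is hypothetical, and set in the sketch itself so it is self-contained):

```python
import os

# Hypothetical environment variable, set here for the sketch
os.environ["MY_SPIDER_TAG"] = "nightly-run"

# "$env:MY_SPIDER_TAG" in MAGIC_FIELDS would presumably resolve to:
value = os.environ.get("MY_SPIDER_TAG")  # "nightly-run"
```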


The following configuration will add two fields to each scraped item:

  • 'timestamp', which will be filled with the string 'item scraped at <scraped timestamp>',
  • and 'spider', which will contain the spider name:

        MAGIC_FIELDS = {
            "timestamp": "item scraped at $time",
            "spider": "$spider:name",
        }

The following configuration will copy the item's url field into the field sku:

    MAGIC_FIELDS = {
        "sku": "$field:url",
    }

Magics also accept a regular expression argument, which allows you to extract and assign only part of the value generated by the magic. You have to specify it using the r'' notation.

Let's pretend that the urls of your items contain an item number, and that you want to assign only that number to the sku field.

The following example, similar to the previous one but with a second regular expression argument, will do the task:

    MAGIC_FIELDS = {
        "sku": "$field:url,r'item_no=(\d+)'",
    }
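The regular expression itself behaves like a standard search-and-capture; a standalone sketch of the extraction, using a hypothetical URL:

```python
import re

# Hypothetical item URL containing an item number
url = "http://www.example.com/catalog/product?item_no=345"

match = re.search(r"item_no=(\d+)", url)
sku = match.group(1) if match else None  # "345"
```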
Download Files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

File Name                                               File Type  Python   Upload Date
scrapy_magicfields-1.1.0-py2.py3-none-any.whl (3.9 kB)  Wheel      py2.py3  Jun 30, 2016
scrapy-magicfields-1.1.0.tar.gz (3.9 kB)                Source              Jun 30, 2016
