
Export scraped items of different types to multiple feeds.

Project Description

This Scrapy extension exports scraped items of different types to multiple feeds. By default each item gets its own feed.


$ pip install scrapy-multifeedexporter


You’ll have to replace the default FeedExporter with MultiFeedExporter by adding the following lines to your spider's settings.py:

from multifeedexporter import MultiFeedExporter

EXTENSIONS = {
    # Disable the built-in FeedExporter and enable MultiFeedExporter
    'scrapy.contrib.feedexport.FeedExporter': None,
    'multifeedexporter.MultiFeedExporter': 500,
}

# Automatically configure available item names from your module
MULTIFEEDEXPORTER_ITEMS = MultiFeedExporter.get_bot_items(BOT_NAME)


When calling scrapy crawl, include the %(item_name)s placeholder in the output file/URI name so each item type gets its own feed. For example:

$ scrapy crawl -o "spider_name_%(item_name)s.csv" -t csv spider_name

If you omit the placeholder, all items will be placed in one file.
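The placeholder follows Python's %-style dict formatting, so its expansion can be sketched in plain Python (no Scrapy required); "BookItem" and "AuthorItem" below are hypothetical item class names:

```python
# A minimal sketch of how the %(item_name)s placeholder in the
# feed URI is filled in once per item type.
uri_template = "spider_name_%(item_name)s.csv"

for item_name in ("BookItem", "AuthorItem"):  # hypothetical item classes
    print(uri_template % {"item_name": item_name})
# spider_name_BookItem.csv
# spider_name_AuthorItem.csv
```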


scrapy-multifeedexporter is published under the MIT license.


