scrapy-pipeline-mongodb

Scrapy is a great framework for web crawling. This package provides two pipelines to save items into MongoDB, in an async or a sync way, and also provides a highly customizable way to interact with MongoDB.

Features

  • Save an item and get its Object ID from this pipeline
  • Update an item and get its Object ID from this pipeline


Requirements

  • TxMongo, an async MongoDB driver for Twisted
  • Python 2.7 is not supported
  • Tested on Python 3.5, but it should work on other versions higher than Python 3.3
  • Tested on Linux, but since it is a pure Python module it should work on other platforms with official Python and Twisted support, e.g. Windows, Mac OS X, BSD


Installation

The quick way:

pip install scrapy-pipeline-mongodb

Or put this pipeline module right beside the Scrapy project.


Usage

Enable the pipeline in the settings.py of the Scrapy project, for example:

from txmongo.filter import ASCENDING
from txmongo.filter import DESCENDING

ITEM_PIPELINES = {
    'scrapy_pipeline_mongodb.pipelines.mongodb_async.PipelineMongoDBAsync': 500,
}

MONGODB_HOST = 'localhost'
MONGODB_DATABASE = 'test_mongodb_async_db'
MONGODB_COLLECTION = 'test_mongodb_async_coll'


MONGODB_INDEXES = [('field_0', ASCENDING, {'unique': True}),
                   (('field_0', 'field_1'), ASCENDING, {}),
                   (('field_0', ASCENDING, {}), ('field_0', DESCENDING, {}))]

MONGODB_PROCESS_ITEM = 'scrapy_pipeline_mongodb.utils.process_item.process_item'
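The settings above only configure the pipeline; items come from ordinary spiders. As a minimal illustration, a hypothetical item whose fields match the MONGODB_INDEXES example above:

import scrapy

class ExampleItem(scrapy.Item):
    # Fields matching the indexes declared in MONGODB_INDEXES above.
    field_0 = scrapy.Field()
    field_1 = scrapy.Field()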

Settings Reference


MONGODB_USERNAME

A string of the username of the database.


MONGODB_PASSWORD

A string of the password of the database.


MONGODB_HOST

A string of the IP address or the domain name of the database server.


MONGODB_PORT

An int of the port of the database.


MONGODB_DATABASE

A string of the name of the database.


MONGODB_COLLECTION

A string of the name of the collection.


MONGODB_OPTIONS_

Options can be attached when the pipeline connects to MongoDB.

If an option is needed, prefix its name with MONGODB_OPTIONS_ in settings.py and the pipeline will parse it.

For example:

option name       name in settings.py
authMechanism     MONGODB_OPTIONS_authMechanism

For more options, please refer to the page:

Connection String URI Format — MongoDB Manual 3.4
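For instance, to pass authMechanism through to the connection, the setting would look like this in settings.py (the value shown is just an example):

# settings.py
# The MONGODB_OPTIONS_ prefix tells the pipeline to forward the rest
# of the name as a MongoDB connection option.
MONGODB_OPTIONS_authMechanism = 'SCRAM-SHA-1'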


MONGODB_INDEXES

A list of the indexes to create on the collection. The indexes defined in this setting will be created when the spider is opened.

If an index already exists, no warning or error will be raised.
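Each entry roughly corresponds to a txmongo create_index call when the spider opens. A sketch under the assumption that the pipeline maps entries onto txmongo's filter API (the exact code lives in the pipeline source):

from twisted.internet import defer
from txmongo import filter as qf

@defer.inlineCallbacks
def create_indexes(collection):
    # ('field_0', ASCENDING, {'unique': True}) becomes roughly:
    yield collection.create_index(qf.sort(qf.ASCENDING('field_0')),
                                  unique=True)
    # (('field_0', 'field_1'), ASCENDING, {}) becomes a compound index:
    yield collection.create_index(qf.sort(qf.ASCENDING('field_0') +
                                          qf.ASCENDING('field_1')))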


MONGODB_PROCESS_ITEM

To allow highly customized interaction with MongoDB, this pipeline provides a setting to define the function process_item. The package ships with one default function: it calls the method insert_one of the collection to save the item into MongoDB, then returns the item.

If a customized function is provided to replace the default one, note that its behavior should follow the requirements written in the Scrapy documentation:

Item Pipeline — Scrapy 1.4.0 documentation

Built-in Functions For Processing Item


process_item

This is a built-in function that calls the method insert_one of the collection and returns the item.

To use this function, set in settings.py:

MONGODB_PROCESS_ITEM = 'scrapy_pipeline_mongodb.utils.process_item.process_item'
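In spirit the default does no more than the description says. A sketch, not the verbatim source, assuming the function receives the item and the txmongo collection:

from twisted.internet import defer

@defer.inlineCallbacks
def process_item(item, spider, collection):
    # Save the item, then hand it to the next pipeline stage.
    yield collection.insert_one(dict(item))
    defer.returnValue(item)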


Different drivers may expose different APIs for the same operation. This pipeline adopts txmongo as its async MongoDB driver, so please read the relevant txmongo documentation to make sure customized functions run correctly in this pipeline.
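For example, a customized function that updates an existing document instead of inserting a new one might look like the sketch below (same assumed signature as above; field_0 is a hypothetical unique key, and replace_one is txmongo's pymongo-style API):

from twisted.internet import defer

@defer.inlineCallbacks
def process_item(item, spider, collection):
    # Upsert: replace the stored document matching 'field_0',
    # or insert it if it does not exist yet.
    yield collection.replace_one({'field_0': item['field_0']},
                                 dict(item), upsert=True)
    defer.returnValue(item)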


TODO

  • Add a unit test for the index creation function
  • Add a sync pipeline
