
Scrapy-based Web Crawler with a UI

Project Description


Arachnado is a tool to crawl a specific website. It provides a Tornado-based HTTP API and a web UI for a Scrapy-based crawler.

License is MIT.


Arachnado requires Python 2.7. To install Arachnado use pip:

pip install arachnado

To install Arachnado with MongoDB support use this command:

pip install arachnado[mongo]
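
With the MongoDB extension enabled, crawled items are stored in MongoDB. A quick way to inspect them from Python is sketched below; this is only an illustration, and the connection URL, database name, collection name and the url field are assumptions - check your Arachnado/MongoDB configuration for the actual values:

from pymongo import MongoClient

# Connection URL, database and collection names are assumptions;
# adjust them to match your Arachnado/MongoDB setup.
client = MongoClient("mongodb://localhost:27017")
items = client["arachnado"]["items"]

for doc in items.find().limit(5):
    print(doc.get("url"))  # field names depend on your spiders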


To start Arachnado, run the arachnado command:

arachnado

and then open the web UI in your browser at the configured URL.
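
Once the server is running, you can do a quick smoke test from Python 2.7. This is a minimal sketch: the host and port below (127.0.0.1:8888) are assumptions, so use whatever URL your configuration specifies:

import urllib2  # Python 2.7, matching Arachnado's requirement

# The URL below is an assumption (adjust host/port to your config).
response = urllib2.urlopen("http://127.0.0.1:8888/")
print(response.getcode())  # expect 200 when the web UI is reachable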

To see available command-line options use

arachnado --help

Arachnado can be configured using a config file. Put it in one of the common locations (/etc/arachnado.conf, ~/.config/arachnado.conf or ~/.arachnado.conf) or pass the file name as an argument when starting the server:

arachnado --config ./my-config.conf
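
For illustration, a config file might look like the following. This is a minimal sketch: the [arachnado] section and the host/port option names are assumptions, so consult the list of available options mentioned below for the authoritative names:

[arachnado]
host = 0.0.0.0
port = 8888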

For the full list of available options, see the default config file bundled with Arachnado.


To build Arachnado static assets, Node.js and npm are required. Install all JavaScript requirements using npm by running the following command from the repo root:

npm install

then rebuild static files (we use Webpack):

npm run build

or auto-build static files on each change during development:

npm run watch


0.2 (2015-08-07)

Initial release.
