A robots.txt parser alternative to Python's robotparser module

Project description

Robotexclusionrulesparser is an alternative to the Python standard library module robotparser. It fetches and parses robots.txt files and can answer whether a given user agent is permitted to visit a given URL.
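
Basic usage looks roughly like the sketch below. The hostname, user-agent string, and URL are placeholders, and the fetch(), parse(), and is_allowed() names follow the project's documentation; consult ReadMe.html for the authoritative API.

    import robotexclusionrulesparser

    # Create a parser and fetch a site's robots.txt over HTTP.
    parser = robotexclusionrulesparser.RobotExclusionRulesParser()
    parser.fetch("http://example.com/robots.txt")

    # Alternatively, feed it robots.txt content you already have:
    # parser.parse("User-agent: *\nDisallow: /private/\n")

    # Ask whether a specific user agent may visit a specific URL.
    if parser.is_allowed("MyCrawler", "http://example.com/private/page.html"):
        print("OK to crawl")
    else:
        print("Disallowed by robots.txt")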

This module offers several features that the standard library module robotparser does not: it can decode non-ASCII robots.txt files, it respects Expires headers, and it understands the Crawl-delay and Sitemap directives as well as wildcard syntax in path names, as sketched below.
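
The sketch below exercises those extra directives against an inline robots.txt. The get_crawl_delay() method and sitemaps attribute are the names used in the project's documentation, and the rules themselves are made up for illustration.

    import robotexclusionrulesparser

    parser = robotexclusionrulesparser.RobotExclusionRulesParser()
    parser.parse(
        "User-agent: *\n"
        "Crawl-delay: 5\n"
        "Disallow: /tmp/*.html\n"          # wildcard in a path name
        "Sitemap: http://example.com/sitemap.xml\n"
    )

    # Crawl-delay for a given user agent (None if the directive is absent).
    print(parser.get_crawl_delay("MyCrawler"))

    # All Sitemap URLs declared in the file.
    print(parser.sitemaps)

    # The wildcard rule blocks any .html file under /tmp/.
    print(parser.is_allowed("MyCrawler", "http://example.com/tmp/page.html"))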

Complete documentation (including a comparison with the standard library module robotparser) is available in ReadMe.html.

Robotexclusionrulesparser is released under a BSD license.

Download files

Download the file for your platform.

Filename: robotexclusionrulesparser-1.7.1.tar.gz (31.5 kB)
File type: Source
Python version: None
Upload date: Aug 12, 2016
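
The package can also be installed from PyPI with pip (pip install robotexclusionrulesparser).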
