A robots.txt parser alternative to Python's robotparser module

Project description

Robotexclusionrulesparser is an alternative to the Python standard library module robotparser. It fetches and parses robots.txt files and answers whether a given user agent is permitted to visit a given URL.

This module offers several features that robotparser does not: it can decode non-ASCII robots.txt files, it respects Expires headers, and it understands the Crawl-delay and Sitemap directives as well as wildcard syntax in path names.
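A minimal sketch of typical use, assuming the class and method names documented in ReadMe.html (RobotExclusionRulesParser, fetch, is_allowed, get_crawl_delay, and the sitemaps attribute); the URL and user agent below are placeholders:

    import robotexclusionrulesparser

    rerp = robotexclusionrulesparser.RobotExclusionRulesParser()

    # Fetch and parse a site's robots.txt (placeholder URL).
    rerp.fetch("http://www.example.com/robots.txt")

    # Ask whether a given user agent may visit a given URL.
    if rerp.is_allowed("MyBot", "/some/path.html"):
        print("MyBot may fetch /some/path.html")

    # Crawl-delay and Sitemap directives are exposed as well.
    print(rerp.get_crawl_delay("MyBot"))
    print(rerp.sitemaps)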

Complete documentation (including a comparison with the standard library module robotparser) is available in ReadMe.html.

Robotexclusionrulesparser is released under a BSD license.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

robotexclusionrulesparser-1.7.1.tar.gz (31.5 kB)

File details

Hashes for robotexclusionrulesparser-1.7.1.tar.gz:

Algorithm     Hash digest
SHA256        d23aa14ae8145c13c95612d696736bad52a4bd0819ce8c9437ee745098fb8388
MD5           f11ccefc9ec9397db8fc8e62b79c93ef
BLAKE2b-256   399774634de03a0856160a8c2fa92f03cdf1827c3b1d3d42378d4b79119cd9fa

These hashes can be used to verify the integrity of a downloaded file.
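For example, a downloaded archive can be checked against the published SHA256 digest with Python's standard hashlib module (the filename assumes the archive was saved to the current directory):

    import hashlib

    expected = "d23aa14ae8145c13c95612d696736bad52a4bd0819ce8c9437ee745098fb8388"

    # Hash the downloaded archive in chunks to keep memory use low.
    sha256 = hashlib.sha256()
    with open("robotexclusionrulesparser-1.7.1.tar.gz", "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            sha256.update(chunk)

    assert sha256.hexdigest() == expected, "hash mismatch: file may be corrupt"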
