Brings automatic support for robots.txt files in requests.
Currently just a proof of concept, this module aims to extend requests with automatic robots.txt support.
How to use
Simply use RobotsAwareSession instead of the built-in requests.Session. If a resource is disallowed by the target site's robots.txt, a RobotsTxtDisallowed exception is raised.
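The kind of check a session like this performs before each request can be sketched with the standard library's urllib.robotparser. Note that check_allowed and the inline robots.txt below are illustrative assumptions for this sketch, not the module's actual internals; only the RobotsTxtDisallowed name comes from the description above.

```python
from urllib.robotparser import RobotFileParser


class RobotsTxtDisallowed(Exception):
    """Raised when robots.txt forbids fetching the requested URL."""


def check_allowed(robots_txt: str, user_agent: str, url: str) -> None:
    # Parse the robots.txt rules and raise if the URL is off-limits
    # for this user agent; a real session would fetch robots.txt from
    # the target host and cache it between requests.
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    if not parser.can_fetch(user_agent, url):
        raise RobotsTxtDisallowed(url)


robots = "User-agent: *\nDisallow: /private/\n"

# Allowed path: passes silently.
check_allowed(robots, "mybot", "http://example.com/public/page")

# Disallowed path: raises RobotsTxtDisallowed.
try:
    check_allowed(robots, "mybot", "http://example.com/private/secret")
except RobotsTxtDisallowed as exc:
    print("blocked:", exc)
```

A session wrapper would run this check inside its request method and only forward the call to requests when the URL is permitted.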
How do I run the tests?
The easiest way is to extract the source tarball and run:
$ python test/test_robotstxt.py
Change Log
0.1.0
Initial published version.