Brings automatic support for robots.txt files in requests.
Currently a proof of concept, this module strives to be an extension to requests that adds automatic robots.txt handling.
How to use
Simply use `RobotsAwareSession` instead of the built-in `requests.Session`. If a resource is disallowed by the site's robots.txt, a `RobotsTxtDisallowed` exception is raised.
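The underlying idea can be sketched with the standard library alone: parse a robots.txt file and refuse to fetch disallowed paths. This is a minimal illustration of the technique, not the module's actual implementation; the exception name mirrors the one above, and the robots.txt rules here are made up for the example.

```python
from urllib.robotparser import RobotFileParser

class RobotsTxtDisallowed(Exception):
    """Raised when robots.txt forbids fetching a URL."""

# Parse an example robots.txt (normally fetched from the target host).
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

def check_allowed(url, user_agent="*"):
    """Raise RobotsTxtDisallowed if robots.txt forbids this URL."""
    if not rp.can_fetch(user_agent, url):
        raise RobotsTxtDisallowed(url)

check_allowed("https://example.com/public/page")      # passes silently
# check_allowed("https://example.com/private/data")   # would raise RobotsTxtDisallowed
```

A session class like `RobotsAwareSession` presumably runs a check of this kind before each request, fetching and caching the robots.txt of each host it contacts.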
How do I run the tests?
The easiest way is to extract the source tarball and run:
$ python test/test_robotstxt.py
0.1.0 - initial published version
| Filename | Size | File type | Python version |
| --- | --- | --- | --- |
| requests-robotstxt-0.1.0.tar.gz | 3.9 kB | Source | None |