Nasy Crawler Framework -- Never had such a pure crawler.
Table of Contents
- Development Process
Never had such a pure crawler as this
Although I often write crawlers, I don't like using heavyweight frameworks such as Scrapy; I prefer
requests + bs4, or the more general requests_html. However, both are inconvenient for building a
crawler: things like error retrying and parallel crawling have to be written by hand. None of it is
hard to write, but writing it over and over is tedious. Hence I started nacf (Nasy Crawler
Framework), hoping to simplify the error-retrying and parallel parts of writing crawlers.
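To illustrate the boilerplate described above, here is a minimal sketch of hand-written retry and parallel crawling, the kind of code nacf aims to remove. The `retry` decorator, the simulated flaky `fetch`, and the URLs are all illustrative, not part of nacf's API:

```python
import time
from concurrent.futures import ThreadPoolExecutor


def retry(times=3, delay=0.1):
    """Retry a function up to `times` times before giving up."""
    def decorator(fn):
        def wrapper(*args, **kwargs):
            last = None
            for _ in range(times):
                try:
                    return fn(*args, **kwargs)
                except Exception as exc:
                    last = exc
                    time.sleep(delay)
            raise last
        return wrapper
    return decorator


@retry(times=3)
def fetch(url):
    # In a real crawler this would be requests.get(url); here we
    # simulate a flaky endpoint that fails on the first attempt.
    if not getattr(fetch, "_warmed", False):
        fetch._warmed = True
        raise ConnectionError("flaky network")
    return f"<html>{url}</html>"


urls = ["https://example.com/a", "https://example.com/b"]
# pool.map preserves input order, so results line up with `urls`.
with ThreadPoolExecutor(max_workers=4) as pool:
    pages = list(pool.map(fetch, urls))
```

Every crawler ends up re-growing some variant of these two pieces, which is exactly the repetition the framework is meant to absorb.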
| Dependency | Version | Description |
|---|---|---|
| requests-html | 0.9.0 | HTML Parsing for Humans. |
TODO: HTTP functions
TODO: Fix an error from inspect.Parameter which broke the parallel function.
- Ignored: An error caused by
- Help Wanted: Can someone help me with the Parameter issue?
- Commemorate Version: First version
- Basic functions.
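Since the exact inspect.Parameter error above is not described, here is only a hedged sketch of how such introspection is typically done: a framework reads a user callback's signature as `inspect.Parameter` objects to decide how to pass arguments when dispatching it in parallel. The `crawl` callback is hypothetical:

```python
import inspect


def crawl(url, *, timeout=10):
    # Hypothetical user callback a framework might dispatch in parallel.
    return url


# inspect.signature exposes each argument as an inspect.Parameter;
# the parameter `kind` tells the caller whether it may be passed
# positionally or only as a keyword.
sig = inspect.signature(crawl)
kinds = {name: p.kind.name for name, p in sig.parameters.items()}
```

Mismatches between these kinds and how a framework forwards arguments are a common source of dispatch errors.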
| Filename, size | File type | Python version |
|---|---|---|
| nacf-0.1.1-py3-none-any.whl (36.1 kB) | Wheel | py3 |
| nacf-0.1.1.tar.gz (13.4 kB) | Source | None |