Crawl all pages of a website to find and report broken links.
If you are seeing this on GitHub, it is a mirror of the GitLab repository: https://gitlab.com/alexbenfica/check-links/
# What does it do?
Allows you to check for broken links on all internal pages of a website, and more:

- export results to HTML
- export results to comma-separated values (CSV)
- export results to tab-separated values (TSV)
- can be used as a library, returning the results as values
It was created for some specific tasks:

- find broken URLs on pages (see the sketch after this list)
- as a desired side effect, load pages and trigger cache creation
- run on a daily basis, reporting broken links across a multi-site network
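For a sense of how such a checker works, here is a minimal sketch that crawls a site's internal pages and reports links answering with an error. It is an illustration of the technique only, not the actual implementation of `check_links.py`, and it assumes the third-party `requests` package:

```python
"""Sketch: crawl a site's internal pages and report broken links."""
import urllib.parse
from html.parser import HTMLParser

import requests


class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def check_site(start_url, timeout=10):
    session = requests.Session()  # reuses HTTP connections (keep-alive)
    site = urllib.parse.urlparse(start_url).netloc
    to_visit, seen, broken = [start_url], set(), []

    while to_visit:
        url = to_visit.pop()
        if url in seen:
            continue
        seen.add(url)
        try:
            response = session.get(url, timeout=timeout)
        except requests.RequestException as exc:
            broken.append((url, type(exc).__name__))
            continue
        if response.status_code >= 400:
            broken.append((url, response.status_code))
            continue
        # Only follow links found on internal pages that returned HTML;
        # external links are checked above but not crawled further.
        if urllib.parse.urlparse(url).netloc != site:
            continue
        if "html" not in response.headers.get("Content-Type", ""):
            continue
        parser = LinkExtractor()
        parser.feed(response.text)
        for link in parser.links:
            target = urllib.parse.urljoin(url, link)
            if urllib.parse.urlparse(target).scheme in ("http", "https"):
                to_visit.append(target)
    return broken


if __name__ == "__main__":
    for url, status in check_site("https://example.com/"):
        print(url, status)
```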
# How to use it?
`python3 check_links.py --help`
It is simple and fast, largely because it reuses HTTP socket connections instead of opening a new one for every URL.
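The speed claim comes down to HTTP keep-alive: with a persistent session, requests to the same host share one open TCP connection rather than paying a fresh handshake per URL. A small sketch using the `requests` package (an assumption for illustration; the tool's internals may differ):

```python
import requests

session = requests.Session()  # keep-alive: requests to the same host share a socket
for path in ("/", "/about", "/contact"):  # hypothetical paths, for illustration
    response = session.get("https://example.com" + path, timeout=10)
    print(path, response.status_code)
```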
# Release files

| Filename | Size | File type | Python version |
|----------|------|-----------|----------------|
| checklinks-0.4.4-py2-none-any.whl | 8.5 kB | Wheel | py2 |
| checklinks-0.4.4-py3-none-any.whl | 8.5 kB | Wheel | py3 |
| checklinks-0.4.4.tar.gz | 6.3 kB | Source | None |