Fetch, find and report broken links on all pages of a website.
If you are reading this on GitHub, note that this repository is a mirror of the GitLab project: https://gitlab.com/alexbenfica/check-links/. Issues and milestones are all there!
# What does it do?
Allows you to check for broken links in all internal pages of a website, and more:

- export results to comma-separated values (.csv)
- can also be imported as a module
It is used for some specific tasks:

- find broken URLs on pages
- as a desired side effect, load pages and force cache creation
- run on a daily basis, reporting broken links across a multi-site network
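The first task above, finding URLs on a page, can be sketched with the Python standard library alone. This is an illustrative sketch, not the tool's actual API: the names `LinkExtractor` and `extract_links` are hypothetical.

```python
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collect the href targets of <a> tags while parsing HTML."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html):
    """Return all link targets found in an HTML document, in order."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links
```

Each extracted link would then be resolved against the page URL (e.g. with `urllib.parse.urljoin`) and fetched to see whether it returns an error status.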
# How to use it?
You can download and run it via the command line: `python3 check_links.py --help`
You can install via pip3: `pip3 install -U checklinks`
It is simple and really fast, as it reuses HTTP socket connections!
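Connection reuse is the standard keep-alive optimization: checking many URLs on the same host over one persistent socket avoids a TCP (and TLS) handshake per request. A minimal sketch of the idea using `requests.Session`, which pools and reuses connections per host; the function name `check_urls` is illustrative, not part of this tool:

```python
import requests


def check_urls(urls):
    """Return {url: status_code or None}, reusing sockets via one Session."""
    results = {}
    with requests.Session() as session:  # keep-alive: connections are pooled
        for url in urls:
            try:
                # HEAD is cheaper than GET; some servers reject it,
                # so fall back to GET on an error status.
                resp = session.head(url, allow_redirects=True, timeout=10)
                if resp.status_code >= 400:
                    resp = session.get(url, timeout=10)
                results[url] = resp.status_code
            except requests.RequestException:
                results[url] = None  # DNS failure, refused connection, timeout
    return results
```

Any status of 400 or above (or `None` for a failed connection) would be reported as a broken link.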