ProxyEater[1.5.1]
A Python Proxy Scraper for gathering fresh proxies.
Install ProxyEater
To install ProxyEater, simply run the following pip command:
python -m pip install ProxyEater
Or you can clone the repository and run:
git clone https://github.com/MPCodeWriter21/ProxyEater
cd ProxyEater
python setup.py install
Usage
usage: ProxyEater [-h] [--source SOURCE] [--output OUTPUT]
                  [--file-format {text,json,csv}] [--format FORMAT]
                  [--proxy-type PROXY_TYPE] [--include-status]
                  [--threads THREADS] [--timeout TIMEOUT] [--url URL]
                  [--verbose] [--quiet] [--version] [--proxy PROXY]
                  [--useragent USERAGENT] [--include-geolocation]
                  [--no-check] [--source-format {text,json,csv}]
                  [--default-type {http,https,socks4,socks5}]
                  mode
positional arguments:
mode Modes: Scrape, Check
options:
-h, --help
show this help message and exit
--source SOURCE, -s SOURCE
The source of the proxies (default: C:\Users\Morteza\AppData\Local\Programs\Python\Python310\lib\site-packages\ProxyEater\sources.json).
--output OUTPUT, -o OUTPUT
The output file.
--file-format {text,json,csv}, -ff {text,json,csv}
The format of the output file (default: text).
--format FORMAT, -f FORMAT
The format for saving the proxies in a text file (default: "{scheme}://{ip}:{port}").
--proxy-type PROXY_TYPE, -type PROXY_TYPE
The type of the proxies (default: all).
--include-status, -is
Include the status of the proxies in the output file.
--threads THREADS, -t THREADS
The number of threads to use for scraping (default: 25).
--timeout TIMEOUT, -to TIMEOUT
The timeout of the requests (default: 15).
--url URL, -u URL
The URL to use for checking the proxies (default: http://icanhazip.com).
--verbose, -v
Enable verbose output (default: False).
--quiet, -q
Suppress non-essential output (default: False).
--version, -V
Show the version of the program and exit.
Scrape:
Scrape mode arguments
--proxy PROXY, -p PROXY
The proxy to use for scraping.
--useragent USERAGENT, -ua USERAGENT
The user agent of the requests (default: random).
--include-geolocation, -ig
Include the geolocation info of the proxies in the output file.
--no-check, -nc
Use this option to skip checking the proxies after scraping.
Check:
Check mode arguments
--source-format {text,json,csv}, -sf {text,json,csv}
The format of the source file (default: text).
--default-type {http,https,socks4,socks5}, -dt {http,https,socks4,socks5}
The default type of the proxies - use this if you are providing proxies without a scheme (default: http).
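Two of the options above lend themselves to a quick illustration: --format, whose default template is shown in the help text, and --default-type, which supplies a scheme for source entries that lack one. The sketch below is an assumption-laden illustration of what these options imply; the dictionary field names and the apply_default_type helper are hypothetical, not ProxyEater's actual internals.

```python
# Illustrative sketch only: field names and the helper below are
# assumptions, not ProxyEater's real implementation.

# The default --format template, as shown in the help text:
template = "{scheme}://{ip}:{port}"
proxy = {"scheme": "socks5", "ip": "127.0.0.1", "port": 1080}
print(template.format(**proxy))  # socks5://127.0.0.1:1080


def apply_default_type(line: str, default_type: str = "http") -> str:
    """What --default-type implies: prepend the configured default
    scheme to source lines that do not carry one."""
    line = line.strip()
    return line if "://" in line else f"{default_type}://{line}"


print(apply_default_type("1.2.3.4:8080"))           # http://1.2.3.4:8080
print(apply_default_type("socks4://5.6.7.8:1080"))  # socks4://5.6.7.8:1080
```

Any of the template fields can be dropped or reordered in a custom --format string, since it is an ordinary Python-style format template.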
About
Author: CodeWriter21 (Mehrad Pooryoussof)
GitHub: MPCodeWriter21
Telegram Channel: @CodeWriter21
Aparat Channel: CodeWriter21
License
Donate
In order to support this project, you can donate some crypto of your choice 8D
Or if you can't, give this project a star on GitHub :)
Source Distribution
ProxyEater-1.5.1.tar.gz (17.8 kB)
Built Distribution
ProxyEater-1.5.1-py3-none-any.whl (18.3 kB)
Hashes for ProxyEater-1.5.1-py3-none-any.whl
Algorithm   | Hash digest
------------|------------
SHA256      | ae6cf2bcbebffede54efcb225b452567c79cbe944473d4761c3e376da153fbda
MD5         | b7df95dfcdc2421df878c4dbdcf061dc
BLAKE2b-256 | 09983c627cb464a675737c56e25a51d4e1af5d14bd97f8d9e9e9cb26e49cb6aa