
ProxyEater[1.5.3]

A Python Proxy Scraper for gathering fresh proxies.

Install ProxyEater

To install ProxyEater, you can simply use pip:

python -m pip install ProxyEater

Or you can clone the repository and run:

git clone https://github.com/MPCodeWriter21/ProxyEater
cd ProxyEater
python setup.py install
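
If the installation succeeded, a ProxyEater console script should be available (the entry-point name is an assumption here, and it requires your Python scripts directory to be on PATH). You can verify the install with the version flag documented below:

ProxyEater --version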

Usage

usage: ProxyEater [-h] [--source SOURCE] [--output OUTPUT] [--file-format { text, json, csv }]
                  [--format FORMAT] [--proxy-type PROXY_TYPE] [--include-status] [--threads
                  THREADS] [--timeout TIMEOUT] [--url URL] [--verbose] [--quiet] [--version]
                  [--proxy PROXY] [--useragent USERAGENT] [--include-geolocation] [--no-check]
                  [--source-format { text, json, csv }] [--default-type { http, https, socks4,
                  socks5 }]
                  mode

positional arguments:
  mode              Modes: Scrape, Check

options:
  -h, --help
                        show this help message and exit
  --source SOURCE, -s SOURCE
                        The source of the proxies (default: %localappdata%\
                        Python\Python310\lib\site-packages\ProxyEater\sources.json).
  --output OUTPUT, -o OUTPUT
                        The output file.
  --file-format { text, json, csv }, -ff { text, json, csv }
                        The format of the output file (default: text).
  --format FORMAT, -f FORMAT
                        The format for saving the proxies in a text
                        file (default: "{scheme}://{ip}:{port}").
  --proxy-type PROXY_TYPE, -type PROXY_TYPE
                        The type of the proxies (default: all).
  --include-status, -is
                        Include the status of the proxies in the output file.
  --threads THREADS, -t THREADS
                        The number of threads to use for scraping (default: 25).
  --timeout TIMEOUT, -to TIMEOUT
                        The timeout of the requests (default: 15).
  --url URL, -u URL
                        The URL to use for checking the proxies (default: http://icanhazip.com).
  --verbose, -v
                        Enable verbose output (default: False).
  --quiet, -q
                        Enable quiet output (default: False).
  --version, -V
                        Show the version of the program.

Scrape:
  Scrape mode arguments

  --proxy PROXY, -p PROXY
                        The proxy to use for scraping.
  --useragent USERAGENT, -ua USERAGENT
                        The user agent for the requests (default: random).
  --include-geolocation, -ig
                        Include the geolocation info of the proxies in the output file.
  --no-check, -nc
                        Use this option to skip the checking of the proxies
                        after scraping.

Check:
  Check mode arguments

  --source-format { text, json, csv }, -sf { text, json, csv }
                        The format of the source file (default: text).
  --default-type { http, https, socks4, socks5 }, -dt { http, https, socks4, socks5 }
                        The default type of the proxies - use this if you are
                        providing proxies without a scheme (default: http).
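
Examples

The invocations below are a minimal sketch pieced together from the help text above; the ProxyEater entry-point name and the lowercase mode spellings (scrape, check) are assumptions, so adjust them if your installed version expects the Scrape/Check spellings shown in the help.

# Scrape proxies into a JSON file with 25 threads, including each proxy's status:
ProxyEater scrape --output proxies.json --file-format json --threads 25 --include-status

# Scrape to a text file, saving each proxy as "ip:port" instead of the default format:
ProxyEater scrape --output proxies.txt --format "{ip}:{port}"

# Check an existing plain-text list, treating scheme-less entries as http:
ProxyEater check --source proxies.txt --source-format text --default-type http --output alive.txt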

About

Author: CodeWriter21 (Mehrad Pooryoussof)

GitHub: MPCodeWriter21

Telegram Channel: @CodeWriter21

Aparat Channel: CodeWriter21

License

Apache-2.0

Donate

In order to support this project, you can donate some crypto of your choice. 8D

Donate Addresses

Or if you can't, give this project a star on GitHub :)

