
Project description

ProxyEater[1.5.3]


A Python Proxy Scraper for gathering fresh proxies.


Install ProxyEater

To install ProxyEater, you can simply use the pip install ProxyEater command:

python -m pip install ProxyEater

Or you can clone the repository and run:

git clone https://github.com/MPCodeWriter21/ProxyEater
cd ProxyEater
python setup.py install

Usage

usage: ProxyEater [-h] [--source SOURCE] [--output OUTPUT] [--file-format { text, json, csv }]
                  [--format FORMAT] [--proxy-type PROXY_TYPE] [--include-status] [--threads
                  THREADS] [--timeout TIMEOUT] [--url URL] [--verbose] [--quiet] [--version]
                  [--proxy PROXY] [--useragent USERAGENT] [--include-geolocation] [--no-check]
                  [--source-format { text, json, csv }] [--default-type { http, https, socks4,
                  socks5 }]
                  mode

positional arguments:
  mode              Modes: Scrape, Check

options:
  -h, --help
                        show this help message and exit
  --source SOURCE, -s SOURCE
                        The source of the proxies (default: %localappdata%\
                        Python\Python310\lib\site-packages\ProxyEater\sources.json).
  --output OUTPUT, -o OUTPUT
                        The output file.
  --file-format { text, json, csv }, -ff { text, json, csv }
                        The format of the output file (default: text).
  --format FORMAT, -f FORMAT
                        The format for saving the proxies in the text
                        file (default: "{scheme}://{ip}:{port}").
  --proxy-type PROXY_TYPE, -type PROXY_TYPE
                        The type of the proxies (default: all).
  --include-status, -is
                        Include the status of the proxies in the output file.
  --threads THREADS, -t THREADS
                        The number of threads to use for scraping (default: 25).
  --timeout TIMEOUT, -to TIMEOUT
                        The timeout of the requests (default: 15).
  --url URL, -u URL
                        The URL to use for checking the proxies (default: http://icanhazip.com).
  --verbose, -v
                        Enable verbose output (default: False).
  --quiet, -q
                        Enable quiet output (default: False).
  --version, -V
                        Show the version of the program.

Scrape:
  Scrape mode arguments

  --proxy PROXY, -p PROXY
                        The proxy to use for scraping.
  --useragent USERAGENT, -ua USERAGENT
                        The user agent of the requests (default: random).
  --include-geolocation, -ig
                        Include the geolocation info of the proxies in the output file.
  --no-check, -nc
                        Use this option to skip the checking of the proxies after scraping.

Check:
  Check mode arguments

  --source-format { text, json, csv }, -sf { text, json, csv }
                        The format of the source file (default: text).
  --default-type { http, https, socks4, socks5 }, -dt { http, https, socks4, socks5 }
                        The default type of the proxies - use this if you are providing proxies
                        without a scheme (default: http).
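The `--format` template and the `--default-type` fallback above imply a simple text representation for each proxy. The sketch below illustrates how the default `"{scheme}://{ip}:{port}"` template renders a record and how a schemeless line would pick up the default type; the helper names are hypothetical and are not part of ProxyEater's API.

```python
# Illustration of the default --format template and the --default-type
# fallback. render()/parse() are hypothetical helpers, not ProxyEater's API.
DEFAULT_FORMAT = "{scheme}://{ip}:{port}"

def render(proxy: dict, fmt: str = DEFAULT_FORMAT) -> str:
    """Render one proxy record the way the text output format would."""
    return fmt.format(**proxy)

def parse(line: str, default_type: str = "http") -> dict:
    """Parse a proxy line, applying default_type when no scheme is present."""
    if "://" in line:
        scheme, rest = line.split("://", 1)
    else:
        scheme, rest = default_type, line
    ip, port = rest.rsplit(":", 1)
    return {"scheme": scheme, "ip": ip, "port": int(port)}

print(render({"scheme": "socks5", "ip": "127.0.0.1", "port": 1080}))
# socks5://127.0.0.1:1080
print(parse("8.8.8.8:3128"))
# {'scheme': 'http', 'ip': '8.8.8.8', 'port': 3128}
```

Lines scraped without a scheme (e.g. `8.8.8.8:3128`) are the case `--default-type` exists for in Check mode.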

About

Author: CodeWriter21 (Mehrad Pooryoussof)

GitHub: MPCodeWriter21

Telegram Channel: @CodeWriter21

Aparat Channel: CodeWriter21

License

apache-2.0

Donate

In order to support this project, you can donate some crypto of your choice 8D

Donate Addresses

Or if you can't, give this project a star on GitHub :)


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ProxyEater-1.5.3.tar.gz (18.7 kB)

Uploaded Source

Built Distribution

ProxyEater-1.5.3-py3-none-any.whl (19.1 kB)

Uploaded Python 3

File details

Details for the file ProxyEater-1.5.3.tar.gz.

File metadata

  • Download URL: ProxyEater-1.5.3.tar.gz
  • Upload date:
  • Size: 18.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.17

File hashes

Hashes for ProxyEater-1.5.3.tar.gz

  • SHA256: 0be465dcbb9942ed1fe417610514fe5effbad50ef8458873874bc143eaf1f302
  • MD5: 42e695caeb59ba184bdc3b1400f7ad76
  • BLAKE2b-256: 1cb131038d5248cf1ef6a9e274408969a49899a6d699dab3da01dbd16986b88d

See the PyPI documentation for more details on using hashes.
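To confirm a download matches the published digests above, you can hash the file locally and compare. A minimal sketch using only the standard library (the sdist filename and expected digest are taken from the table above; the self-test at the end uses a throwaway file so it runs without the download):

```python
import hashlib
import os
import tempfile

def sha256_of(path: str, chunk_size: int = 8192) -> str:
    """Stream a file in chunks and return its hex SHA256 digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# After downloading the sdist, compare against the published digest:
#   sha256_of("ProxyEater-1.5.3.tar.gz") should equal
#   "0be465dcbb9942ed1fe417610514fe5effbad50ef8458873874bc143eaf1f302"

# Self-contained demonstration on a throwaway file:
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"example bytes")
    path = tmp.name
print(sha256_of(path) == hashlib.sha256(b"example bytes").hexdigest())  # True
os.remove(path)
```

For automated installs, pip's hash-checking mode (`pip install --require-hashes -r requirements.txt`) performs the same comparison for you.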

File details

Details for the file ProxyEater-1.5.3-py3-none-any.whl.

File metadata

  • Download URL: ProxyEater-1.5.3-py3-none-any.whl
  • Upload date:
  • Size: 19.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.17

File hashes

Hashes for ProxyEater-1.5.3-py3-none-any.whl

  • SHA256: f03a664404e8a933fd7950541308a7bd604faa2f0fb527bbeb6d2ce7492302d8
  • MD5: a7288d302b34978fe2da8923be912a79
  • BLAKE2b-256: 0e24a789b6934ccacac61ec38da396fa73f0390bbdef1f3c7f151510e6af9a4d

See the PyPI documentation for more details on using hashes.
