
Escape Graphinder

Project description


Graphinder is a tool that extracts all GraphQL endpoints from a given domain.



Run with Docker

docker pull escapetech/graphinder
docker run -it --rm escapetech/graphinder -d example.com

If you want to save your results.json file, you can use:

docker run -it --name graphinder escapetech/graphinder -d example.com
docker cp graphinder:/graphinder/results.json results.json
docker rm -f graphinder

Or if you want to pass a file containing domain names (one per line):

docker run -v /full/path/to/file.csv:/graphinder/file.csv -it --rm escapetech/graphinder --inplace -f /graphinder/file.csv
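The input file is a plain-text list of domains, one per line. For example, file.csv might contain (hypothetical domains):

example.com
api.example.org
shop.example.io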

Install using pip

pip install graphinder

# using a specific Python binary
python3 -m pip install graphinder

Run it with

graphinder ...

Usage

A scan consists of:

  • Running on a specific domain (-d, --domain) or a list of domains (-f, --input-file).
  • Searching all scripts loaded by the browser for GraphQL endpoints (-s, --script).
  • Brute forcing the directories of all discovered URLs (-b, --bruteforce).
  • Using precision mode (-p, --precision).

By default, bruteforce and script search are enabled.

graphinder -d example.com
graphinder -f domains.txt

Extra features

  • --no-bruteforce: Disable bruteforce
  • --no-script: Disable script search
  • -p --precision / --no-precision: Enable/disable precision mode (enabled by default; slower but more accurate)
  • -f --input-file <FILE_PATH>: Input domain names from file
  • -w --max-workers <int>: Maximum number of concurrent workers when scanning multiple domains.
  • -o --output-file <FILE_PATH>: Output the results to file
  • -v --verbose / --no-verbose: Enable/disable verbose mode
  • -r --reduce: The maximum number of subdomains to scan.
  • -wb --webhook_url: The Discord webhook URL to send the results to.

If you experience issues, irregularities, or networking bottlenecks, reduce the number of workers; conversely, the better your network, the more workers you can run.
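For example, to scan a list of domains with bruteforce disabled, a smaller pool of workers, and the results written to a file (the file names below are placeholders):

graphinder -f domains.txt --no-bruteforce -w 4 -o results.json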

Local installation

Clone the repository and run the installation script

git clone https://github.com/Escape-Technologies/graphinder.git
cd graphinder
./install-dev.sh

Run this command to enter the virtual environment

poetry shell

Profit!

graphinder -d example.com

How do you make sure this is a valid GraphQL endpoint?

Detection flow diagram (see the repository README).

Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

Please make sure to update tests as appropriate.
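If the test suite runs with pytest under Poetry (an assumption; check the repository's CI configuration for the exact command), you can run it from the development environment with:

poetry run pytest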

License

MIT



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

graphinder-1.11.6.tar.gz (16.5 kB)

Uploaded Source

Built Distribution

graphinder-1.11.6-py3-none-any.whl (21.3 kB)

Uploaded Python 3

File details

Details for the file graphinder-1.11.6.tar.gz.

File metadata

  • Download URL: graphinder-1.11.6.tar.gz
  • Upload date:
  • Size: 16.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.15

File hashes

Hashes for graphinder-1.11.6.tar.gz

  • SHA256: 9d5067cb49f66bc5bc7f20316ac53c7ced2c3c0f1da833d508a1a498398f273a
  • MD5: 9a430007c89d6ce7b0c7ad0f16d3addb
  • BLAKE2b-256: 286586b74f5e2d2bf0b1816ebc9ce766cdad11bd0f27b076a4c416dfd7d39f0c

See more details on using hashes here.
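As a quick sanity check, you can compare the published SHA256 digest against a locally downloaded archive, for example with sha256sum (or shasum -a 256 on macOS):

sha256sum graphinder-1.11.6.tar.gz
# expected: 9d5067cb49f66bc5bc7f20316ac53c7ced2c3c0f1da833d508a1a498398f273a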


File details

Details for the file graphinder-1.11.6-py3-none-any.whl.

File metadata

  • Download URL: graphinder-1.11.6-py3-none-any.whl
  • Upload date:
  • Size: 21.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.15

File hashes

Hashes for graphinder-1.11.6-py3-none-any.whl

  • SHA256: 6b2f488e909abc7728f9e6502e81606d35afb108712b95c9f752186a781590ae
  • MD5: 09b597c7a7b0c05935bd30835deef4c8
  • BLAKE2b-256: 45883b39135b7559ccb834c2689ea501788103005b9f45ffacda2bc50af4e2be

See more details on using hashes here.

