
CTFd-Crawler

Overview

CTFd-Crawler is a tool for efficiently managing and downloading CTF challenge files. It organizes downloads into subdirectories by challenge category, uses multi-threading to speed up downloads, logs all activity, and stores challenge metadata in JSON format.

Features

  1. Downloads
    1. Downloaded files are organized into subdirectories based on challenge categories.
    2. Multi-threading downloads multiple files simultaneously, improving overall download speed.
    3. A detailed log of the download process is kept, including any errors and warnings.
    4. Detailed download progress is shown.
  2. Each challenge's description is saved to description.txt in that challenge's directory.
  3. Directory rules:
    1. If a challenge's contents are in a directory, another directory is created for them (with numbering added).
    2. If the contents are not in a directory, the files are saved directly in the challenge directory.
    3. Spaces in file and folder names are replaced with underscores.
  4. There are two options for loading information about a CTF:
    1. Load from a file (set the directory when using the crawler; if not, it defaults to the current directory).
    2. Load from user input (not recommended; the input is automatically saved to a file).
  5. All information about the CTF (name, token, URL, download location) is saved to a file in JSON format for convenient access (see the example below).
  6. All crawled challenges are also dumped into that file.
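
The exact schema of the JSON file is defined by the sample folder in the repository; as a rough sketch only, assuming field names that mirror the self_load arguments shown under Usage, such a file could be produced like this:

import json

# Hypothetical ctf.json contents. The field names here are assumptions;
# refer to the sample folder in the repository for the real schema.
ctf_info = {
    "name": "h4ckinggame",
    "url": "https://h4ckingga.me",
    "token": "<CTFd API access token>",
    "directory": "./h4ckinggame",
}

with open("ctf.json", "w") as f:
    json.dump(ctf_info, f, indent=4)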

File Structure

Load

Before

.
└── ctf.json # contains basic information about CTF (refer to the sample folder)

After

.
├── ctf.json # add challenges information
└── challenges
    ├── pwn
    │   ├── challenge1
    │   │   ├── description.txt
    │   │   ├── file1
    │   │   └── file2
    │   └── challenge2
    │       ├── description.txt
    │       ├── file1
    │       └── file2
    └── rev
        ├── challenge1
        │   ├── description.txt
        │   ├── file1
        │   └── file2
        └── challenge2
            ├── description.txt
            ├── file1
            └── file2

Self Load

Before

. # empty directory; ctf.json is created by the crawler

After

.
├── ctf.json
└── challenges
    ├── pwn
    │   ├── challenge1
    │   │   ├── description.txt
    │   │   ├── file1
    │   │   └── file2
    │   └── challenge2
    │       ├── description.txt
    │       ├── file1
    │       └── file2
    └── rev
        ├── challenge1
        │   ├── description.txt
        │   ├── file1
        │   └── file2
        └── challenge2
            ├── description.txt
            ├── file1
            └── file2

Usage

from ctfd_crawling import CTFCrawler

crawler = CTFCrawler()

# Option 1: load the CTF information from a JSON file
crawler.load("./h4ckinggame.json")

# Option 2: supply the information directly (it is saved to a file automatically)
# crawler.self_load("h4ckinggame", "https://h4ckingga.me", "****************************************************************", "./h4ckinggame")

res = crawler.get_challenges()    # crawl the challenge list (also dumped into the JSON file)
crawler.download_challenges(res)  # download every challenge's files

You can choose either load or self_load: load reads the CTF information from a file, while self_load takes it from arguments in code (and saves it to a file for later runs).


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
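
If you just want to use the package, it can normally be installed from PyPI instead of downloading these files manually (assuming the project name matches the distribution files listed below):

pip install ctfd_crawler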

Source Distribution

ctfd_crawler-0.0.5.tar.gz (5.2 kB)


Built Distribution

CTFd_Crawler-0.0.5-py3-none-any.whl (5.7 kB)


File details

Details for the file ctfd_crawler-0.0.5.tar.gz.

File metadata

  • Download URL: ctfd_crawler-0.0.5.tar.gz
  • Upload date:
  • Size: 5.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.9

File hashes

Hashes for ctfd_crawler-0.0.5.tar.gz
Algorithm Hash digest
SHA256 a411e06a0a7acf3623899332ace10571384cedaab17868d41947365621df9b2b
MD5 d3140c076a513aae21ffcf35c14f542b
BLAKE2b-256 60c753af6a1c995ddf1dc7de885f037fc438cb38af22f1b08e651be9839b6da4

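As a quick local check, the downloaded sdist can be compared against the SHA256 digest listed above using Python's standard hashlib module:

import hashlib

# Compare the downloaded sdist against the SHA256 digest listed above.
expected = "a411e06a0a7acf3623899332ace10571384cedaab17868d41947365621df9b2b"

with open("ctfd_crawler-0.0.5.tar.gz", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()

print("OK" if actual == expected else "hash mismatch")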

File details

Details for the file CTFd_Crawler-0.0.5-py3-none-any.whl.

File metadata

File hashes

Hashes for CTFd_Crawler-0.0.5-py3-none-any.whl
Algorithm Hash digest
SHA256 40a622902e2918c459aa24333a9315b2557e102d3ca16a8be62a229ba6546366
MD5 e49488dbc5abdb2979a54c462055ec31
BLAKE2b-256 e80ff2146b47c9a3f7a4eaf182a64f000586508ddc1cf8208b763441bf5d9658

