Project description

CTFd-Crawler

Overview

CTFd-Crawler is a tool designed to efficiently manage and download CTF challenge files. It organizes downloads into categories, supports multi-threaded downloads for speed enhancement, logs all activities, and stores metadata in JSON format.

Features

  1. Downloads
    1. Organizes downloaded files into subdirectories based on challenge categories.
    2. Uses multi-threading to download multiple files simultaneously, improving overall download speed (see the sketch after this list).
    3. Creates a detailed log of the download process, including any errors and warnings.
    4. Shows detailed download progress.
  2. Saves each challenge's description to description.txt inside that challenge's directory.
  3. Directory rules
    1. If a challenge's contents come in a directory, another directory is created (with a number appended).
    2. If the contents are not in a directory, the files are saved directly in the challenge directory.
    3. Any spaces in file or folder names are replaced with underscores.
  4. Two options for loading CTF information
    1. Load from a file (the download directory has to be set when using the crawler; if not, it defaults to the current directory).
    2. Load from user input (not recommended; the input is automatically saved to a file).
  5. All information about the CTF (name, token, URL, download location) is saved to a file in JSON format for convenient access.
  6. All crawled challenges are dumped into that file as well.
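
The multi-threaded download described in the first feature can be pictured with a short sketch. This is only an illustration under assumed details, not the crawler's actual code: the download_file helper, the requests dependency, and the worker count are all assumptions.

from concurrent.futures import ThreadPoolExecutor, as_completed
import requests

def download_file(url, path):
    # stream a single challenge file to disk
    response = requests.get(url, stream=True)
    response.raise_for_status()
    with open(path, "wb") as f:
        for chunk in response.iter_content(chunk_size=8192):
            f.write(chunk)

def download_all(files, max_workers=8):
    # files: iterable of (url, local_path) pairs; downloads run in parallel threads
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(download_file, url, path): path for url, path in files}
        for future in as_completed(futures):
            future.result()  # re-raise any download error so it can be logged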

File Structure

Load

Before

.
└── ctf.json # contains basic information about CTF (refer to the sample folder)
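
A minimal ctf.json might look like the sketch below. The key names are an assumption based on the fields listed under Features (name, token, URL, download location); refer to the sample folder for the authoritative format. The values reuse the example CTF from the Usage section.

{
    "name": "h4ckinggame",
    "url": "https://h4ckingga.me",
    "token": "<your CTFd access token>",
    "location": "./h4ckinggame"
}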

After

.
├── ctf.json # add challenges information
└── challenges
    ├── pwn
    │   ├── challenge1
    │   │   ├── description.txt
    │   │   ├── file1
    │   │   └── file2
    │   └── challenge2
    │       ├── description.txt
    │       ├── file1
    │       └── file2
    └── rev
        ├── challenge1
        │   ├── description.txt
        │   ├── file1
        │   └── file2
        └── challenge2
            ├── description.txt
            ├── file1
            └── file2

Self Load

Before

.

After

.
├── ctf.json
└── challenges
    ├── pwn
    │   ├── challenge1
    │   │   ├── description.txt
    │   │   ├── file1
    │   │   └── file2
    │   └── challenge2
    │       ├── description.txt
    │       ├── file1
    │       └── file2
    └── rev
        ├── challenge1
        │   ├── description.txt
        │   ├── file1
        │   └── file2
        └── challenge2
            ├── description.txt
            ├── file1
            └── file2

Usage

from ctfd_crawling import CTFCrawler

crawler = CTFCrawler()
crawler.load("./h4ckinggame.json")  # load CTF information from h4ckinggame.json
# crawler.self_load("h4ckinggame", "https://h4ckingga.me", "****************************************************************", "./h4ckinggame")
print("load")
res = crawler.get_challenges()  # crawl the challenge list from the CTFd instance
print("get_challenges")
crawler.download_challenges(res)  # download files and descriptions for every challenge

You can choose one of two options, load and self_load: load reads the CTF information from a file, while self_load takes it from user input and then saves it to a file for later use.
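
After a run, the same JSON file also holds the crawled challenge data, so it can be inspected with the standard library. The "challenges" key below is an assumption about how the dump is laid out; check the generated file for the actual structure.

import json

# read the file written by the crawler and list the crawled challenges
with open("./h4ckinggame.json") as f:
    ctf = json.load(f)

for challenge in ctf.get("challenges", []):  # "challenges" key is assumed
    print(challenge)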

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ctfd_crawler-0.1.1.tar.gz (5.2 kB)

Built Distribution

CTFd_Crawler-0.1.1-py3-none-any.whl (5.8 kB)

File details

Details for the file ctfd_crawler-0.1.1.tar.gz.

File metadata

  • Download URL: ctfd_crawler-0.1.1.tar.gz
  • Size: 5.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

Hashes for ctfd_crawler-0.1.1.tar.gz

  • SHA256: a0660ff34bf228e22722ed018ab023af953922a4a645e62b8882e246ced1c5e2
  • MD5: 17a3df432533954cb1f5259645ef17a4
  • BLAKE2b-256: abc5a491ac640561502fa3441de9f50abe13253e1fe6d505baf889e7d979b9f2

File details

Details for the file CTFd_Crawler-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: CTFd_Crawler-0.1.1-py3-none-any.whl
  • Size: 5.8 kB
  • Tags: Python 3

File hashes

Hashes for CTFd_Crawler-0.1.1-py3-none-any.whl

  • SHA256: 8ccd4e291b4e274fa92db21577687703d41ca40bdd402565c741de6554069bc7
  • MD5: 1d3466aef893e45e5112918cf0a61511
  • BLAKE2b-256: 6c9584c9f8043ef2a91ed40a883e4c48ad1f3a4a44cd94534707cff7bc468d2d
