

Project description

CTFd-Crawler

Overview

CTFd-Crawler is a tool designed to efficiently manage and download CTF challenge files. It organizes downloads into category-based subdirectories, uses multi-threading to speed up downloads, logs all activity, and stores challenge metadata in JSON format.

Features

  1. Downloads
    1. Organizes downloaded files into subdirectories based on challenge categories.
    2. Uses multi-threading to download multiple files simultaneously, improving overall download speed.
    3. Writes a detailed log of the download process, including any errors and warnings.
    4. Shows detailed download progress.
  2. Saves each challenge's description to description.txt in that challenge's directory.
  3. Directory rules
    1. If a challenge's contents come packed in a directory, an additional directory is created (with a number appended to the name).
    2. If the contents are not in a directory, the files are saved directly into the challenge directory.
    3. Any spaces in file or folder names are replaced with underscores.
  4. There are two options for loading information about the CTF.
    1. Load from a file (the directory has to be set when using the crawler; if not, it defaults to the current directory).
    2. Load from user input (not recommended; the input is automatically saved to a file).
  5. All information about the CTF (name, token, URL, download location) is saved to a file in JSON format for convenient access (see the sketch after this list), and all crawled challenges are dumped into that file as well.
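
The exact layout of ctf.json is defined by the sample folder mentioned below, so the following is only a minimal sketch: the field names (name, url, token, directory) are assumptions mirroring the self_load arguments shown in the Usage section.

import json

# Assumed field names; the real sample file may use different keys.
ctf_info = {
    "name": "h4ckinggame",
    "url": "https://h4ckingga.me",
    "token": "<your CTFd access token>",
    "directory": "./h4ckinggame",
}

with open("ctf.json", "w") as f:
    json.dump(ctf_info, f, indent=4)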

File Structure

Load

Before

.
└── ctf.json # contains basic information about CTF (refer to the sample folder)

After

.
├── ctf.json # add challenges information
└── challenges
    ├── pwn
    │   ├── challenge1
    │   │   ├── description.txt
    │   │   ├── file1
    │   │   └── file2
    │   └── challenge2
    │       ├── description.txt
    │       ├── file1
    │       └── file2
    └── rev
        ├── challenge1
        │   ├── description.txt
        │   ├── file1
        │   └── file2
        └── challenge2
            ├── description.txt
            ├── file1
            └── file2

Self Load

Before

.

After

.
├── ctf.json
└── challenges
    ├── pwn
    │   ├── challenge1
    │   │   ├── description.txt
    │   │   ├── file1
    │   │   └── file2
    │   └── challenge2
    │       ├── description.txt
    │       ├── file1
    │       └── file2
    └── rev
        ├── challenge1
        │   ├── description.txt
        │   ├── file1
        │   └── file2
        └── challenge2
            ├── description.txt
            ├── file1
            └── file2

Usage

from ctfd_crawling import CTFCrawler

crawler = CTFCrawler()
crawler.load("./h4ckinggame.json") # load CTF information from h4ckinggame.json
# crawler.self_load("h4ckinggame", "https://h4ckingga.me", "****************************************************************", "./h4ckinggame")
print("load")
res = crawler.get_challenges() # crawl all challenges (also dumped into the JSON file)
print("get_challenges")
crawler.download_challenges(res) # download every challenge's files into category subdirectories

You can choose one of two options, load and self_load: load reads the CTF information from a file, while self_load takes it from arguments and automatically saves it to a file.
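
The crawler's multi-threaded download logic lives inside the package, but the idea behind feature 1.2 can be sketched roughly as follows. This is only an illustration under assumptions, not the library's actual implementation: the file list, URLs, and the download_file helper are hypothetical, and it assumes the requests package is installed.

import os
import requests
from concurrent.futures import ThreadPoolExecutor, as_completed

def download_file(url, dest_path):
    # Stream one challenge file to disk and return where it was saved.
    os.makedirs(os.path.dirname(dest_path), exist_ok=True)
    with requests.get(url, stream=True, timeout=30) as resp:
        resp.raise_for_status()
        with open(dest_path, "wb") as f:
            for chunk in resp.iter_content(chunk_size=8192):
                f.write(chunk)
    return dest_path

# Hypothetical (category/challenge, file URL) pairs taken from crawled challenge data.
files = [
    ("pwn/challenge1", "https://h4ckingga.me/files/aaaa/file1"),
    ("rev/challenge1", "https://h4ckingga.me/files/bbbb/file1"),
]

# Download several files at once, mirroring the multi-threaded download feature.
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [
        pool.submit(download_file, url,
                    os.path.join("challenges", subdir, url.rsplit("/", 1)[-1]))
        for subdir, url in files
    ]
    for fut in as_completed(futures):
        print("downloaded", fut.result())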

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ctfd_crawler-0.0.8.tar.gz (5.2 kB)

Uploaded Source

Built Distribution

CTFd_Crawler-0.0.8-py3-none-any.whl (5.9 kB)

Uploaded Python 3

File details

Details for the file ctfd_crawler-0.0.8.tar.gz.

File metadata

  • Download URL: ctfd_crawler-0.0.8.tar.gz
  • Upload date:
  • Size: 5.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

Hashes for ctfd_crawler-0.0.8.tar.gz
Algorithm Hash digest
SHA256 daf91786934976a8da04b37ad6d1fc96ec1628dbfc5dd9c12a1d740be291364b
MD5 37e09cd92e1b23d223481c75840dfd6e
BLAKE2b-256 4370daeae1af2f2870c0cd4b4c76157323a9e8a76c1b763f77d0b353c63f53e7

See more details on using hashes here.
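
If you want to check a downloaded archive against the digests above, a few lines of Python are enough; the local filename is assumed to match the distribution name.

import hashlib

# SHA256 digest listed above for ctfd_crawler-0.0.8.tar.gz
expected = "daf91786934976a8da04b37ad6d1fc96ec1628dbfc5dd9c12a1d740be291364b"

with open("ctfd_crawler-0.0.8.tar.gz", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()

print("OK" if actual == expected else "hash mismatch")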

File details

Details for the file CTFd_Crawler-0.0.8-py3-none-any.whl.

File metadata

File hashes

Hashes for CTFd_Crawler-0.0.8-py3-none-any.whl
Algorithm Hash digest
SHA256 d54fa8055d5f19d6782be521e34d5fb76e1789468aecc1bba197e657e8f17776
MD5 a56ec9aa06051cc2df41afa6e5e31483
BLAKE2b-256 6eaf1c2b32bc9b13b7deee6f274ff905f1b00cec64722daf1812e0bf6c989ebf

See more details on using hashes here.
