

Project description

CTFd-Crawler

Overview

CTFd-Crawler is a tool designed to efficiently manage and download CTF challenge files. It organizes downloads into categories, supports multi-threaded downloads for speed enhancement, logs all activities, and stores metadata in JSON format.

Features

  1. Downloads
    1. Organizes downloaded files into subdirectories based on challenge categories.
    2. Uses multi-threading to download multiple files simultaneously, improving overall download speed.
    3. Creates a detailed log of the download process, including any errors and warnings.
    4. Shows detailed download progress.
  2. Saves each challenge's description to a description.txt file in its challenge directory.
  3. Directory rules (a sketch illustrating them follows this list):
    1. If a challenge's contents come packaged in a directory, an additional numbered directory is created for them.
    2. If the contents are not in a directory, the files are saved directly in the challenge directory.
    3. Spaces in file and folder names are replaced with underscores.
  4. Two ways to load information about a CTF:
    1. Load from a file (set the directory when using the crawler; otherwise it defaults to the current directory).
    2. Load from user input (not recommended; the input is automatically saved to a file).
  5. All information about the CTF (name, token, URL, download location) is saved to a JSON file for convenient access.
  6. All crawled challenges are dumped into that file as well.
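To illustrate the directory rules and the multi-threaded downloads, here is a minimal sketch using Python's concurrent.futures. The function names, URLs, and file names below are hypothetical and are not part of the CTFd-Crawler API.

import os
import re
import requests
from concurrent.futures import ThreadPoolExecutor

def sanitize(name):
    # Spaces in file and folder names are replaced with underscores.
    return re.sub(r"\s+", "_", name.strip())

def download_file(url, category, challenge, filename, base="challenges"):
    # Files are organized as <base>/<category>/<challenge>/<file>.
    target_dir = os.path.join(base, sanitize(category), sanitize(challenge))
    os.makedirs(target_dir, exist_ok=True)
    path = os.path.join(target_dir, sanitize(filename))
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    with open(path, "wb") as f:
        f.write(response.content)
    return path

# A thread pool downloads several files simultaneously.
jobs = [
    ("https://ctfd.example/files/1/file1", "pwn", "challenge1", "file1"),
    ("https://ctfd.example/files/2/file1", "rev", "challenge2", "file1"),
]
with ThreadPoolExecutor(max_workers=4) as pool:
    for saved in pool.map(lambda job: download_file(*job), jobs):
        print("saved", saved)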

File Structure

Load

Before

.
└── ctf.json # contains basic information about CTF (refer to the sample folder)
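As a rough sketch, the basic information in ctf.json might look like the following. The exact key names here are an assumption; refer to the sample folder for the real schema.

import json

# Hypothetical ctf.json contents -- the key names are assumptions, not the
# actual schema shipped in the sample folder.
ctf_info = {
    "name": "test_ctf",
    "url": "https://ctfd.based/site",
    "token": "****************************************************************",
    "location": "./test_ctf",
}

with open("ctf.json", "w") as f:
    json.dump(ctf_info, f, indent=4)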

After

.
├── ctf.json # challenge information added by the crawler
└── challenges
    ├── pwn
    │   ├── challenge1
    │   │   ├── description.txt
    │   │   ├── file1
    │   │   └── file2
    │   └── challenge2
    │       ├── description.txt
    │       ├── file1
    │       └── file2
    └── rev
        ├── challenge1
        │   ├── description.txt
        │   ├── file1
        │   └── file2
        └── challenge2
            ├── description.txt
            ├── file1
            └── file2

Self Load

Before

.

After

.
├── ctf.json
└── archive
    ├── pwn
    │   ├── challenge1
    │   │   ├── description.txt
    │   │   ├── file1
    │   │   └── file2
    │   └── challenge2
    │       ├── description.txt
    │       ├── file1
    │       └── file2
    └── rev
        ├── challenge1
        │   ├── description.txt
        │   ├── file1
        │   └── file2
        └── challenge2
            ├── description.txt
            ├── file1
            └── file2

Usage

from CTFd_Crawler import CTFCrawler

crawler = CTFCrawler()

# Option 1: provide the CTF information directly (it is saved to a JSON file automatically).
# crawler.self_load("test_ctf", "https://ctfd.based/site", "****************************************************************", "./test_ctf")

# Option 2: load the CTF information from an existing JSON file.
crawler.load("./test_ctf.json")
print("load")
print(crawler.important)

# Fetch the challenge list from the CTFd instance and dump it into the JSON file.
res = crawler.get_challenges()
print("get_challenges")

# Download all challenge files into category subdirectories.
crawler.download_challenges()

You can choose either load or self_load: load reads the CTF information from a file, while self_load takes it from user input (and saves it to a file automatically).

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ctfd_crawler-0.1.3.tar.gz (5.2 kB)

Uploaded Source

Built Distribution

CTFd_Crawler-0.1.3-py3-none-any.whl (5.8 kB)

Uploaded Python 3

File details

Details for the file ctfd_crawler-0.1.3.tar.gz.

File metadata

  • Download URL: ctfd_crawler-0.1.3.tar.gz
  • Upload date:
  • Size: 5.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

Hashes for ctfd_crawler-0.1.3.tar.gz
Algorithm Hash digest
SHA256 2cc2c9f7bea18b77f964003eb9ac7054001816e0e72c2cb2cd5fb96b293b37b2
MD5 e3cfe64f3f423d39dd7d69a24cad9283
BLAKE2b-256 b3c1fea40344bc104cd146349c917688c5a9bd8e1c065d4cb708434fb9ce253b

See more details on using hashes here.
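If you want to verify a downloaded archive against the hashes above, one quick way (not specific to this project) is Python's hashlib:

import hashlib

# Compare the SHA256 of the downloaded sdist with the value published above.
expected = "2cc2c9f7bea18b77f964003eb9ac7054001816e0e72c2cb2cd5fb96b293b37b2"
with open("ctfd_crawler-0.1.3.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print("OK" if digest == expected else "hash mismatch")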

File details

Details for the file CTFd_Crawler-0.1.3-py3-none-any.whl.

File metadata

File hashes

Hashes for CTFd_Crawler-0.1.3-py3-none-any.whl
Algorithm Hash digest
SHA256 a4463b5c2c465abb247c2d51fa751f9cac143f650d897ab81aacc8cfb3981d78
MD5 2dd0563ba1d477145b353c1c04c60bae
BLAKE2b-256 479d5b4383fd898dee7283199b84732d5f57330e796b33f19dd00db9ff9748a2

See more details on using hashes here.
