
crawldown

CI · PyPI version · License: MIT · Python 3.10+

Crawl any website and save every page as organized Markdown files.

crawldown mirrors a website's URL structure into a local directory of .md files — perfect for archiving documentation, feeding content into RAG pipelines, or reading offline.

crawldown https://docs.example.com --output ./docs-mirror
docs-mirror/
├── index.md
├── getting-started/
│   ├── index.md
│   └── installation.md
└── api/
    ├── reference.md
    └── authentication.md

Installation

pip install crawldown

Or with uv:

uv tool install crawldown

Quickstart

CLI

# Crawl an entire site
crawldown https://docs.example.com --output ./output

# Limit crawl depth
crawldown https://docs.example.com --output ./output --depth 2

# Add a delay between requests (seconds)
crawldown https://docs.example.com --output ./output --delay 0.5

# Skip robots.txt enforcement
crawldown https://docs.example.com --output ./output --no-robots

Python API

import asyncio
from crawldown import crawl

asyncio.run(crawl("https://docs.example.com", output_dir="./output"))

With options:

import asyncio
from crawldown import crawl
from crawldown.models import CrawlConfig

config = CrawlConfig(
    url="https://docs.example.com",
    output_dir="./output",
    max_depth=3,
    delay=0.5,
    respect_robots=True,
)

asyncio.run(crawl(config))

How it works

  1. Starts at the given URL and fetches the page using crawl4ai (handles JavaScript-rendered pages).
  2. Extracts all links that stay within the same domain and URL prefix.
  3. Converts each page to Markdown and saves it at a path matching the URL structure.
  4. Repeats for every discovered link up to max_depth (default: unlimited).
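The link-scoping rule (step 2) and URL-to-path mapping (step 3) can be sketched in plain Python. This is an illustrative approximation under simple assumptions, not crawldown's actual implementation; the helper names `in_scope` and `url_to_path` are invented for the example:

```python
from urllib.parse import urlparse

def in_scope(url: str, root: str) -> bool:
    """Step 2: keep only links on the same domain and under the same URL prefix."""
    u, r = urlparse(url), urlparse(root)
    return u.netloc == r.netloc and u.path.startswith(r.path)

def url_to_path(url: str, root: str) -> str:
    """Step 3: map a URL to a relative .md path mirroring its structure."""
    path = urlparse(url).path
    rel = path[len(urlparse(root).path):].strip("/")
    if not rel:
        return "index.md"          # the root page becomes index.md
    if path.endswith("/"):
        return f"{rel}/index.md"   # directory-style URLs get their own index.md
    return f"{rel}.md"

root = "https://docs.example.com/"
print(in_scope("https://docs.example.com/api/reference", root))    # True
print(in_scope("https://other.example.com/api", root))             # False
print(url_to_path("https://docs.example.com/api/reference", root)) # api/reference.md
print(url_to_path("https://docs.example.com/getting-started/", root))  # getting-started/index.md
```

A real crawler would additionally normalize URLs (fragments, query strings, trailing slashes) before deduplicating them against the visited set.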

Options

| Flag | Default | Description |
| --- | --- | --- |
| `--output`, `-o` | `./crawldown-output` | Directory to save Markdown files |
| `--depth`, `-d` | unlimited | Max link-follow depth |
| `--delay` | `0.0` | Seconds to wait between requests |
| `--no-robots` | off | Ignore `robots.txt` |
| `--version` | | Show version and exit |

Contributing

We welcome contributions of all kinds. See CONTRIBUTING.md for how to get started.


License

MIT — see LICENSE.

Download files

Download the file for your platform.

Source Distribution

crawldown-0.1.0.tar.gz (267.7 kB)

Uploaded Source

Built Distribution


crawldown-0.1.0-py3-none-any.whl (9.5 kB)

Uploaded Python 3

File details

Details for the file crawldown-0.1.0.tar.gz.

File metadata

  • Download URL: crawldown-0.1.0.tar.gz
  • Size: 267.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for crawldown-0.1.0.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | f8e10ee7278d6ade06d1253958c41ba278f7c0ff468bc8ef93444c44914c0855 |
| MD5 | 57abcd9b481665635b97a5906ae5dc77 |
| BLAKE2b-256 | f60747d9c52e75d24cae6933128abadda0a4b48715639b4bbbc1a14e62d46f9b |


Provenance

The following attestation bundles were made for crawldown-0.1.0.tar.gz:

Publisher: release.yml on danilotpnta/crawldown


File details

Details for the file crawldown-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: crawldown-0.1.0-py3-none-any.whl
  • Size: 9.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for crawldown-0.1.0-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 279070ac24c15bda1d7e677631550cdba01b26c23998b7e9c683618406f05dc0 |
| MD5 | 91bc1506f05ab0a3b03f4843291905ed |
| BLAKE2b-256 | 1fcc146910475ab3f7fbe28f1c192b060a16a8ee2de55fa43c928b60d8a154a6 |


Provenance

The following attestation bundles were made for crawldown-0.1.0-py3-none-any.whl:

Publisher: release.yml on danilotpnta/crawldown

