crdb-dump

A CLI tool to export and import schema definitions and data from CockroachDB in SQL, JSON, YAML, or chunked CSV formats.

Supports chunking, parallelism, resumability, diffing, manifest checksums, BYTES and UUID types, TLS auth, and dry-run safety.


🚀 Features

  • Export tables, views, sequences, and user-defined types
  • Output formats: SQL, JSON, YAML, CSV (with optional gzip)
  • Export BYTES columns as decode('<hex>', 'hex') literals (see the example after this list)
  • Handles UUID, TIMESTAMP, and array types
  • Create per-table schema files or a unified schema file
  • Parallel + chunked data export with manifest and row tracking
  • Resumable COPY-based data import
  • Schema + data dry-run mode
  • Schema diffing against previous .sql
  • CLI output + logging to logs/
  • TLS certs or insecure connection supported
  • --print-connection shows full resolved DB URL (safe)
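
For example, a row with a UUID primary key and a BYTES column comes out in SQL format along these lines (illustrative values, not captured tool output):

INSERT INTO users (id, token) VALUES ('a0eebc99-9c0b-4ef8-bb6d-6bb9bd380a11', decode('deadbeef', 'hex'));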

🔧 Installation

pip install crdb-dump
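
To confirm the install, print the built-in help (the CLI is Click-based, so --help is available on the top-level command and each subcommand):

crdb-dump --help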

🧪 Local Testing

Run an integration test:

./test-local.sh

This performs:

  • Schema + data export (with BYTES, UUID)
  • Chunked CSV manifest creation
  • Dry-run import
  • Full schema and data reload

📋 Usage

crdb-dump export --db=mydb [options]
crdb-dump load --db=mydb --schema=<.sql> --data-dir=...

🔐 Connection Options

export CRDB_URL="cockroachdb://root@localhost:26257/defaultdb?sslmode=disable"

or use flags:

--db mydb --host localhost --certs-dir ~/certs
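
For example, a TLS-secured export might combine these flags with --print-connection to confirm the resolved URL (a sketch using only the flags documented here):

crdb-dump export --db mydb --host localhost --certs-dir ~/certs --print-connection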

🏠 Export Options

crdb-dump export --db=mydb --data --data-format=csv --chunk-size=1000
Option               Description
--data               Enable data export
--data-format        Output format: csv or sql
--data-compress      Write gzip-compressed .csv.gz output
--chunk-size         Split data into fixed-row chunks
--per-table          Write per-table files
--data-order         Column to order rows by (e.g., id)
--data-order-desc    Order rows in descending order
--data-parallel      Export tables in parallel
--verify             Verify manifest SHA256 checksums
--print-connection   Show the resolved DB connection URL
--archive            Create a .tar.gz archive of the exported folder
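
Combining several of the options above, a compressed, ordered, parallel export might look like this (a sketch; it assumes --data-order takes a column name, as the table suggests):

crdb-dump export \
  --db=mydb \
  --data \
  --data-format=csv \
  --data-compress \
  --chunk-size=5000 \
  --data-order=id \
  --data-parallel \
  --verify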

🛬 Load Options

crdb-dump load \
  --db=mydb \
  --schema=defaultdb_schema.sql \
  --data-dir=export/defaultdb \
  --resume-log=resume.json \
  --print-connection \
  --dry-run
Option               Description
--schema             Load schema from a .sql file
--data-dir           Directory containing chunked CSVs and manifests
--resume-log         Resume-tracking file for a chunked load
--dry-run            Print the plan without executing anything
--include-tables     Restrict the load to specific table names
--exclude-tables     Skip specific table names
--print-connection   Print the resolved CockroachDB connection URL
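
To reload only part of an export, combine the table filters with --dry-run first (a sketch; it assumes --include-tables accepts a comma-separated list, so check crdb-dump load --help for the exact syntax):

crdb-dump load \
  --db=mydb \
  --schema=defaultdb_schema.sql \
  --data-dir=export/defaultdb \
  --include-tables=users,logins \
  --dry-run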

📂 Output Structure

By default, output is stored under:

crdb_dump_output/<db_name>/
├── defaultdb_schema.sql
├── table_users.sql
├── users_chunk_001.csv
├── users.manifest.json
├── logins_chunk_001.csv
└── logins.manifest.json

All logs go to:

logs/crdb_dump.log

📄 Example: Full Export + Verify + Import

crdb-dump export \
  --db=defaultdb \
  --data \
  --data-format=csv \
  --chunk-size=1000 \
  --per-table \
  --verify \
  --archive \
  --print-connection

crdb-dump load \
  --db=defaultdb \
  --schema=crdb_dump_output/defaultdb/defaultdb_schema.sql \
  --data-dir=crdb_dump_output/defaultdb \
  --resume-log=resume.json \
  --print-connection
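
The --verify flag checks chunk checksums against each table's manifest during export. You can also spot-check a chunk by hand with a standard hashing tool and compare the digest against the manifest entry (paths follow the output layout shown earlier):

shasum -a 256 crdb_dump_output/defaultdb/users_chunk_001.csv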

🔍 Schema Diffing

crdb-dump export \
  --db=defaultdb \
  --diff=previous_schema.sql

This prints a unified diff and writes it to:

crdb_dump_output/<db_name>/<db_name>_schema.diff
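
A typical workflow keeps the previous export around as the baseline (a sketch using the default output paths shown above):

# Capture a baseline schema
crdb-dump export --db=defaultdb
cp crdb_dump_output/defaultdb/defaultdb_schema.sql previous_schema.sql

# Later, after schema changes, diff against the baseline
crdb-dump export --db=defaultdb --diff=previous_schema.sql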

🤖 Test Coverage

  • pytest -m unit – runs fast unit tests
  • pytest -m integration – full Docker-based test
  • ./test-local.sh – end-to-end data roundtrip

🛠️ Developer Notes

  • Configured via pyproject.toml (PEP 621)
  • Click-based CLI
  • Tested with CRDB v25.2
  • CI runs all tests via GitHub Actions and Docker

👤 Author

Created by Virag Tripathi. Released under the MIT License.

Download files

Download the file for your platform.

Source Distribution

crdb_dump-0.2.0.tar.gz (14.1 kB)

Built Distribution

crdb_dump-0.2.0-py3-none-any.whl (15.4 kB)

File details

Details for the file crdb_dump-0.2.0.tar.gz.

File metadata

  • Download URL: crdb_dump-0.2.0.tar.gz
  • Size: 14.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.9.6

File hashes

Hashes for crdb_dump-0.2.0.tar.gz
Algorithm     Hash digest
SHA256        3b924dd9db9d48e53c8fb144dd99b087f64ef8f8b7c2222825e31562e518aba9
MD5           6027836d732303e00e900fa572688250
BLAKE2b-256   d78d04b346c144b1301a2731871c85fd168438cc5d1bbfe3a555445e52a60c41


File details

Details for the file crdb_dump-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: crdb_dump-0.2.0-py3-none-any.whl
  • Size: 15.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.9.6

File hashes

Hashes for crdb_dump-0.2.0-py3-none-any.whl
Algorithm     Hash digest
SHA256        08d4e6e49d5ca4c2acd56f48f227bc53cc4676cee51fa40af185f95eb7f9ca51
MD5           d533172169fd927ebb2cad9806f82034
BLAKE2b-256   8de3387c76cc49e70cad31e51f2e460966f7a05693e1c45b641e4882c7e21333

