
Dorgy


AI‑assisted CLI to keep growing collections of files tidy. Organize folders with safe renames/moves and undo, watch directories for changes, and search collections with substring or semantic queries — all powered by portable per‑collection state.

What It Does

Before (a messy folder):

my_docs/
  IMG_0234.jpg
  Scan_001.pdf
  taxes.txt
  contract_final_FINAL.docx
  notes (1).txt
  2023-05-07 14.23.10.png
  invoice.pdf

After (organized by category/date with safe renames, hyphenated lower‑case folders):

my_docs/
  .dorgy/                     # state, history, search index, logs
  documents/
    contracts/
      Employment Agreement (2023-06-15).pdf
    taxes/
      2023/
        Tax Notes.txt
  photos/
    2023/05/
      2023-05-07 14-23-10.png
  invoices/
    2023/
      ACME - April.pdf

Exact destinations depend on your config and prompts; all moves are reversible via dorgy undo using the state in .dorgy.

Installation

PyPI (recommended)

pip install dorgy

From source (contributors)

git clone https://github.com/bryaneburr/dorgy.git
cd dorgy

# Optional: install dev dependencies
uv sync --extra dev

# Optional: editable install
uv pip install -e .

Getting Started

# Inspect available commands
dorgy --help

# Organize a directory in place (dry run first)
dorgy org ./documents --dry-run
dorgy org ./documents

# Monitor a directory and emit JSON batches
dorgy watch ./inbox --json --once

# Undo the latest plan
dorgy undo ./documents --dry-run
dorgy status ./documents --json

See the docs for guides on Organize, Watch, Search, Move/Undo, and configuration details.

Configuring LLM access

Set language model credentials and defaults via dorgy config commands or the YAML file at ~/.dorgy/config.yaml. Important fields include:

  • llm.model — full LiteLLM/DSPy model identifier (e.g., openai/gpt-4o-mini, openrouter/gpt-4.1).
  • llm.api_key — API token for the selected provider (keep this in environment variables for security, e.g., export DORGY__LLM__API_KEY=...).
  • llm.api_base_url — optional custom gateway URL (useful for openrouter, proxies, or self-hosted backends).
  • llm.temperature / llm.max_tokens — sampling parameters that shape response creativity and length.
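These dotted keys map onto a nested YAML structure (the same nesting the DORGY__SECTION__KEY environment scheme mirrors). As a rough sketch of what ~/.dorgy/config.yaml could look like — values here are illustrative placeholders, and the authoritative schema lives in the Configuration guide:

```yaml
llm:
  model: openai/gpt-4o-mini                # LiteLLM/DSPy model identifier
  api_key: sk-example                      # prefer DORGY__LLM__API_KEY in the environment
  api_base_url: https://api.openai.com/v1  # optional custom gateway or proxy
  temperature: 0.2                         # lower values keep naming decisions predictable
  max_tokens: 1024
```

Environment variables and CLI flags take precedence over this file; see the Configuration guide for the exact precedence order.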

To override values temporarily, export environment variables following the DORGY__SECTION__KEY scheme—for example:

export DORGY__LLM__MODEL="openai/gpt-4o-mini"
export DORGY__LLM__API_KEY="sk-example"
export DORGY__LLM__API_BASE_URL="https://api.openai.com/v1"

Then run CLI commands as usual (dorgy org, dorgy watch, etc.).

LLM Recommendations

We've tested dorgy with a number of LLMs and providers; the following perform well:

  • GPT-5
  • Gemini 2.5
  • openrouter/auto — an interesting choice if you route through OpenRouter.

Documentation

  • Published site: https://bryaneburr.github.io/dorgy/
  • Source: docs/ (MkDocs + shadcn)
  • Start with Getting Started → Quickstart and Configuration.
  • Configuration management is powered by Durango; see the Configuration guide for precedence details.

Contributing

We welcome issues and pull requests. See docs/development/contributing.md for environment setup, pre‑commit hooks, and CI guidance.

Local Workflow Helpers

This repository includes Invoke tasks that wrap our uv commands. After installing dependencies, run:

uv run invoke --list

Common tasks include:

  • uv run invoke sync — update the virtual environment (installs dev and docs extras by default).
  • uv run invoke ci — replicate the CI pipeline locally (lint, mypy, tests, docs).
  • uv run invoke docs-serve — launch the MkDocs server for live documentation previews.

License

Released under the MIT License. See LICENSE for details.
