Dorgy

AI‑assisted CLI to keep growing collections of files tidy. Organize folders with safe renames/moves and undo, watch directories for changes, and search collections with substring or semantic queries — all powered by portable per‑collection state.

What It Does

Before (a messy folder):

my_docs/
  IMG_0234.jpg
  Scan_001.pdf
  taxes.txt
  contract_final_FINAL.docx
  notes (1).txt
  2023-05-07 14.23.10.png
  invoice.pdf

After (organized by category/date with safe renames, hyphenated lower‑case folders):

my_docs/
  .dorgy/                     # state, history, search index, logs
  documents/
    contracts/
      Employment Agreement (2023-06-15).pdf
    taxes/
      2023/
        Tax Notes.txt
  photos/
    2023/05/
      2023-05-07 14-23-10.png
  invoices/
    2023/
      ACME - April.pdf

Exact destinations depend on your config and prompts; all moves are reversible via dorgy undo using the state in .dorgy.

Installation

PyPI (recommended)

pip install dorgy

From source (contributors)

git clone https://github.com/bryaneburr/dorgy.git
cd dorgy

# Optional: install dev dependencies
uv sync --extra dev

# Optional: editable install
uv pip install -e .

Getting Started

# Inspect available commands
dorgy --help

# Organize a directory in place (dry run first)
dorgy org ./documents --dry-run
dorgy org ./documents

# Monitor a directory and emit JSON batches
dorgy watch ./inbox --json --once

# Undo the latest plan
dorgy undo ./documents --dry-run

# Inspect collection state as JSON
dorgy status ./documents --json

See the docs for guides on Organize, Watch, Search, Move/Undo, and configuration details.

Configuring LLM Access

Set language model credentials and defaults via dorgy config commands or the YAML file at ~/.dorgy/config.yaml. Important fields include:

  • llm.model — full LiteLLM/DSPy model identifier (e.g., openai/gpt-4o-mini, openrouter/gpt-4.1).
  • llm.api_key — API token for the selected provider (keep this in environment variables for security, e.g., export DORGY__LLM__API_KEY=...).
  • llm.api_base_url — optional custom gateway URL (useful for openrouter, proxies, or self-hosted backends).
  • llm.temperature / llm.max_tokens — sampling parameters that shape response creativity and length.
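Put together, a minimal ~/.dorgy/config.yaml using these fields might look like the sketch below. The nesting is inferred from the dotted llm.* field names above, and the model identifier, gateway URL, and sampling values are illustrative placeholders, not recommendations:

```yaml
llm:
  model: openai/gpt-4o-mini                  # full LiteLLM/DSPy model identifier
  # api_key is better supplied via the DORGY__LLM__API_KEY environment variable
  api_base_url: https://api.openai.com/v1    # optional custom gateway URL
  temperature: 0.2                           # sampling creativity
  max_tokens: 1024                           # response length cap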

To override values temporarily, export environment variables following the DORGY__SECTION__KEY scheme—for example:

export DORGY__LLM__MODEL="openai/gpt-4o-mini"
export DORGY__LLM__API_KEY="sk-example"
export DORGY__LLM__API_BASE_URL="https://api.openai.com/v1"

Then run CLI commands as usual (dorgy org, dorgy watch, etc.).
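The DORGY__SECTION__KEY scheme splits the variable name on double underscores to address a config field. A minimal sketch of that mapping (illustrative only; dorgy's actual parsing is handled by Durango and may differ):

```python
def parse_override(name: str) -> tuple[str, str]:
    """Map an env var like DORGY__LLM__MODEL to a (section, key) pair."""
    prefix, section, key = name.split("__", 2)
    if prefix != "DORGY":
        raise ValueError(f"not a dorgy override: {name}")
    return section.lower(), key.lower()

# DORGY__LLM__API_BASE_URL targets the llm.api_base_url setting
print(parse_override("DORGY__LLM__API_BASE_URL"))  # ('llm', 'api_base_url')
```

Because the section and key are joined by a double underscore, single underscores inside a key (like api_base_url) survive intact.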

LLM Recommendations

We've tested dorgy with a number of LLMs and providers and found the following to perform well:

  • GPT-5
  • Gemini 2.5
  • openrouter/auto (if you use OpenRouter)

Documentation

  • Published site: https://bryaneburr.github.io/dorgy/
  • Source: docs/ (MkDocs + shadcn)
  • Start with Getting Started → Quickstart and Configuration.
  • Configuration management is powered by Durango; see the Configuration guide for precedence details.

Contributing

We welcome issues and pull requests. See docs/development/contributing.md for environment setup, pre‑commit hooks, and CI guidance.

Local Workflow Helpers

This repository includes Invoke tasks that wrap our uv commands. After installing dependencies, run:

uv run invoke --list

Common tasks include:

  • uv run invoke sync — update the virtual environment (installs dev and docs extras by default).
  • uv run invoke ci — replicate the CI pipeline locally (lint, mypy, tests, docs).
  • uv run invoke docs-serve — launch the MkDocs server for live documentation previews.

License

Released under the MIT License. See LICENSE for details.

