dir-assistant
Chat with your current directory's files using a local or API LLM.
Summary
dir-assistant is a CLI Python application, available through pip, that recursively indexes all text
files in the current working directory so you can chat with them using a local or API LLM. "Chat
with them" means that their contents are automatically included in the prompts sent to the LLM,
with the most contextually relevant files included first. dir-assistant is designed primarily for
use as a coding aid and automation tool.
Use Cases
This tool is primarily aimed at developers and technical users who need to:
- Quickly understand a large or unfamiliar codebase.
- Get explanations for specific functions, classes, or modules.
- Ask high-level questions like "What is the main purpose of this application?".
- Perform targeted, highly accurate updates in large corpora of text/code.
- Automate file modifications, analysis, refactoring, or documentation tasks.
Features
- Includes an interactive chat mode and a single prompt non-interactive mode.
- When enabled, it will automatically make file updates and commit to git.
- Local platform support for CPU (OpenBLAS), CUDA, ROCm, Metal, Vulkan, and SYCL.
- API support for all major LLM APIs. More info in the LiteLLM Docs.
- Uses CGRAG (Contextually Guided Retrieval-Augmented Generation), a unique method for finding the most important files to include when submitting your prompt to an LLM. You can read this blog post for more information about how it works.
- Automatically optimizes prompts for context caching to reduce cost and latency. Typical use cases see 50-90% cache hits.
New Features
- Switched from Euclidean distance to cosine similarity for artifact relevancy filtering. When upgrading, you will need to run dir-assistant clear.
- Added ARTIFACT_COSINE_CUTOFF and ARTIFACT_COSINE_CGRAG_CUTOFF to exclude artifacts with low cosine similarity.
- Updated support for the latest version of llama-cpp-python.
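As an illustration of the cosine-cutoff idea (this is not dir-assistant's actual implementation; the function and variable names here are hypothetical), a filter along the lines of the new ARTIFACT_COSINE_CUTOFF setting can be sketched as:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity scores direction rather than magnitude, so vector
    # length does not dominate the relevancy score the way Euclidean
    # distance can.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def filter_artifacts(query_vec, artifacts, cutoff):
    # Keep only artifacts whose embedding clears the cutoff, most similar
    # first. `artifacts` is a list of (name, embedding_vector) pairs.
    scored = [(cosine_similarity(query_vec, vec), name)
              for name, vec in artifacts]
    return [name for score, name in sorted(scored, reverse=True)
            if score >= cutoff]
```

With a cutoff of 0.5, an artifact pointing in nearly the same direction as the query is kept while an orthogonal one is excluded entirely, which is the behavior the new cutoff settings enable.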
Quickstart
This section contains recipes for running dir-assistant in a basic capacity to get you started quickly.
Local Model
To get started locally, you can download a default LLM. The default configuration with this model requires 3GB of memory on most hardware. You can adjust the configuration to fit higher or lower memory requirements. To run via CPU:
pip install dir-assistant[recommended]
dir-assistant models download-embed
dir-assistant models download-llm
cd directory/to/chat/with
dir-assistant
To run with hardware acceleration, use the platform subcommand:
...
dir-assistant platform cuda
cd directory/to/chat/with
dir-assistant
See which platforms are supported using -h:
dir-assistant platform -h
For Windows
Using dir-assistant directly with local LLMs on Windows is not recommended. This is because
llama-cpp-python requires a C compiler for installation via pip, and setting one up is not
as trivial on Windows as it is on other platforms. Instead, it is recommended to
use another LLM server such as LMStudio and configure dir-assistant to use it as
a custom API server. To do this, ensure you install dir-assistant without
the recommended dependencies:
pip install dir-assistant
Then configure dir-assistant to connect to your custom LLM API server:
Connecting to a Custom API Server
For instructions on setting up LMStudio to host an API, follow their guide:
https://lmstudio.ai/docs/app/api
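As a sketch, a custom-server configuration might look like the following. The [DIR_ASSISTANT] table and LITELLM_COMPLETION_OPTIONS structure mirror the Claude and OpenAI examples later in this document, but the api_base key, the model name, and the port are assumptions (LMStudio serves an OpenAI-compatible API on localhost:1234 by default); consult the configuration docs for the authoritative key names:

```toml
[DIR_ASSISTANT]
ACTIVE_MODEL_IS_LOCAL = false

[DIR_ASSISTANT.LITELLM_COMPLETION_OPTIONS]
# Replace with whatever model your local server is hosting.
model = "openai/your-local-model-name"
# LMStudio's default local endpoint.
api_base = "http://localhost:1234/v1"
```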
For Ubuntu 24.04
pip3 has been replaced with pipx starting in Ubuntu 24.04.
pipx install dir-assistant[recommended]
...
dir-assistant platform cuda --pipx
Gemini
To get started using an API model, you can use Google Gemini 2.5 Flash, which is currently free. To begin, you need to sign up for Google AI Studio and create an API key. After you create your API key, enter the following commands:
pip install dir-assistant
dir-assistant setkey GEMINI_API_KEY xxxxxYOURAPIKEYHERExxxxx
cd directory/to/chat/with
dir-assistant
For Windows
Note: The Python.org installer is recommended for Windows. The Windows
Store installer does not add dir-assistant to your PATH, so you will need to call it
with python -m dir_assistant if you decide to go that route.
pip install dir-assistant
dir-assistant setkey GEMINI_API_KEY xxxxxYOURAPIKEYHERExxxxx
cd directory/to/chat/with
dir-assistant
For Ubuntu 24.04
pip3 has been replaced with pipx starting in Ubuntu 24.04.
pipx install dir-assistant
dir-assistant setkey GEMINI_API_KEY xxxxxYOURAPIKEYHERExxxxx
cd directory/to/chat/with
dir-assistant
Claude
To get started quickly with Anthropic's Claude models:
- Obtain an API key from Anthropic.
- Install dir-assistant and set your API key:
pip install dir-assistant
dir-assistant setkey ANTHROPIC_API_KEY xxxxxYOURAPIKEYHERExxxxx
- Configure dir-assistant to use Claude. Open the config file with dir-assistant config open and make sure these settings are present:
[DIR_ASSISTANT]
ACTIVE_MODEL_IS_LOCAL = false
LITELLM_MODEL_USES_SYSTEM_MESSAGE = true
LITELLM_CONTEXT_SIZE = 200000
[DIR_ASSISTANT.LITELLM_COMPLETION_OPTIONS]
model = "anthropic/claude-sonnet-4-5-20250929"
- Navigate to your project directory and run:
cd directory/to/chat/with
dir-assistant
For Windows (Claude)
pip install dir-assistant
dir-assistant setkey ANTHROPIC_API_KEY xxxxxYOURAPIKEYHERExxxxx
# Then, configure the model as shown above using 'dir-assistant config open'
cd directory/to/chat/with
dir-assistant
For Ubuntu 24.04 (Claude)
pipx install dir-assistant
dir-assistant setkey ANTHROPIC_API_KEY xxxxxYOURAPIKEYHERExxxxx
# Then, configure the model as shown above using 'dir-assistant config open'
cd directory/to/chat/with
dir-assistant
OpenAI
To get started quickly with OpenAI's models:
- Obtain an API key from OpenAI.
- Install dir-assistant and set your API key:
pip install dir-assistant
dir-assistant setkey OPENAI_API_KEY xxxxxYOURAPIKEYHERExxxxx
- Configure dir-assistant to use an OpenAI model. Open the config file with dir-assistant config open and make sure these settings are present:
[DIR_ASSISTANT]
ACTIVE_MODEL_IS_LOCAL = false
LITELLM_MODEL_USES_SYSTEM_MESSAGE = true
LITELLM_CONTEXT_SIZE = 128000
[DIR_ASSISTANT.LITELLM_COMPLETION_OPTIONS]
model = "gpt-5"
- Navigate to your project directory and run:
cd directory/to/chat/with
dir-assistant
For Windows (OpenAI)
pip install dir-assistant
dir-assistant setkey OPENAI_API_KEY xxxxxYOURAPIKEYHERExxxxx
# Then, configure the model as shown above using 'dir-assistant config open'
cd directory/to/chat/with
dir-assistant
For Ubuntu 24.04 (OpenAI)
pipx install dir-assistant
dir-assistant setkey OPENAI_API_KEY xxxxxYOURAPIKEYHERExxxxx
# Then, configure the model as shown above using 'dir-assistant config open'
cd directory/to/chat/with
dir-assistant
Automation Usage
The non-interactive mode of dir-assistant allows you to create scripts which analyze
your files without user interaction.
To get started using an API model, you can use Google Gemini 2.5 Flash, which is currently free.
To begin, you need to sign up for Google AI Studio and
create an API key. After you create your API key,
enter the following commands:
pip install dir-assistant
dir-assistant setkey GEMINI_API_KEY xxxxxYOURAPIKEYHERExxxxx
cd directory/to/chat/with
dir-assistant -s "Describe the files in this directory"
For Ubuntu 24.04
pip3 has been replaced with pipx starting in Ubuntu 24.04.
pipx install dir-assistant
dir-assistant setkey GEMINI_API_KEY xxxxxYOURAPIKEYHERExxxxx
cd directory/to/chat/with
dir-assistant -s "Describe the files in this directory"
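The single-prompt mode composes well with ordinary shell scripting. A minimal sketch (the directory names and the helper function here are hypothetical, not part of dir-assistant):

```shell
#!/bin/sh
# Sketch: build the non-interactive dir-assistant command for one directory.
# The -s flag runs a single prompt and exits, as shown above.
make_cmd() {
  printf 'cd %s && dir-assistant -s "%s"' "$1" "$2"
}

# Example: generate one run per project directory (names are placeholders).
for d in services/api services/worker; do
  echo "$(make_cmd "$d" "Describe the files in this directory")"
done
```

Each generated line can be executed directly or fed into a job runner, letting a cron job or CI step analyze several directories in sequence without user interaction.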
Other Models
dir-assistant supports almost every local and API model. Almost all local GGUF models (except the bleeding edge) are supported via the embedded llama-cpp-python integration. Almost all API models are supported via LiteLLM integration, including generic OpenAI-compatible APIs such as local servers. To learn how to use the model of your choice, view the configuration docs.