FileChat
A local read-only AI coding assistant
FileChat is an AI assistant designed to help users understand and improve their local projects. It allows you to chat about files in your local folder while maintaining full control over your code.
FileChat is still quite new and under active development. Expect bugs! If you find one, or if you have a feature suggestion, please create an issue.
Features
- Project Indexing: Creates a searchable index of your project files
- Contextual Chat: Ask questions about your project with AI that understands your codebase
- Real-time Updates: Automatically detects and indexes file changes
- Chat History: ChatGPT-like chat history for each directory
- Configurable: Customize which files to index, and choose your own LLM provider. We currently support models from:
- Mistral AI
- OpenAI
- Self-hosted servers with an OpenAI-compatible API, such as Ollama or llama.cpp. We recommend a context window of at least 16,384 tokens.
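For a self-hosted setup, the model section of the config file (shown in full under Configuration below) might look like the following sketch. The base_url value assumes Ollama's default OpenAI-compatible endpoint; the model name and the dummy api_key are illustrative, not prescribed by FileChat:

```json
{
    "model": {
        "provider": "openai",
        "model": "qwen2.5-coder:7b",
        "api_key": "ollama",
        "base_url": "http://localhost:11434/v1"
    }
}
```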
Installation
Prerequisites
- Python 3.12 or higher
- An API key for the LLM provider you want to use or access to a self-hosted LLM server with an OpenAI-compatible API
- On Windows, you need the Visual C++ Redistributable. It is very likely already installed on your machine.
Option 1: Install from PyPI
You can use any package management tool you like. Here is an example using pip:
pip install filechat
And here is an example of installing FileChat as a uv tool:
uv tool install filechat
On Linux, you should also specify a hardware accelerator as an optional dependency.
The accelerator is used to run the local embedding model.
We support xpu (Intel Arc) and cuda.
If you don't specify a hardware accelerator, the embedding model runs on the CPU.
Here is an example of installing FileChat with xpu support (the quotes keep shells such as zsh from interpreting the brackets):
pip:
pip install 'filechat[xpu]'
uv tool:
uv tool install 'filechat[xpu]'
Option 2: Clone the repository and use uv
- Clone the repository:
git clone https://github.com/msvana/filechat
cd filechat
- Install dependencies using uv:
uv sync
- (Optional) Install GPU support:
# CUDA (NVIDIA)
uv sync --extra cuda
# XPU (Intel Arc)
uv sync --extra xpu
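After installing one of the extras, you can check which accelerator is actually visible. This is a minimal sketch that assumes the embedding model is backed by PyTorch (the source does not state this); it degrades gracefully when torch is not installed:

```python
def detect_accelerator() -> str:
    """Report which accelerator PyTorch can see, falling back to CPU."""
    try:
        import torch  # assumed backend for the local embedding model
    except ImportError:
        return "cpu"  # no torch: embeddings would run on the CPU anyway
    if torch.cuda.is_available():
        return "cuda"  # NVIDIA GPU available
    xpu = getattr(torch, "xpu", None)  # Intel Arc backend (recent torch only)
    if xpu is not None and xpu.is_available():
        return "xpu"
    return "cpu"

print(detect_accelerator())
```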
Usage
filechat /path/to/your/project
Configuration
On the first run, FileChat guides you through an initial setup where you will choose your LLM provider, select a model, and set an API key.
These settings are then stored at ~/.config/filechat.json. Feel free to edit the file as needed.
You can invoke the initial setup at any time by running FileChat with the --setup or -s flag.
You can make FileChat use a different config file path by using the --config or -c argument.
Here is an example of a valid config file:
{
    "max_file_size_kb": 25,
    "ignored_dirs": [".git", "__pycache__", ".venv", ".pytest_cache", "node_modules", "dist"],
    "allowed_suffixes": [".md", ".txt", ".json", ".toml", ".html", ".css", ...],
    "index_store_path": "/home/milos/.cache/filechat",
    "model": {
        "provider": "openai",
        "model": "gpt-5-mini",
        "api_key": "[VALID_OPENAI_API_KEY]",
        "base_url": null
    }
}
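To make the filtering settings concrete, here is a hedged sketch of how a file might be tested against max_file_size_kb, ignored_dirs, and allowed_suffixes. The function name, the exact rules, and the ".py" suffix (elided in the example config above) are illustrative assumptions, not FileChat's actual implementation:

```python
from pathlib import Path

# Values mirroring the example config above (".py" added for illustration)
MAX_FILE_SIZE_KB = 25
IGNORED_DIRS = {".git", "__pycache__", ".venv", ".pytest_cache", "node_modules", "dist"}
ALLOWED_SUFFIXES = {".md", ".txt", ".json", ".toml", ".html", ".css", ".py"}

def should_index(path: Path, size_kb: float) -> bool:
    """Hypothetical filter: True if a file would be added to the index."""
    if size_kb > MAX_FILE_SIZE_KB:
        return False  # file too large to index
    if any(part in IGNORED_DIRS for part in path.parts):
        return False  # file sits inside an ignored directory
    return path.suffix in ALLOWED_SUFFIXES  # suffix must be allowed

print(should_index(Path("src/app.py"), 4))       # True
print(should_index(Path(".venv/lib/x.py"), 4))   # False: ignored dir
print(should_index(Path("data/big.json"), 100))  # False: too large
```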
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file filechat-0.4.0.tar.gz.
File metadata
- Download URL: filechat-0.4.0.tar.gz
- Upload date:
- Size: 62.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.8.11
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | f7bc836b660500e5e6e5db5e3b5b39d5e937472f46d0e98e6bdd64b48f23a3e6 |
| MD5 | 0baa1a35d0825ec5d4d57fda15f92c44 |
| BLAKE2b-256 | 692042c136f105caced96844f868cb0af580e322f4143990e426779244d23bfa |
File details
Details for the file filechat-0.4.0-py3-none-any.whl.
File metadata
- Download URL: filechat-0.4.0-py3-none-any.whl
- Upload date:
- Size: 17.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.8.11
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 46226549441688b9b94e9660b8ea8a7184a4571e419fe68b81b4a72237291e16 |
| MD5 | a1246777263a5f4b47f59faed89bf6d4 |
| BLAKE2b-256 | 0644bca87870d3b50f885a542730f4f9793d5ec8d5dbe63e2ec74442d5c5bd3f |