
FileChat

A local read-only AI coding assistant

FileChat is an AI assistant designed to help users understand and improve their local projects. It allows you to chat about files in your local folder while maintaining full control over your code.

Here is a short video:

https://github.com/user-attachments/assets/dd3c6617-b141-47ab-926e-c62abcc7b4a6

Features

  • Project Indexing: Creates a searchable index of your project files
  • Contextual Chat: Ask questions about your project with AI that understands your codebase
  • Real-time Updates: Automatically detects and indexes file changes
  • Chat History: ChatGPT-like chat history for each directory
  • Configurable: Customize which files to index, and choose your own LLM provider.

Installation

Prerequisites

  • Python 3.12 or higher
  • An API key for the LLM provider you want to use or access to a self-hosted LLM server with an OpenAI-compatible API
  • On Windows, you need the Visual C++ Redistributable. It is most likely already installed on your machine.

Option 1: Install from PyPI

You can use any package management tool you like. Here is an example for pip:

pip install filechat

And here is an example of installing FileChat as a uv tool:

uv tool install filechat

On Linux, you should also specify a hardware accelerator as an optional dependency. This accelerator is used to run the local embedding model. We support xpu (Intel Arc) and cuda (NVIDIA). If you don't specify a hardware accelerator, the embedding model runs on the CPU. Here is an example of installing FileChat with xpu support (the quotes prevent some shells, such as zsh, from interpreting the brackets):

pip:

pip install "filechat[xpu]"

uv tool:

uv tool install "filechat[xpu]"
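If you are unsure whether the extra you installed is actually being picked up, a quick sanity check is to ask the underlying ML framework which device it sees. The sketch below assumes the embedding model runs on PyTorch (an assumption, not something the docs above state) and degrades gracefully if torch is not installed:

```python
import importlib.util


def detect_accelerator() -> str:
    """Report which accelerator a local embedding model could use."""
    if importlib.util.find_spec("torch") is None:
        return "torch not installed"
    import torch

    if torch.cuda.is_available():
        return "cuda"
    # torch.xpu exists only in PyTorch builds with Intel GPU support,
    # so guard the attribute access before calling is_available().
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        return "xpu"
    return "cpu"


print(detect_accelerator())
```

If this prints `cpu` even though you installed the `cuda` or `xpu` extra, the accelerated build of the framework likely did not install correctly.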

Option 2: Clone the repository and use UV

  1. Clone the repository:
git clone https://github.com/msvana/filechat
cd filechat
  2. Install dependencies using uv:
uv sync
  3. (Optional) Install GPU support:
# CUDA (NVIDIA)
uv sync --extra cuda

# XPU (Intel Arc)
uv sync --extra xpu

Usage

filechat /path/to/your/project

Configuration

On the first run, FileChat guides you through an initial setup where you choose your LLM provider, select a model, and set an API key. These settings are then stored at ~/.config/filechat.json. Feel free to edit the file as needed.

You can invoke the initial setup again at any time by running FileChat with the --setup or -s flag. To make FileChat use a different config file path, pass the --config or -c argument.

Here is an example of a valid config file:

{
    "max_file_size_kb": 25,
    "ignored_dirs": [".git", "__pycache__", ".venv", ".pytest_cache", "node_modules", "dist"],
    "allowed_suffixes": [".md", ".txt", ".json", ".toml", ".html", ".css", ...],
    "index_store_path": "/home/milos/.cache/filechat",
    "model": {
        "provider": "openai",
        "model": "gpt-5-mini",
        "api_key": "[VALID_OPENAI_API_KEY]",
        "base_url": null
    }
}
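Since the config is plain JSON, you can also adjust it programmatically. The sketch below uses a hypothetical helper, `add_ignored_dir` (not part of FileChat), to append a directory to `ignored_dirs` while preserving the rest of the file; the default path matches the one described above:

```python
import json
from pathlib import Path

# Default location per the docs above; adjust if you pass --config.
config_path = Path.home() / ".config" / "filechat.json"


def add_ignored_dir(path: Path, dirname: str) -> dict:
    """Append a directory to ignored_dirs, keeping all other settings."""
    config = json.loads(path.read_text())
    ignored = config.setdefault("ignored_dirs", [])
    if dirname not in ignored:
        ignored.append(dirname)
    path.write_text(json.dumps(config, indent=4))
    return config
```

A call like `add_ignored_dir(config_path, "build")` would then take effect the next time FileChat re-indexes the project.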

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

filechat-0.3.0.tar.gz (56.8 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

filechat-0.3.0-py3-none-any.whl (14.6 kB)

Uploaded Python 3

File details

Details for the file filechat-0.3.0.tar.gz.

File metadata

  • Download URL: filechat-0.3.0.tar.gz
  • Upload date:
  • Size: 56.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.8.11

File hashes

Hashes for filechat-0.3.0.tar.gz:

  • SHA256: e8d1fe148878a23b98c0b5b548cb1842f828d957d29139aea0951f4e4c721fe3
  • MD5: 0a49efaceb0b93f77df0b0ef64fc3e0b
  • BLAKE2b-256: 0583491b12f6e7c62076c087c86e594b5456474e95a055a404405ca324f147ef

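To check a downloaded file against the published digests, compute its SHA256 locally and compare. A minimal sketch using only the standard library (`sha256_of` is an illustrative helper, not part of FileChat):

```python
import hashlib
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Compute the hex SHA256 digest of a file, reading in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Compare against the digest published above, e.g.:
# expected = "e8d1fe148878a23b98c0b5b548cb1842f828d957d29139aea0951f4e4c721fe3"
# assert sha256_of(Path("filechat-0.3.0.tar.gz")) == expected
```

On most Linux systems, `sha256sum filechat-0.3.0.tar.gz` gives the same result from the command line.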

File details

Details for the file filechat-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: filechat-0.3.0-py3-none-any.whl
  • Upload date:
  • Size: 14.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.8.11

File hashes

Hashes for filechat-0.3.0-py3-none-any.whl:

  • SHA256: f8353f72cdf30aad96e5526a1ebb514f80df6a8f2f56c7826f2f29d2b7dbcd09
  • MD5: e3a782aa892c6b16a3a9c27fb4122685
  • BLAKE2b-256: 5e13ea79bb14d8af6d59ab826594f59ecfc76d07509e33ecfb676ca9a5f5e58f

