A CLI tool to chat with LLMs, including GPT and Claude.

Project description

fire-chat

Overview

This project provides a command-line interface (CLI) for interacting with various large language models (LLMs) using the LiteLLM wrapper. It supports multiple providers, including OpenAI, Anthropic, Azure, and Gemini. The CLI allows users to chat with these models, manage budgets, and handle API keys efficiently.

Configuration

The configuration is managed through a $HOME/.config/fire-chat/config.yaml file. Before running the CLI for the first time, copy the starter config.yaml to that location, add your API key, and then start the application with fire-chat.
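For example, assuming you are working from a checkout of the repository that contains the starter config.yaml (the paths and editor below are illustrative), the one-time setup could look like this:

    mkdir -p "$HOME/.config/fire-chat"                        # create the config directory
    cp config.yaml "$HOME/.config/fire-chat/config.yaml"      # copy the starter config into place
    "${EDITOR:-nano}" "$HOME/.config/fire-chat/config.yaml"   # add your API key and preferred settings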

Installation and Usage

  1. Install the CLI:

    pip install --user fire-chat  # requires Python 3.10+
    
  2. Configure the CLI:

    Edit the $HOME/.config/fire-chat/config.yaml file to set your preferred provider, model, and other settings.

  3. Run the CLI:

    fire-chat
    

    or run with arguments (overriding the config YAML file):

    fire-chat --model=gpt-4o
    

    For the full list of configuration options, see main.py.

  4. Exit: To exit the CLI, press Ctrl+C.

Download files

Download the file for your platform.

Source Distribution

fire_chat-0.0.6.dev14228.tar.gz (86.1 kB)

Built Distribution

fire_chat-0.0.6.dev14228-py3-none-any.whl (17.7 kB)

File details

Details for the file fire_chat-0.0.6.dev14228.tar.gz.

File hashes

Hashes for fire_chat-0.0.6.dev14228.tar.gz
Algorithm    Hash digest
SHA256       81e4df92b6741720fa6979f96a196d10b0c4cb41b339a3d43aa817965819d543
MD5          4ddefd631fa5686b8c338e22b305de92
BLAKE2b-256  2d512558c4f6a4ec473e5bcdc9a13359282358df35148ed8d6f5ae7293eb3514

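If you download the source distribution manually, you can check it against the SHA256 digest above before installing; a minimal sketch on Linux (on macOS, shasum -a 256 produces the same output):

    sha256sum fire_chat-0.0.6.dev14228.tar.gz
    # expected: 81e4df92b6741720fa6979f96a196d10b0c4cb41b339a3d43aa817965819d543  fire_chat-0.0.6.dev14228.tar.gz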

File details

Details for the file fire_chat-0.0.6.dev14228-py3-none-any.whl.

File hashes

Hashes for fire_chat-0.0.6.dev14228-py3-none-any.whl
Algorithm    Hash digest
SHA256       912e87497f2f5630ac3dcc8bfb9bcbf6b83cdc5328f2c87f9f24e41434495d77
MD5          ac173fac9ddbdf74715eba6429220b02
BLAKE2b-256  bf8801670dea0979453ef9ee0660737d937aa7984680e495414660bfcf13d515

