llm-chat-cli
A CLI tool for chatting with multiple LLM providers in one place.
Install
pip install llm-chat-cli
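To keep the tool and its dependencies isolated from your system Python, you can install it into a virtual environment first. This is a standard Python pattern, not anything specific to this package:

```shell
# Create an isolated environment and install the CLI into it
python3 -m venv .venv
. .venv/bin/activate
pip install llm-chat-cli
```

After activation, the `lmci` entry point is available on your PATH for that shell session.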
Setup
First-time setup:
lmci setup
This will prompt you for your API keys and create a config file.
Usage
Start chatting:
lmci
Supported Providers
- Groq
- OpenAI
- Anthropic
- Cerebras
Commands
- change model - Switch models
- token count - Show tokens used
- clear history - Clear chat history
- quit / exit - Exit
Download files
Download the file for your platform.
Source Distribution
llm_chat_cli-0.0.5.tar.gz (8.7 kB)
Built Distribution
llm_chat_cli-0.0.5-py3-none-any.whl (9.9 kB)
File details
Details for the file llm_chat_cli-0.0.5.tar.gz.
File metadata
- Download URL: llm_chat_cli-0.0.5.tar.gz
- Upload date:
- Size: 8.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.9.21
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | a10569c162aa3f1ba59744e3a431609b60158b91e3b09222776bfe9409e82ae1 |
| MD5 | 514597756913205ad6b6d683b5dcd9b3 |
| BLAKE2b-256 | 1f223df9021cadd814ad1666fb0c741e6020214f0bb7eb7db07a3fca61c2b40d |
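To confirm a downloaded sdist matches the published digest, you can feed the SHA256 value above to `sha256sum` (GNU coreutils; on macOS, `shasum -a 256` takes the same checksum format). Note the two spaces between digest and filename, which the checksum format requires:

```shell
# Compare the local file against the digest published on PyPI;
# prints "llm_chat_cli-0.0.5.tar.gz: OK" and exits 0 on a match
echo "a10569c162aa3f1ba59744e3a431609b60158b91e3b09222776bfe9409e82ae1  llm_chat_cli-0.0.5.tar.gz" \
  | sha256sum --check -
```

A non-zero exit status means the file on disk does not match the published release.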
File details
Details for the file llm_chat_cli-0.0.5-py3-none-any.whl.
File metadata
- Download URL: llm_chat_cli-0.0.5-py3-none-any.whl
- Upload date:
- Size: 9.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.9.21
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 97b4ac9d01673e6f83fc8120714d971a734a09ef9c138815c31f4efa55c265ca |
| MD5 | 4062b0a2be2d82bb13fbfd7ae2a9c023 |
| BLAKE2b-256 | 490e12c09dfbb274d4dd581556c8b238d28152c943670c3948836e554c41d1cc |
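The wheel digest above can also drive pip's hash-checking mode. Be aware that `--require-hashes` then demands a pinned hash for every dependency as well, so a complete requirements file would need entries for those too; this is a minimal sketch of the mechanism:

```shell
# Pin the exact wheel by its SHA256 digest (pip hash-checking mode)
printf '%s\n' \
  'llm-chat-cli==0.0.5 --hash=sha256:97b4ac9d01673e6f83fc8120714d971a734a09ef9c138815c31f4efa55c265ca' \
  > requirements.txt
pip install --require-hashes -r requirements.txt
```

With hash checking on, pip refuses to install any artifact whose digest does not match the requirements file.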