# llm-chat-cli

A CLI tool for chatting with multiple LLM providers in one place.
## Install

    pip install llm-chat-cli
## Setup

First time setup:

    lmci setup

This will prompt you for your API keys and create a config file.
## Usage

Start chatting:

    lmci
## Supported Providers
- Groq
- OpenAI
- Anthropic
- Cerebras
## Commands

- `change model` - Switch models
- `token count` - Show tokens used
- `clear history` - Clear chat history
- `quit` / `exit` - Exit
## Download files
Download the file for your platform.
### Source Distribution

llm_chat_cli-0.0.4.tar.gz (6.3 kB)
### Built Distribution

llm_chat_cli-0.0.4-py3-none-any.whl (7.3 kB)
### File details
Details for the file llm_chat_cli-0.0.4.tar.gz.
#### File metadata
- Download URL: llm_chat_cli-0.0.4.tar.gz
- Upload date:
- Size: 6.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.0.1 CPython/3.9.21
#### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `324e308a13362675dd0a9888d1d95c2b682001458ea511a95580b3a6cbc0baf3` |
| MD5 | `681ed1e8257d9e7640fb09f0368e5817` |
| BLAKE2b-256 | `86b87e934648ec090859724150f16d478265a426c87d809fb0072a5dabb41b36` |
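The digests above can be checked locally after downloading a file. A minimal sketch using Python's standard `hashlib` (the filename and SHA256 value are taken from the listing above; `sha256_of` is a helper name for illustration):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 8192) -> str:
    """Return the hex SHA256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Expected digest for llm_chat_cli-0.0.4.tar.gz, from the table above:
EXPECTED = "324e308a13362675dd0a9888d1d95c2b682001458ea511a95580b3a6cbc0baf3"

# After downloading the sdist, compare:
# assert sha256_of("llm_chat_cli-0.0.4.tar.gz") == EXPECTED
```

Reading in fixed-size chunks keeps memory use constant regardless of file size.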
### File details
Details for the file llm_chat_cli-0.0.4-py3-none-any.whl.
#### File metadata
- Download URL: llm_chat_cli-0.0.4-py3-none-any.whl
- Upload date:
- Size: 7.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.0.1 CPython/3.9.21
#### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `8fd4b5ab04cb563e08da6c9a238cb71d578f564f1fba0b076abaebe837bc0a52` |
| MD5 | `6f962e4b7dbc41197e33d9d8fdf23d3c` |
| BLAKE2b-256 | `afe6a4a7d273ffe0f29394622760559030a897f069d41675173b0ecfd7558f20` |
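If you want pip itself to verify these digests at install time, pip's hash-checking mode can pin both published artifacts. A sketch of a requirements file using the SHA256 digests from the tables above:

```
# requirements.txt -- pin both published artifacts by SHA256
llm-chat-cli==0.0.4 \
    --hash=sha256:324e308a13362675dd0a9888d1d95c2b682001458ea511a95580b3a6cbc0baf3 \
    --hash=sha256:8fd4b5ab04cb563e08da6c9a238cb71d578f564f1fba0b076abaebe837bc0a52
```

Install with `pip install --require-hashes -r requirements.txt`; pip will refuse any artifact whose digest does not match one of the listed hashes.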