Simple LLM-based tools to access from the CLI.
LLM-cli
A lightweight Command Line Interface (CLI) for interacting with Large Language Models (LLMs) using LiteLLM.
See IMPLEMENTED_FEATURES.md for a guide to the features currently implemented in this repository.
💡 Why This Project?
Sometimes network constraints or data limitations make it difficult to access large language models via web interfaces. This CLI provides a lightweight, flexible solution for LLM interactions directly from the terminal.
🚀 Features
- Simple CLI Interface: Easily chat with different LLMs from your terminal
- Flexible Input: Pipe input from other commands or redirect text from files
- Multiple Chat Modes:
  - Direct single-message chat
  - Interactive chat UI with markdown rendering
  - Image support for vision-capable models
- Flexible Configuration: Customize the model, temperature, and system prompt
- Easy Configuration Management: Update settings with a simple command
- Sessions: Chat sessions are logged and can be resumed later
🔧 Prerequisites
- API keys for the LLM providers you intend to use, set as environment variables
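LiteLLM reads provider credentials from environment variables. A minimal setup sketch (the OpenAI and Anthropic variable names follow LiteLLM's conventions; the key values are placeholders):

```shell
# Export keys for the providers you plan to use (values are placeholders).
# LiteLLM picks these up automatically when a request is made.
export OPENAI_API_KEY="sk-your-openai-key"
export ANTHROPIC_API_KEY="sk-ant-your-anthropic-key"
```

Add these lines to your shell profile (e.g. `~/.bashrc`) to make them persistent across sessions.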
💾 Installation
- Via pip:

```shell
pip install llm-to-cli
```

- From the repository:

```shell
# Clone the repository
git clone https://github.com/tikendraw/llm-cli.git
cd llm-cli

# Install
pip install .
```
🖥️ Usage
Basic Chat

- Send a single message to an LLM:

```shell
llm-cli chat "Hello, how are you?"
```

- Pipe input:

```shell
echo "what is the 34th prime number" | llm-cli chat
```

- File redirection:

```shell
llm-cli chat < some_file_with_question.txt
```

- Include the last terminal command/output blocks from the current tmux pane:

```shell
llm-cli chat --pane-history 1 "Why did this command fail?"
```

- Target a different tmux pane explicitly:

```shell
llm-cli chat --pane-history 3 --pane-target %12 "Summarize what just happened"
```
Interactive Chat UI

- Start an interactive chat session:

```shell
llm-cli chatui
```

- Start with recent tmux pane history as context:

```shell
llm-cli chatui --pane-history 2
```

- During chat, add pane history on demand:

```shell
/pane 3
/pane 2 %12
```
Image Support

- Add an image (local path or URL):

```shell
llm-cli chat --image path/to/image/or/url
```
Configuration

- View current configuration:

```shell
llm-cli config
```

- Update configuration:

```shell
llm-cli config model "anthropic/claude-3-haiku"
llm-cli config temperature 0.7
```
🛠️ Commands
- chat: Send a single message
- chatui: Interactive chat
- config: Manage CLI configuration
- history: See and manage history
🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
File details
Details for the file llm_to_cli-0.2.1.tar.gz.
File metadata
- Download URL: llm_to_cli-0.2.1.tar.gz
- Upload date:
- Size: 26.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.3.4 CPython/3.11.15 Linux/6.17.0-1010-azure
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 7bed67b7766a6fa377de40e12fef24078854c2be25c7a6e39b0662bd9641ace7 |
| MD5 | a7b4f614108e34585056c20f4e696754 |
| BLAKE2b-256 | d767e8731a08fd5c05db2e5b96cdcd1c601f2da31d74424dd2f08772e113fdb9 |
File details
Details for the file llm_to_cli-0.2.1-py3-none-any.whl.
File metadata
- Download URL: llm_to_cli-0.2.1-py3-none-any.whl
- Upload date:
- Size: 24.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.3.4 CPython/3.11.15 Linux/6.17.0-1010-azure
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | f448bba97413e675de959d62931bd50bce1b23a5a03516c22a80f83f782aeb4f |
| MD5 | 765c8401ba69f61a9d0689471d71bc41 |
| BLAKE2b-256 | f9978360aab7bb92364bb24d01e4e7c93c96555eb2962d88a94c06798bd2200c |