# OmniChat CLI 🤖
A powerful command-line interface for interacting with AI models and generating images. Features multiple AI providers, streaming responses, and a colorful interface.
## ✨ Features

### AI Providers

- 🧠 OpenAI (`gpt-3.5-turbo`)
- 🚀 Groq (`llama3-8b-8192`)
- 🎨 Image Generation
### Core Features
- 📡 Real-time streaming responses
- 🌈 Colorful, emoji-enhanced interface
- 💾 Chat history saving with custom filenames
- 📁 Custom save locations
- 📊 Multiple export formats
  - JSON (structured data)
  - PDF (formatted document)
  - Markdown (human-readable)
- ⚙️ Configurable parameters
  - Temperature control
  - Token limits
  - Model selection
  - Response streaming
  - Export format selection
### Interface
- 👤 User messages in yellow
- 🤖 Assistant responses in blue
- 🎨 Image generation with custom prompts
- 🚦 Color-coded status messages
- 📝 Easy-to-read format with emoji indicators
## 🛠️ Prerequisites
- Python 3.10+
- Required API Keys:
  - OpenAI API key
  - Groq API key
  - Image Generation API URL
## 🌟 Image Demo
## 📦 Installation
- Clone the repository:

  ```bash
  git clone https://github.com/Amul-Thantharate/OmniChat-Cli.git
  cd OmniChat-Cli
  ```

- Install the package:

  ```bash
  pip install -e .
  ```
- Configure API Keys:

**Method A: Environment Variables**
```
# Windows (Command Prompt)
set OPENAI_API_KEY=your_openai_api_key
set GROQ_API_KEY=your_groq_api_key
set APP_URL=your_image_generation_api_url

# Windows (PowerShell)
$env:OPENAI_API_KEY="your_openai_api_key"
$env:GROQ_API_KEY="your_groq_api_key"
$env:APP_URL="your_image_generation_api_url"

# Linux/Mac
export OPENAI_API_KEY="your_openai_api_key"
export GROQ_API_KEY="your_groq_api_key"
export APP_URL="your_image_generation_api_url"
```
**Method B: .env File**

Create a `.env` file in the project root:

```
OPENAI_API_KEY=your_openai_api_key
GROQ_API_KEY=your_groq_api_key
APP_URL=your_image_generation_api_url
```
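However the keys are supplied, the CLI cannot work without them, so it helps to validate them up front. A minimal sketch of that check, assuming only the standard `os` module (the function name `load_api_keys` is hypothetical, not part of the published package):

```python
import os

def load_api_keys():
    """Read the three required settings from the environment, failing fast
    with a clear message if any is missing or empty."""
    required = ("OPENAI_API_KEY", "GROQ_API_KEY", "APP_URL")
    missing = [name for name in required if not os.environ.get(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {name: os.environ[name] for name in required}
```

Failing fast here avoids confusing mid-session provider errors later.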
## 🚀 Usage

### Quick Start
- Chat with OpenAI (Default):

  ```bash
  omenicli
  ```

- Chat with Groq:

  ```bash
  omenicli --model-type groq
  ```

- Generate Images:

  ```bash
  omenicli --model-type image
  ```
### Advanced Usage
- Stream Responses:

  ```bash
  omenicli --stream
  ```

- Custom Model Settings:

  ```bash
  omenicli --temperature 0.7 --max-tokens 2048
  ```

- Save Chat History:

  ```bash
  omenicli --save
  ```

- Custom Image Directory:

  ```bash
  omenicli --model-type image --image-dir my_images
  ```

- Combined Features:

  ```bash
  omenicli --model-type image --save --image-dir my_images --stream
  ```
## 📚 Example
Refer to the Example section.
## ⚙️ Configuration Options
| Option | Short | Default | Description |
|---|---|---|---|
| `--model-type` | `-mt` | `openai` | Model provider (openai/groq/image) |
| `--temperature` | `-T` | `0.5` | Response randomness (0-1) |
| `--max-tokens` | `-M` | `1024` | Maximum response length |
| `--stream` | `-S` | `False` | Enable streaming responses |
| `--save` | `-s` | `False` | Save chat history |
| `--openai-model` | `-o` | `gpt-3.5-turbo` | OpenAI model name |
| `--groq-model` | `-g` | `llama3-8b-8192` | Groq model name |
| `--image-dir` | `-i` | `generated_images` | Directory to save generated images |
| `--export-format` | `-e` | `json` | Export format (json/pdf/markdown) |
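For readers curious how an option set like this maps onto code, the table above could be wired up with the standard library's `argparse` roughly as follows. This is only a sketch mirroring the documented flags and defaults; the actual `omenicli` implementation is not shown here, and `build_parser` is a hypothetical name:

```python
import argparse

def build_parser():
    """Build a parser mirroring the documented omenicli options (sketch)."""
    p = argparse.ArgumentParser(prog="omenicli")
    p.add_argument("--model-type", "-mt", default="openai",
                   choices=["openai", "groq", "image"])
    p.add_argument("--temperature", "-T", type=float, default=0.5)
    p.add_argument("--max-tokens", "-M", type=int, default=1024)
    p.add_argument("--stream", "-S", action="store_true")
    p.add_argument("--save", "-s", action="store_true")
    p.add_argument("--openai-model", "-o", default="gpt-3.5-turbo")
    p.add_argument("--groq-model", "-g", default="llama3-8b-8192")
    p.add_argument("--image-dir", "-i", default="generated_images")
    p.add_argument("--export-format", "-e", default="json",
                   choices=["json", "pdf", "markdown"])
    return p
```

For example, `build_parser().parse_args(["--model-type", "groq", "-T", "0.7"])` yields a namespace with `model_type="groq"`, `temperature=0.7`, and the documented defaults for everything else.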
## 💾 Data Management

### Chat History
- Default Location: `chat_history/`
- Format: JSON files
- Custom Options:
  - Directory: Enter custom path when prompted
  - Filename: Enter custom name when prompted
- Default naming: `chat_[model-type]_[timestamp].json`
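The default naming scheme above can be sketched in a few lines of Python. This is an illustration of the documented convention, not the package's actual code; the helper name `save_chat_history` is hypothetical:

```python
import json
import time
from pathlib import Path

def save_chat_history(messages, model_type, directory="chat_history", filename=None):
    """Write a chat transcript as JSON, defaulting to the documented
    chat_[model-type]_[timestamp].json naming scheme."""
    out_dir = Path(directory)
    out_dir.mkdir(parents=True, exist_ok=True)
    if filename is None:
        filename = f"chat_{model_type}_{time.strftime('%Y%m%d_%H%M%S')}.json"
    path = out_dir / filename
    path.write_text(json.dumps(messages, indent=2))
    return path
```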
### Generated Images
- Default Location: `generated_images/`
- Format: PNG files with timestamps
- Custom Locations:
  - Via CLI: `--image-dir path/to/directory`
  - Via Prompt: Enter path when asked
- Naming: `image_[timestamp].png`
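The image side follows the same pattern. A minimal sketch of writing PNG bytes under the documented `image_[timestamp].png` convention (again, `save_image` is a hypothetical helper name, not confirmed project code):

```python
import time
from pathlib import Path

def save_image(image_bytes, image_dir="generated_images"):
    """Write raw PNG bytes to image_[timestamp].png inside image_dir,
    creating the directory if needed."""
    out_dir = Path(image_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    path = out_dir / f"image_{time.strftime('%Y%m%d_%H%M%S')}.png"
    path.write_bytes(image_bytes)
    return path
```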
## 🔧 Error Handling
The application handles various scenarios:
- Missing/Invalid API keys
- Network connectivity issues
- Rate limiting
- Invalid configurations
- File system errors
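Transient failures such as network hiccups and rate limits are commonly handled with retries and backoff. The sketch below illustrates that general pattern; the CLI's actual error-handling strategy is not documented here, and `with_retries` is a hypothetical helper:

```python
import time

def with_retries(call, attempts=3, base_delay=1.0,
                 retryable=(ConnectionError, TimeoutError)):
    """Invoke call(), retrying transient errors with exponential backoff.
    Non-retryable errors (e.g. an invalid API key) propagate immediately."""
    for attempt in range(attempts):
        try:
            return call()
        except retryable:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the error to the user
            time.sleep(base_delay * 2 ** attempt)
```

Keeping configuration errors (missing keys, bad flags) out of the retry loop matters: retrying them just delays the inevitable failure message.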
## 🤝 Contributing
Feel free to:
- Open issues
- Submit pull requests
- Suggest improvements
- Report bugs
## 📝 License
## 🔍 Tips & Tricks
- Chat Mode:
  - Use `exit` to end the session
  - Try different temperatures for varied responses
  - Enable streaming for real-time responses
  - Use custom filenames for better organization
- Image Generation:
  - Be specific in your prompts
  - Use custom directories for organization
  - Combine with chat history saving
- Chat History:
  - Use descriptive filenames
  - Organize by project/topic
  - Use custom paths for better file management
## Download files
### File details: `omenicli-0.2.tar.gz`

**File metadata**
- Download URL: omenicli-0.2.tar.gz
- Upload date:
- Size: 9.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.7
**File hashes**

| Algorithm | Hash digest |
|---|---|
| SHA256 | `011121ab6a3366da7464556c3885738559fc3373646fd3b0de0d2815f4e0933b` |
| MD5 | `0c75be262feb3def0322a16e26b5ec7d` |
| BLAKE2b-256 | `79d1e36c522a1191480354f78187a0b8e9f73524ffc58b592233aaed096be6f0` |
### File details: `omenicli-0.2-py3-none-any.whl`

**File metadata**
- Download URL: omenicli-0.2-py3-none-any.whl
- Upload date:
- Size: 8.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.7
**File hashes**

| Algorithm | Hash digest |
|---|---|
| SHA256 | `85b734f2c658227ee626c0c0ddefbb6c9a8991907ecb36822de910e8d87b7f92` |
| MD5 | `83dd3546095899963a77d3ce16e97a37` |
| BLAKE2b-256 | `666471f4cc68763bb007868fda18bca78a60e5569dfc1d12a06db95245b0a5f5` |