OmniChat CLI 🤖

A powerful command-line interface for interacting with AI models and generating images. Features multiple AI providers, streaming responses, and a colorful interface.

✨ Features

AI Providers

  • 🧠 OpenAI (gpt-3.5-turbo)
  • 🚀 Groq (llama3-8b-8192)
  • 🎨 Image Generation

Core Features

  • 📡 Real-time streaming responses
  • 🌈 Colorful, emoji-enhanced interface
  • 💾 Chat history saving with custom filenames
  • 📁 Custom save locations
  • ⚙️ Configurable parameters
    • Temperature control
    • Token limits
    • Model selection
    • Response streaming

Interface

  • 👤 User messages in yellow
  • 🤖 Assistant responses in blue
  • 🎨 Image generation with custom prompts
  • 🚦 Color-coded status messages
  • 📝 Easy-to-read format with emoji indicators

🛠️ Prerequisites

  • Python 3.10+
  • Required API Keys:
    • OpenAI API key
    • Groq API key
    • Image Generation API URL

🌟 Image Demo

(Demo image available in the project repository.)

📦 Installation

  1. Clone the repository:
     git clone https://github.com/Amul-Thantharate/OmniChat-Cli.git
     cd OmniChat-Cli
  2. Install the package:
     pip install -e .
  3. Configure API Keys:

Method A: Environment Variables

# Windows (Command Prompt)
set OPENAI_API_KEY=your_openai_api_key
set GROQ_API_KEY=your_groq_api_key
set APP_URL=your_image_generation_api_url

# Windows (PowerShell)
$env:OPENAI_API_KEY="your_openai_api_key"
$env:GROQ_API_KEY="your_groq_api_key"
$env:APP_URL="your_image_generation_api_url"

# Linux/Mac
export OPENAI_API_KEY="your_openai_api_key"
export GROQ_API_KEY="your_groq_api_key"
export APP_URL="your_image_generation_api_url"

Method B: .env File

Create a .env file in the project root:

OPENAI_API_KEY=your_openai_api_key
GROQ_API_KEY=your_groq_api_key
APP_URL=your_image_generation_api_url
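
Whichever method you use, the CLI reads these values from the process environment at startup. A minimal sketch of that lookup (the helper name and return shape here are illustrative, not the package's actual API):

```python
import os

def load_api_config():
    """Read the keys OmniChat expects from the environment.

    Hypothetical helper for illustration; the real package may
    load and validate these values differently.
    """
    config = {
        "openai_api_key": os.environ.get("OPENAI_API_KEY"),
        "groq_api_key": os.environ.get("GROQ_API_KEY"),
        "app_url": os.environ.get("APP_URL"),
    }
    # Collect any keys that are unset or empty so the caller can
    # report a clear error before making a request.
    missing = [name for name, value in config.items() if not value]
    return config, missing
```

A `.env` file works the same way once its entries are exported into the environment (for example via the python-dotenv package, if the project uses it).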

🚀 Usage

Quick Start

  1. Chat with OpenAI (Default):
     omenicli
  2. Chat with Groq:
     omenicli --model-type groq
  3. Generate Images:
     omenicli --model-type image

Advanced Usage

  1. Stream Responses:
     omenicli --stream
  2. Custom Model Settings:
     omenicli --temperature 0.7 --max-tokens 2048
  3. Save Chat History:
     omenicli --save
  4. Custom Image Directory:
     omenicli --model-type image --image-dir my_images
  5. Combined Features:
     omenicli --model-type image --save --image-dir my_images --stream
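
Streaming prints each chunk of the reply as it arrives instead of waiting for the full response. A rough sketch of the idea (the chunk source here is a stand-in for the provider's streamed API response, not the package's actual code):

```python
def stream_chunks(chunks):
    # Print each chunk immediately, without a newline or buffering,
    # so the user sees the reply build up token by token.
    text = []
    for chunk in chunks:
        print(chunk, end="", flush=True)
        text.append(chunk)
    # Return the assembled reply so it can be saved to chat history.
    return "".join(text)
```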

📚 Example

Refer to the Example section.

⚙️ Configuration Options

Option           Short  Default           Description
--model-type     -mt    openai            Model provider (openai/groq/image)
--temperature    -T     0.5               Response randomness (0-1)
--max-tokens     -M     1024              Maximum response length
--stream         -S     False             Enable streaming responses
--save           -s     False             Save chat history
--openai-model   -o     gpt-3.5-turbo     OpenAI model name
--groq-model     -g     llama3-8b-8192    Groq model name
--image-dir      -i     generated_images  Directory to save generated images
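
For illustration, the options above map onto a standard argparse parser roughly as follows. This is a sketch of the documented interface, not the package's actual source:

```python
import argparse

def build_parser():
    # Illustrative parser mirroring the documented options;
    # defaults are taken from the configuration table.
    p = argparse.ArgumentParser(prog="omenicli")
    p.add_argument("--model-type", "-mt", default="openai",
                   choices=["openai", "groq", "image"])
    p.add_argument("--temperature", "-T", type=float, default=0.5)
    p.add_argument("--max-tokens", "-M", type=int, default=1024)
    p.add_argument("--stream", "-S", action="store_true")
    p.add_argument("--save", "-s", action="store_true")
    p.add_argument("--openai-model", "-o", default="gpt-3.5-turbo")
    p.add_argument("--groq-model", "-g", default="llama3-8b-8192")
    p.add_argument("--image-dir", "-i", default="generated_images")
    return p
```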

💾 Data Management

Chat History

  • Default Location: chat_history/
  • Format: JSON files
  • Custom Options:
    • Directory: Enter custom path when prompted
    • Filename: Enter custom name when prompted
    • Default naming: chat_[model-type]_[timestamp].json

Generated Images

  • Default Location: generated_images/
  • Format: PNG files with timestamps
  • Custom Locations:
    • Via CLI: --image-dir path/to/directory
    • Via Prompt: Enter path when asked
  • Naming: image_[timestamp].png
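
Both naming schemes follow the same pattern: a fixed prefix plus a timestamp. A sketch of how such default paths could be built (the exact timestamp format is an assumption, not taken from the package source):

```python
import os
from datetime import datetime

def default_path(prefix, suffix, directory, model_type=None):
    # Builds names like chat_[model-type]_[timestamp].json or
    # image_[timestamp].png; the timestamp format is assumed.
    ts = datetime.now().strftime("%Y%m%d_%H%M%S")
    stem = f"{prefix}_{model_type}_{ts}" if model_type else f"{prefix}_{ts}"
    return os.path.join(directory, stem + suffix)
```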

🔧 Error Handling

The application detects and reports the following error scenarios:

  • Missing/Invalid API keys
  • Network connectivity issues
  • Rate limiting
  • Invalid configurations
  • File system errors
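
Transient failures such as rate limits and network drops are commonly handled by retrying with exponential backoff. A generic sketch of that pattern (not the package's actual error handler):

```python
import time

def with_retries(call, attempts=3, base_delay=1.0):
    # Retry a zero-argument callable with exponential backoff:
    # wait base_delay, then 2x, then 4x, ... between attempts.
    for attempt in range(attempts):
        try:
            return call()
        except Exception:
            if attempt == attempts - 1:
                # Out of attempts: surface the final error.
                raise
            time.sleep(base_delay * (2 ** attempt))
```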

🤝 Contributing

Feel free to:

  • Open issues
  • Submit pull requests
  • Suggest improvements
  • Report bugs

📝 License

MIT License

🔍 Tips & Tricks

  1. Chat Mode:

    • Use "exit" to end the session
    • Try different temperatures for varied responses
    • Enable streaming for real-time responses
    • Use custom filenames for better organization
  2. Image Generation:

    • Be specific in your prompts
    • Use custom directories for organization
    • Combine with chat history saving
  3. Chat History:

    • Use descriptive filenames
    • Organize by project/topic
    • Use custom paths for better file management
