
OmniChat CLI 🤖

A powerful command-line interface for interacting with AI models and generating images. Features multiple AI providers, streaming responses, and a colorful interface.

✨ Features

AI Providers

  • 🧠 OpenAI (gpt-3.5-turbo)
  • 🚀 Groq (llama3-8b-8192)
  • 🎨 Image Generation

Core Features

  • 📡 Real-time streaming responses
  • 🌈 Colorful, emoji-enhanced interface
  • 💾 Chat history saving with custom filenames
  • 📁 Custom save locations
  • 📊 Multiple export formats
    • JSON (structured data)
    • PDF (formatted document)
    • Markdown (human-readable)
  • ⚙️ Configurable parameters
    • Temperature control
    • Token limits
    • Model selection
    • Response streaming
    • Export format selection

Interface

  • 👤 User messages in yellow
  • 🤖 Assistant responses in blue
  • 🎨 Image generation with custom prompts
  • 🚦 Color-coded status messages
  • 📝 Easy-to-read format with emoji indicators

🛠️ Prerequisites

  • Python 3.10+
  • Required API Keys:
    • OpenAI API key
    • Groq API key
    • Image Generation API URL

🌟 Image Demo

[Image demo screenshot]

📦 Installation

  1. Clone the repository:
git clone https://github.com/Amul-Thantharate/OmniChat-Cli.git
cd OmniChat-Cli
  2. Install the package:
pip install -e .
  3. Configure API keys:

Method A: Environment Variables

# Windows (Command Prompt)
set OPENAI_API_KEY=your_openai_api_key
set GROQ_API_KEY=your_groq_api_key
set APP_URL=your_image_generation_api_url

# Windows (PowerShell)
$env:OPENAI_API_KEY="your_openai_api_key"
$env:GROQ_API_KEY="your_groq_api_key"
$env:APP_URL="your_image_generation_api_url"

# Linux/Mac
export OPENAI_API_KEY="your_openai_api_key"
export GROQ_API_KEY="your_groq_api_key"
export APP_URL="your_image_generation_api_url"

Method B: .env File

Create a .env file in the project root:

OPENAI_API_KEY=your_openai_api_key
GROQ_API_KEY=your_groq_api_key
APP_URL=your_image_generation_api_url
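Either way, the CLI needs all three values at startup. As an illustration of how a .env file like the one above can be read and validated, here is a minimal sketch using only the standard library (the actual CLI may use a helper such as python-dotenv; this parser and the `missing_keys` helper are illustrative, not the package's real code):

```python
import os

REQUIRED = ("OPENAI_API_KEY", "GROQ_API_KEY", "APP_URL")

def load_env(path=".env"):
    """Parse simple KEY=value lines and set them as environment variables.

    Existing environment variables take precedence over the file.
    """
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blanks, comments, and malformed lines
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

def missing_keys():
    """Return the names of required keys that are still unset."""
    return [k for k in REQUIRED if not os.environ.get(k)]
```

A startup check like `if missing_keys(): sys.exit(...)` lets the tool fail fast with a clear message instead of erroring mid-request.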

🚀 Usage

Quick Start

  1. Chat with OpenAI (Default):
omenicli
  2. Chat with Groq:
omenicli --model-type groq
  3. Generate Images:
omenicli --model-type image

Advanced Usage

  1. Stream Responses:
omenicli --stream
  2. Custom Model Settings:
omenicli --temperature 0.7 --max-tokens 2048
  3. Save Chat History:
omenicli --save
  4. Custom Image Directory:
omenicli --model-type image --image-dir my_images
  5. Combined Features:
omenicli --model-type image --save --image-dir my_images --stream
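Streaming mode (--stream) prints each response chunk as it arrives rather than waiting for the full reply. A minimal sketch of that pattern (illustrative only, not the CLI's actual implementation):

```python
def stream_print(chunks):
    """Print response chunks as they arrive and return the full reply.

    `chunks` is any iterable of text fragments, e.g. pieces of a
    streamed API response.
    """
    parts = []
    for chunk in chunks:
        print(chunk, end="", flush=True)  # show output in real time
        parts.append(chunk)
    print()  # final newline after the reply
    return "".join(parts)
```

Returning the joined text lets the caller still save the complete reply to chat history after streaming it to the terminal.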


⚙️ Configuration Options

Option           Short  Default           Description
---------------  -----  ----------------  -----------------------------------
--model-type     -mt    openai            Model provider (openai/groq/image)
--temperature    -T     0.5               Response randomness (0-1)
--max-tokens     -M     1024              Maximum response length
--stream         -S     False             Enable streaming responses
--save           -s     False             Save chat history
--openai-model   -o     gpt-3.5-turbo     OpenAI model name
--groq-model     -g     llama3-8b-8192    Groq model name
--image-dir      -i     generated_images  Directory to save generated images
--export-format  -e     json              Export format (json/pdf/markdown)
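For reference, the options in the table above could be wired up with argparse roughly like this (a sketch of the interface, not the package's actual source):

```python
import argparse

def build_parser():
    """Build a parser matching the options table above."""
    p = argparse.ArgumentParser(prog="omenicli")
    p.add_argument("--model-type", "-mt", default="openai",
                   choices=["openai", "groq", "image"],
                   help="Model provider")
    p.add_argument("--temperature", "-T", type=float, default=0.5,
                   help="Response randomness (0-1)")
    p.add_argument("--max-tokens", "-M", type=int, default=1024,
                   help="Maximum response length")
    p.add_argument("--stream", "-S", action="store_true",
                   help="Enable streaming responses")
    p.add_argument("--save", "-s", action="store_true",
                   help="Save chat history")
    p.add_argument("--openai-model", "-o", default="gpt-3.5-turbo",
                   help="OpenAI model name")
    p.add_argument("--groq-model", "-g", default="llama3-8b-8192",
                   help="Groq model name")
    p.add_argument("--image-dir", "-i", default="generated_images",
                   help="Directory to save generated images")
    p.add_argument("--export-format", "-e", default="json",
                   choices=["json", "pdf", "markdown"],
                   help="Export format")
    return p
```

Running `omenicli --model-type groq -T 0.7 --stream` would then yield `model_type="groq"`, `temperature=0.7`, `stream=True`, with all other options at their defaults.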

💾 Data Management

Chat History

  • Default Location: chat_history/
  • Format: JSON files
  • Custom Options:
    • Directory: Enter custom path when prompted
    • Filename: Enter custom name when prompted
    • Default naming: chat_[model-type]_[timestamp].json
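The default naming scheme above can be sketched as follows (the exact timestamp format is an assumption; `save_history` is an illustrative helper, not the CLI's real function):

```python
from datetime import datetime
import json
import os

def default_history_path(model_type, directory="chat_history"):
    """Build a chat_[model-type]_[timestamp].json path."""
    timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")  # assumed format
    return os.path.join(directory, f"chat_{model_type}_{timestamp}.json")

def save_history(messages, path):
    """Write the message list as structured JSON."""
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "w") as fh:
        json.dump(messages, fh, indent=2)
```

Timestamped filenames keep repeated sessions from overwriting each other while remaining sortable by date.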

Generated Images

  • Default Location: generated_images/
  • Format: PNG files with timestamps
  • Custom Locations:
    • Via CLI: --image-dir path/to/directory
    • Via Prompt: Enter path when asked
  • Naming: image_[timestamp].png
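A minimal sketch of saving a generated image under that naming scheme (again, the timestamp format and the `save_image` helper are illustrative assumptions):

```python
from datetime import datetime
import os

def save_image(png_bytes, image_dir="generated_images"):
    """Write raw PNG bytes to image_[timestamp].png inside image_dir.

    Creates the directory if it does not exist and returns the path
    written, so the CLI can report where the image landed.
    """
    os.makedirs(image_dir, exist_ok=True)
    timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")  # assumed format
    path = os.path.join(image_dir, f"image_{timestamp}.png")
    with open(path, "wb") as fh:
        fh.write(png_bytes)
    return path
```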

🔧 Error Handling

The application handles various scenarios:

  • Missing/Invalid API keys
  • Network connectivity issues
  • Rate limiting
  • Invalid configurations
  • File system errors
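As an illustration of how network failures and rate limiting can be turned into clear messages, here is a hedged sketch using only the standard library (the `ConfigError` class, `safe_request` helper, and retry policy are assumptions, not the package's actual error-handling code):

```python
import urllib.error

class ConfigError(Exception):
    """Raised when a request cannot be completed with the current setup."""

def safe_request(send, retries=3):
    """Call `send()` and translate common failures into friendly errors.

    `send` is any zero-argument callable that performs the API request.
    HTTP 429 (rate limiting) is retried; other failures are wrapped in
    ConfigError so the CLI can print one readable message and exit.
    """
    for attempt in range(1, retries + 1):
        try:
            return send()
        except urllib.error.HTTPError as exc:
            if exc.code == 429 and attempt < retries:
                continue  # rate limited: retry the request
            raise ConfigError(f"API error {exc.code}") from exc
        except urllib.error.URLError as exc:
            raise ConfigError(f"Network error: {exc.reason}") from exc
```

Centralizing this logic means every provider call fails the same way, which keeps the color-coded status messages consistent.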

🤝 Contributing

Feel free to:

  • Open issues
  • Submit pull requests
  • Suggest improvements
  • Report bugs

📝 License

MIT License

🔍 Tips & Tricks

  1. Chat Mode:

    • Use "exit" to end the session
    • Try different temperatures for varied responses
    • Enable streaming for real-time responses
    • Use custom filenames for better organization
  2. Image Generation:

    • Be specific in your prompts
    • Use custom directories for organization
    • Combine with chat history saving
  3. Chat History:

    • Use descriptive filenames
    • Organize by project/topic
    • Use custom paths for better file management

Project details

Download files

Source distribution: omenicli-0.2.2.tar.gz (9.4 kB)
  SHA256: 9afc7bc0887d9997389a959a456ada9d476f0abf371fa9171901b5d18dfef5ac

Built distribution: omenicli-0.2.2-py3-none-any.whl (8.2 kB)
  SHA256: 09db7b066b7f2b4024a903071241e04e1940b3efcb6a441ef84fc8c82ed7814b

Both files were uploaded via twine/5.1.1 on CPython 3.12.7.
