
OmniChat CLI 🤖

A powerful command-line interface for interacting with AI models and generating images. Features multiple AI providers, streaming responses, and a colorful interface.

✨ Features

AI Providers

  • 🧠 OpenAI (gpt-3.5-turbo)
  • 🚀 Groq (llama3-8b-8192)
  • 🎨 Image Generation

Core Features

  • 📡 Real-time streaming responses
  • 🌈 Colorful, emoji-enhanced interface
  • 💾 Chat history saving with custom filenames
  • 📁 Custom save locations
  • 📊 Multiple export formats
    • JSON (structured data)
    • PDF (formatted document)
    • Markdown (human-readable)
  • ⚙️ Configurable parameters
    • Temperature control
    • Token limits
    • Model selection
    • Response streaming
    • Export format selection
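
The multi-format export feature can be sketched as follows. This is an illustrative reimplementation, not the package's actual code: `export_chat` and its message shape are assumptions, JSON and Markdown are shown, and PDF would need an extra library such as reportlab.

```python
import json
from datetime import datetime

def export_chat(messages, fmt="json"):
    """Render a chat transcript in one of the supported export formats.

    `messages` is a list of {"role": ..., "content": ...} dicts.
    """
    if fmt == "json":
        # Structured data: timestamped wrapper around the raw messages.
        return json.dumps({"exported": datetime.now().isoformat(),
                           "messages": messages}, indent=2)
    if fmt == "markdown":
        # Human-readable: one bolded speaker label per turn.
        lines = ["# Chat History", ""]
        for m in messages:
            lines.append(f"**{m['role'].title()}:** {m['content']}")
            lines.append("")
        return "\n".join(lines)
    raise ValueError(f"unsupported format: {fmt}")

history = [{"role": "user", "content": "Hello"},
           {"role": "assistant", "content": "Hi there!"}]
print(export_chat(history, "markdown"))
```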

Interface

  • 👤 User messages in yellow
  • 🤖 Assistant responses in blue
  • 🎨 Image generation with custom prompts
  • 🚦 Color-coded status messages
  • 📝 Easy-to-read format with emoji indicators

🛠️ Prerequisites

  • Python 3.10+
  • Required API Keys:
    • OpenAI API key
    • Groq API key
    • Image Generation API URL

🌟 Image Demo

(Demo image not reproduced here.)

📦 Installation

  1. Clone the repository:
git clone https://github.com/Amul-Thantharate/OmniChat-Cli.git
cd OmniChat-Cli
  2. Install the package:
pip install -e .
  3. Configure API Keys:

Method A: Environment Variables

# Windows (Command Prompt)
set OPENAI_API_KEY=your_openai_api_key
set GROQ_API_KEY=your_groq_api_key
set APP_URL=your_image_generation_api_url

# Windows (PowerShell)
$env:OPENAI_API_KEY="your_openai_api_key"
$env:GROQ_API_KEY="your_groq_api_key"
$env:APP_URL="your_image_generation_api_url"

# Linux/Mac
export OPENAI_API_KEY="your_openai_api_key"
export GROQ_API_KEY="your_groq_api_key"
export APP_URL="your_image_generation_api_url"

Method B: .env File

Create a .env file in the project root:

OPENAI_API_KEY=your_openai_api_key
GROQ_API_KEY=your_groq_api_key
APP_URL=your_image_generation_api_url
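
Projects typically read a `.env` file with the python-dotenv package; the stdlib-only sketch below shows the idea behind Method B. `load_env_file` is illustrative, not the package's actual loader.

```python
import os

def load_env_file(path=".env"):
    """Minimal .env loader: KEY=value lines, blank lines and '#' comments skipped.

    Existing environment variables win over the file (Method A over Method B).
    """
    if not os.path.exists(path):
        return
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip().strip('"'))

load_env_file()
for var in ("OPENAI_API_KEY", "GROQ_API_KEY", "APP_URL"):
    if not os.environ.get(var):
        print(f"Warning: {var} is not set")
```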

🚀 Usage

Quick Start

  1. Chat with OpenAI (Default):
omenicli
  2. Chat with Groq:
omenicli --model-type groq
  3. Generate Images:
omenicli --model-type image

Advanced Usage

  1. Stream Responses:
omenicli --stream
  2. Custom Model Settings:
omenicli --temperature 0.7 --max-tokens 2048
  3. Save Chat History:
omenicli --save
  4. Custom Image Directory:
omenicli --model-type image --image-dir my_images
  5. Combined Features:
omenicli --model-type image --save --image-dir my_images --stream

📚 Example

See the examples in the project repository.

⚙️ Configuration Options

Option           Short  Default           Description
--model-type     -mt    openai            Model provider (openai/groq/image)
--temperature    -T     0.5               Response randomness (0-1)
--max-tokens     -M     1024              Maximum response length
--stream         -S     False             Enable streaming responses
--save           -s     False             Save chat history
--openai-model   -o     gpt-3.5-turbo     OpenAI model name
--groq-model     -g     llama3-8b-8192    Groq model name
--image-dir      -i     generated_images  Directory to save generated images
--export-format  -e     json              Export format (json/pdf/markdown)
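
The options above map naturally onto Python's argparse. The sketch below is a hypothetical reimplementation of the option parsing, using only the flags, defaults, and choices documented in the table:

```python
import argparse

def build_parser():
    """Argument parser mirroring the OmniChat CLI options table (illustrative)."""
    p = argparse.ArgumentParser(prog="omenicli")
    p.add_argument("--model-type", "-mt", default="openai",
                   choices=["openai", "groq", "image"])
    p.add_argument("--temperature", "-T", type=float, default=0.5)
    p.add_argument("--max-tokens", "-M", type=int, default=1024)
    p.add_argument("--stream", "-S", action="store_true")
    p.add_argument("--save", "-s", action="store_true")
    p.add_argument("--openai-model", "-o", default="gpt-3.5-turbo")
    p.add_argument("--groq-model", "-g", default="llama3-8b-8192")
    p.add_argument("--image-dir", "-i", default="generated_images")
    p.add_argument("--export-format", "-e", default="json",
                   choices=["json", "pdf", "markdown"])
    return p

args = build_parser().parse_args(["--model-type", "groq", "-T", "0.7"])
print(args.model_type, args.temperature)
```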

💾 Data Management

Chat History

  • Default Location: chat_history/
  • Format: JSON files
  • Custom Options:
    • Directory: Enter custom path when prompted
    • Filename: Enter custom name when prompted
    • Default naming: chat_[model-type]_[timestamp].json
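
The default naming scheme can be sketched like this; `default_history_path` is an illustrative helper, not the package's API, but it follows the documented chat_[model-type]_[timestamp].json pattern:

```python
from datetime import datetime
from pathlib import Path

def default_history_path(model_type, directory="chat_history", name=None):
    """Build the save path for a chat history file.

    Uses the documented pattern chat_[model-type]_[timestamp].json
    unless a custom filename is supplied.
    """
    if name is None:
        stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
        name = f"chat_{model_type}_{stamp}.json"
    elif not name.endswith(".json"):
        # Custom filenames get the .json extension added if missing.
        name += ".json"
    return Path(directory) / name

print(default_history_path("groq"))
```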

Generated Images

  • Default Location: generated_images/
  • Format: PNG files with timestamps
  • Custom Locations:
    • Via CLI: --image-dir path/to/directory
    • Via Prompt: Enter path when asked
  • Naming: image_[timestamp].png

🔧 Error Handling

The application handles various scenarios:

  • Missing/Invalid API keys
  • Network connectivity issues
  • Rate limiting
  • Invalid configurations
  • File system errors
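
Rate limits and flaky networks are commonly handled with retries and exponential backoff. The sketch below is a generic pattern, not the package's actual strategy:

```python
import time

def with_retries(fn, attempts=3, base_delay=1.0):
    """Call fn(), retrying transient failures with exponential backoff.

    Waits base_delay, then 2x, then 4x, ... between attempts;
    re-raises after the final attempt fails.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except (ConnectionError, TimeoutError) as exc:
            if attempt == attempts - 1:
                raise
            delay = base_delay * (2 ** attempt)
            print(f"Transient error ({exc}); retrying in {delay:.3f}s")
            time.sleep(delay)
```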

🤝 Contributing

Feel free to:

  • Open issues
  • Submit pull requests
  • Suggest improvements
  • Report bugs

📝 License

MIT License

🔍 Tips & Tricks

  1. Chat Mode:

    • Use "exit" to end the session
    • Try different temperatures for varied responses
    • Enable streaming for real-time responses
    • Use custom filenames for better organization
  2. Image Generation:

    • Be specific in your prompts
    • Use custom directories for organization
    • Combine with chat history saving
  3. Chat History:

    • Use descriptive filenames
    • Organize by project/topic
    • Use custom paths for better file management

