
Unified LLM API interface for OpenAI, Gemini, Mistral, Groq, and more.


🔌 plugllm

plugllm is a unified and provider-agnostic Python package that lets you interact with multiple LLM APIs (like OpenAI, Gemini, Mistral, Groq, etc.) using a single, consistent interface — without needing to learn each provider’s SDK.

Created by Yash Kumar Firoziya


🌟 Features

  • 🔌 Unified API — One interface for all providers
  • 📡 Supports multiple providers — OpenAI, Gemini, Mistral, Groq (more coming)
  • 🧠 Same request structure — Compatible message format across providers
  • 🔐 Secure & simple config — Use environment variables or inline setup
  • 🚫 No SDKs required — Uses only the Python requests library
  • 📜 Role-based prompt support — For both single and multi-turn chat
  • 🔄 Extensible — Add custom providers easily

📦 Installation

pip install plugllm

⚙️ Configuration

You can configure plugllm directly in your code:

from plugllm import config

config(
    provider="openai",       # "gemini", "mistral", "groq" also supported
    api_key="your-api-key",
    model="gpt-4",           # model name based on provider
    base_url=None            # optional: custom or local API endpoint
)
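
Since base_url accepts a custom endpoint, you can also point plugllm at a local, OpenAI-compatible server. A minimal sketch, where the URL and model name are illustrative examples rather than values shipped with plugllm:

from plugllm import config

# Hypothetical local setup: URL and model name are examples only.
config(
    provider="openai",
    api_key="not-needed-locally",    # many local servers ignore the key
    model="local-model",
    base_url="http://localhost:8080/v1"
)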

Or use environment variables for better security:

export LLM_PROVIDER=openai
export LLM_API_KEY=your-api-key
export LLM_MODEL=gpt-4
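
You can also set these variables from Python before the first call. A minimal sketch, assuming plugllm reads the LLM_* variables documented above when no inline config() is given:

import os

# Assumption: plugllm picks these up if config() was never called.
os.environ["LLM_PROVIDER"] = "openai"
os.environ["LLM_API_KEY"] = "your-api-key"
os.environ["LLM_MODEL"] = "gpt-4"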

💬 Basic Usage

from plugllm import generate

response = generate("What is quantum entanglement?")
print(response)
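
Because plugllm talks to providers over plain HTTP with the requests library, transient network failures surface as requests exceptions. A hedged retry sketch, assuming generate lets those exceptions propagate unchanged:

import time
import requests
from plugllm import generate

def generate_with_retry(prompt, retries=3, backoff=2.0):
    # Retry on transient network errors with exponential backoff.
    for attempt in range(retries):
        try:
            return generate(prompt)
        except requests.exceptions.RequestException:
            if attempt == retries - 1:
                raise
            time.sleep(backoff ** attempt)

print(generate_with_retry("What is quantum entanglement?"))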

🧵 Multi-turn Chat

from plugllm import generate

response = generate([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What are black holes?"}
])
print(response)
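
To keep context across turns, append each reply to the message list before the next call. A sketch, assuming generate returns the assistant's text as a string:

from plugllm import generate

messages = [{"role": "system", "content": "You are a helpful assistant."}]

for question in ["What are black holes?", "How do they evaporate?"]:
    messages.append({"role": "user", "content": question})
    reply = generate(messages)  # assumed to return plain text
    messages.append({"role": "assistant", "content": reply})
    print(reply)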

📡 Supported Providers

  • OpenAI (ChatGPT, GPT-4, GPT-3.5)
  • Google Gemini
  • Mistral AI
  • GroqCloud (Mixtral)
  • Coming soon: Cohere, Anthropic Claude, Ollama, LM Studio

🗂️ Project Structure

plugllm/
├── __init__.py
├── core.py
├── config.py
├── prompts.py
└── providers/
    ├── base.py
    ├── openai.py
    ├── gemini.py
    ├── mistral.py
    └── groq.py

🤝 Contributing

Pull requests are welcome! If you want to add support for a new provider, just create a new module in providers/ based on base.py.
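
As a purely hypothetical skeleton (the class and method names below are illustrative; mirror the real interface in providers/base.py, which may differ), a new provider module could look like:

# providers/myprovider.py: hypothetical skeleton, not plugllm's actual API
import requests

class MyProvider:
    def __init__(self, api_key, model, base_url=None):
        self.api_key = api_key
        self.model = model
        self.base_url = base_url or "https://api.example.com/v1/chat"

    def generate(self, messages):
        # Forward the shared role/content message format over plain HTTP.
        resp = requests.post(
            self.base_url,
            headers={"Authorization": f"Bearer {self.api_key}"},
            json={"model": self.model, "messages": messages},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]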


🪪 License

This project is licensed under the MIT License.


✨ Author

Yash Kumar Firoziya




Download files

Download the file for your platform.

Source Distribution

plugllm-0.1.1.tar.gz (6.3 kB)


Built Distribution


plugllm-0.1.1-py3-none-any.whl (7.3 kB)


File details

Details for the file plugllm-0.1.1.tar.gz.

File metadata

  • Download URL: plugllm-0.1.1.tar.gz
  • Size: 6.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for plugllm-0.1.1.tar.gz
  • SHA256: 4b28c2b5c74f9e780bf3508b1104c4ed3e390283bc33b22b9bfe6cf85c0711db
  • MD5: f862e70f6119231bbbf084b73bc091b3
  • BLAKE2b-256: 5f94ce90af16fc5da658ef6d95ab27c35772ab85b611bfd1ffe750506032983f


File details

Details for the file plugllm-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: plugllm-0.1.1-py3-none-any.whl
  • Size: 7.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for plugllm-0.1.1-py3-none-any.whl
  • SHA256: 863ee3bf7432597789c20788c23c92c6beb1554cfc837ccf3d9694e2f9d8a59d
  • MD5: 39528c185f78b75f3a36a5f23dae67ef
  • BLAKE2b-256: c4737127544fdccd3147c4633eb10ec8ea9ff10389b5a90eab228bead90281eb

