
A Simple Unified LLM API interface for OpenAI, Gemini, Mistral, Groq and more.


🔌 pyllmlib


pyllmlib is a lightweight, provider-agnostic Python package that gives you one simple interface to work with multiple Large Language Model (LLM) APIs.

Stop juggling different SDKs and client libraries: whether it's OpenAI GPT, Google Gemini, Mistral AI, or Groq, you write your code once and switch providers in seconds.

🎯 Ideal for developers who want to:

  • Experiment with different LLMs quickly
  • Build multi-provider AI applications
  • Avoid vendor lock-in with a consistent API

✨ Features

  • 🔌 Unified API – a single generate() function for all providers
  • 🌐 Multi-Provider Support – OpenAI, Gemini, Mistral, Groq (with more coming soon)
  • 🧠 Consistent Message Format – same request style across providers
  • 🔐 Flexible Config – use env vars, inline setup, or config files
  • 📦 Minimal Dependencies – only needs requests
  • 🔄 Quick Provider Switching – change models with one line
  • 🛡️ Automatic Token Handling – prevents overflows and context errors
  • 📜 Role-Based Conversations – system, user, and assistant messages
  • 🔧 Extensible – add your own providers with minimal code
  • 🚀 No Vendor Lock-In – swap providers without rewriting logic

📦 Installation

From PyPI (recommended):

pip install pyllmlib

From GitHub:

# Latest release
pip install git+https://github.com/yazirofi/pyllmlib.git

# Development version
git clone https://github.com/yazirofi/pyllmlib.git
cd pyllmlib
pip install -e .

Requirements:

  • Python 3.7+
  • requests (installed automatically)

🚀 Quick Start

from pyllmlib import config, generate

# Configure your preferred LLM
config(
    provider="openai",
    api_key="your-openai-api-key",
    model="gpt-4"
)

# Generate text
response = generate("Explain quantum computing in simple terms")
print(response)

✅ Same code works with any provider – just change the config.
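The pattern behind that claim can be illustrated without network access or an API key. The stub backends below are stand-ins for real LLM APIs, and select_provider()/ask() are hypothetical analogues of pyllmlib's config() and generate() – names invented here for illustration only:

```python
# Sketch of the provider-agnostic pattern: application code calls one
# function, and a single configuration step decides which backend answers.
# The stub backends below are placeholders for real LLM providers.

_BACKENDS = {
    "openai": lambda prompt: f"[openai] answer to: {prompt}",
    "gemini": lambda prompt: f"[gemini] answer to: {prompt}",
}

_current = {"name": "openai"}

def select_provider(name):
    """Hypothetical analogue of config(): pick a backend once."""
    if name not in _BACKENDS:
        raise ValueError(f"unknown provider: {name}")
    _current["name"] = name

def ask(prompt):
    """Hypothetical analogue of generate(): same call, any backend."""
    return _BACKENDS[_current["name"]](prompt)

select_provider("gemini")
print(ask("What is an LLM?"))  # routed to the gemini stub
```

Application code only ever touches ask(); swapping providers is one select_provider() call, which is the same shape as swapping config() arguments in pyllmlib.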


โš™๏ธ Configuration

1. Direct in Code

from pyllmlib import config

# OpenAI
config(provider="openai", api_key="sk-...", model="gpt-4")

# Google Gemini
config(provider="gemini", api_key="AIza...", model="gemini-2.5-flash")

# Mistral
config(provider="mistral", api_key="...", model="mistral-large-latest")

# Groq
config(provider="groq", api_key="gsk_...", model="mixtral-8x7b-32768")

2. Environment Variables

Set the variables in your shell (or a .env file):

LLM_PROVIDER=openai
LLM_API_KEY=sk-your-openai-key
LLM_MODEL=gpt-4
LLM_BASE_URL=https://api.openai.com/v1  # Optional

Then call config() with no arguments:

from pyllmlib import config, generate

config()  # Loads from env
print(generate("What is LLM?"))
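The lookup that config() performs is internal to pyllmlib; a stdlib-only sketch of the same pattern, using the LLM_* variable names from the table above (the function name and error handling here are illustrative, not pyllmlib's actual internals):

```python
import os

def load_llm_settings(env=os.environ):
    """Sketch of env-based configuration: read the LLM_* variables,
    leaving optional values as None when unset."""
    settings = {
        "provider": env.get("LLM_PROVIDER"),
        "api_key": env.get("LLM_API_KEY"),
        "model": env.get("LLM_MODEL"),
        "base_url": env.get("LLM_BASE_URL"),  # optional override
    }
    if not settings["provider"] or not settings["api_key"]:
        raise RuntimeError("LLM_PROVIDER and LLM_API_KEY must be set")
    return settings

# Example with an explicit mapping instead of the real environment:
print(load_llm_settings({
    "LLM_PROVIDER": "openai",
    "LLM_API_KEY": "sk-demo",
    "LLM_MODEL": "gpt-4",
}))
```

Passing the mapping explicitly (instead of reading os.environ) makes the lookup easy to test and mirrors how a .env loader would feed values in.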

💬 Usage Examples

Text Generation

from pyllmlib import config, generate

config(provider="gemini", api_key="AIza...", model="gemini-2.5-flash")

print(generate("What is the capital of France?"))

prompt = """
Write a Python function to calculate factorial with error handling and docstring.
"""
print(generate(prompt))

Interactive Chat

from pyllmlib import config, chat, reset_chat

config(provider="gemini", api_key="AIza...", model="gemini-2.5-flash")

while True:
    q = input("Ask: ")  # press Enter on an empty line to exit
    if not q:
        break
    print(chat(q))

reset_chat()
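chat() keeps conversation state between calls, and reset_chat() clears it. The message format it implies is the familiar role-based list from the Features section (system, user, assistant). A stdlib-only sketch of that bookkeeping, with a stub in place of the real provider call (chat_stub/reset_chat_stub are illustrative names, not pyllmlib's implementation):

```python
# Role-based conversation state, as chat()/reset_chat() imply.
history = [{"role": "system", "content": "You are a helpful assistant."}]

def fake_model(messages):
    # A real backend would send `messages` to the provider here.
    return f"(reply #{sum(m['role'] == 'user' for m in messages)})"

def chat_stub(message):
    """Append the user turn, call the model, record the assistant turn."""
    history.append({"role": "user", "content": message})
    reply = fake_model(history)
    history.append({"role": "assistant", "content": reply})
    return reply

def reset_chat_stub():
    """Drop everything except the system message."""
    del history[1:]

chat_stub("Hello")
chat_stub("And again")
print(len(history))  # 5: the system message plus two user/assistant pairs
```

Because every call re-sends the full history, context grows with each turn – which is why resetting the conversation (and pyllmlib's token handling) matters for long sessions.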

๐ŸŒ Supported Providers

✅ Currently Supported

  • OpenAI – gpt-4, gpt-4-turbo, gpt-3.5-turbo
  • Google Gemini – gemini-2.5-flash, gemini-1.5-flash
  • Mistral AI – mistral-large-latest, mistral-small-latest
  • Groq – mixtral-8x7b-32768, llama2-70b-4096, gemma-7b-it

🔜 Coming Soon

  • Anthropic Claude
  • Cohere
  • Ollama & LM Studio (local hosting)
  • Hugging Face models

๐Ÿ› Troubleshooting

  • Auth Errors → check that the API key matches the configured provider's format (e.g. sk-... for OpenAI, gsk_... for Groq)
  • Model Not Found → verify the model name is available for the configured provider
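Both failure modes surface as exceptions from the call into the provider. pyllmlib's exact exception classes aren't documented here, so the wrapper below is a generic sketch: it retries any callable (pass generate in practice) with exponential backoff, which handles transient network errors while letting persistent auth/model errors surface after the final attempt:

```python
import time

def with_retries(call, prompt, attempts=3, delay=0.1):
    """Retry `call(prompt)` on failure, doubling the delay each attempt.
    In real code, narrow `Exception` to the SDK's actual error types."""
    last_error = None
    for attempt in range(attempts):
        try:
            return call(prompt)
        except Exception as error:
            last_error = error
            time.sleep(delay * (2 ** attempt))
    raise RuntimeError(f"all {attempts} attempts failed") from last_error

# Demo with a flaky stub that fails twice, then succeeds:
state = {"calls": 0}
def flaky(prompt):
    state["calls"] += 1
    if state["calls"] < 3:
        raise ConnectionError("transient network error")
    return f"ok: {prompt}"

print(with_retries(flaky, "ping", delay=0.01))  # ok: ping
```

Retrying a genuinely bad API key just wastes quota, so once you know the library's error types, skip retries for authentication failures and retry only timeouts and rate limits.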

📊 Best Practices

# ✅ Reuse config for multiple prompts
config(provider="openai", api_key="sk-...", model="gpt-4")
for p in prompts:
    print(generate(p))

# ❌ Don't reconfigure on every request

💡 Cost Optimization: Use gpt-3.5-turbo for simple tasks, gpt-4 for complex ones.
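That tip can be codified as a small router: send short or simple prompts to the cheaper model and everything else to the stronger one. The length threshold and the idea of routing by prompt length are illustrative choices, not pyllmlib defaults; in practice you would pass the result to config(model=...) before calling generate():

```python
def pick_model(prompt, cheap="gpt-3.5-turbo", strong="gpt-4", threshold=200):
    """Illustrative cost router: short prompts go to the cheap model.
    Any heuristic (keywords, task type) could replace the length check."""
    return cheap if len(prompt) <= threshold else strong

print(pick_model("What is 2 + 2?"))                    # gpt-3.5-turbo
print(pick_model("Review this essay: " + "x" * 500))   # gpt-4
```

Because pyllmlib keeps the calling code identical across models, routing like this needs no changes to how you call generate() – only to what you configure.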


📚 API Reference

  • config(**kwargs) → Set provider, API key, model
  • generate(prompt, **kwargs) → Single text output
  • generate_stream(prompt, **kwargs) → Streaming output
  • chat(message) → Conversational interface
  • reset_chat() → Clear conversation history

📄 License

Licensed under the MIT License – see LICENSE.


👨‍💻 Author

Shay Yazirofi


โญ If you find pyllmlib useful, please star the repo on GitHub! ๐Ÿ“– More docs & tutorials: Wiki




Download files

Download the file for your platform.

Source Distribution

pyllmlib-0.1.0.tar.gz (10.5 kB)


Built Distribution


pyllmlib-0.1.0-py3-none-any.whl (10.1 kB)


File details

Details for the file pyllmlib-0.1.0.tar.gz.

File metadata

  • Download URL: pyllmlib-0.1.0.tar.gz
  • Upload date:
  • Size: 10.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for pyllmlib-0.1.0.tar.gz:

  • SHA256: 707b2f39a758d399a6cca7929b7bf884b6be7549ca0e96dda8314eee76ce3276
  • MD5: a4a3efcd20e046f76af761d747472776
  • BLAKE2b-256: 17c0c2482204d9b051c3b0038b7e0d90e6a8d6b58a045369538643841c1a4d69

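These hashes let you verify a downloaded artifact before installing it. A stdlib check against the SHA256 digest listed above for the sdist (the file path is illustrative – point it at wherever you saved the download):

```python
import hashlib

# SHA256 digest published above for pyllmlib-0.1.0.tar.gz
EXPECTED_SHA256 = "707b2f39a758d399a6cca7929b7bf884b6be7549ca0e96dda8314eee76ce3276"

def sha256_of(path):
    """Stream the file through SHA-256 and return the hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Uncomment after downloading the sdist:
# if sha256_of("pyllmlib-0.1.0.tar.gz") != EXPECTED_SHA256:
#     raise ValueError("checksum mismatch: do not install this file")
```

pip can enforce the same check automatically via its hash-checking mode (pinning `pyllmlib==0.1.0 --hash=sha256:...` in a requirements file).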

File details

Details for the file pyllmlib-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: pyllmlib-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 10.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for pyllmlib-0.1.0-py3-none-any.whl:

  • SHA256: 68ff631841ec9a2f3fa967ae929a1f6b2e9003f636985124b55aff9f1e5dfe1f
  • MD5: a79caf1b5d52e06bb6e740c61e89b522
  • BLAKE2b-256: 9cbd04b6a9a2ded1f255b23126f32eee192bcad98d8503dd7c67d7f572730c93

