# pig-llm

Unified multi-provider LLM API for Python (OpenAI, Anthropic, Google, and more).
## Features
- 🔌 Multi-provider support: OpenAI, Anthropic, Google, and more
- 🎯 Unified interface: Same API across all providers
- 🔄 Streaming support: Real-time token streaming
- 🛡️ Error handling: Automatic retries and fallbacks
- 📊 Usage tracking: Token counting and cost estimation
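The retry-and-fallback behavior described above is a common pattern in multi-provider clients. The following is a minimal, self-contained sketch of the idea using stub providers; the `call_with_fallback` helper and provider functions are illustrative assumptions, not pig-llm's actual internals:

```python
import time

def call_with_fallback(providers, prompt, retries=2, backoff=0.0):
    """Try each provider in order; retry transient failures before falling back."""
    last_error = None
    for provider in providers:
        for attempt in range(retries + 1):
            try:
                return provider(prompt)
            except RuntimeError as err:  # stand-in for provider/network errors
                last_error = err
                time.sleep(backoff * (2 ** attempt))  # exponential backoff between retries
    raise RuntimeError("all providers failed") from last_error

# Stub providers for illustration: the first always fails, the second succeeds.
def flaky_provider(prompt):
    raise RuntimeError("rate limited")

def stable_provider(prompt):
    return f"echo: {prompt}"

result = call_with_fallback([flaky_provider, stable_provider], "hello", backoff=0)
print(result)  # → echo: hello
```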
## Installation

```bash
pip install pig-llm
```
## Quick Start

```python
from pig_llm import LLM

# Initialize with an API key
llm = LLM(provider="openai", api_key="sk-...")

# Simple completion
response = llm.complete("What is the meaning of life?")
print(response.content)

# Streaming
for chunk in llm.stream("Tell me a story"):
    print(chunk.content, end="", flush=True)

# With a system message
response = llm.complete(
    "Translate to Spanish",
    system="You are a helpful translator",
)
```
## Supported Providers

### Core Providers
- OpenAI - GPT-4, GPT-3.5, etc.
- Anthropic - Claude 3, Claude 2
- Google - Gemini Pro, Gemini Ultra
- Azure OpenAI - Azure-hosted OpenAI models
### Additional Providers
- Groq - Ultra-fast LLM inference
- Mistral - Mistral AI models
- OpenRouter - Access to multiple models
- Amazon Bedrock - AWS-hosted foundation models
- xAI (Grok) - xAI's Grok models
- Cerebras - High-speed inference on wafer-scale hardware
- Cohere - Command models for enterprise
- Perplexity - Search-augmented LLMs
- DeepSeek - DeepSeek models, with strong coding performance
- Together AI - Open-source model hosting
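A unified interface over this many providers is typically built on a provider registry that maps a name to a client class. The sketch below shows that general pattern with stub clients; it is illustrative of the technique, not pig-llm's actual implementation:

```python
from typing import Callable, Dict

# Registry mapping provider names to client classes (stubs for illustration).
_PROVIDERS: Dict[str, Callable] = {}

def register(name):
    """Class decorator that adds a client to the registry under `name`."""
    def decorator(cls):
        _PROVIDERS[name] = cls
        return cls
    return decorator

class BaseClient:
    def __init__(self, api_key):
        self.api_key = api_key
    def complete(self, prompt):
        raise NotImplementedError

@register("openai")
class OpenAIClient(BaseClient):
    def complete(self, prompt):
        return f"[openai] {prompt}"  # a real client would call the provider API here

@register("anthropic")
class AnthropicClient(BaseClient):
    def complete(self, prompt):
        return f"[anthropic] {prompt}"

def make_client(provider, api_key):
    """Look up the provider by name; fail loudly on unknown names."""
    try:
        return _PROVIDERS[provider](api_key)
    except KeyError:
        raise ValueError(f"unknown provider: {provider}") from None

client = make_client("openai", api_key="sk-test")
print(client.complete("hello"))  # → [openai] hello
```

The registry keeps provider-specific code behind one `complete` interface, so adding a provider is one new class plus a `@register` line.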
## Configuration

```python
from pig_llm import LLM, Config

config = Config(
    provider="openai",
    model="gpt-4",
    temperature=0.7,
    max_tokens=1000,
    timeout=30,
)
llm = LLM(config=config)
```
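If you build configuration objects in several places, a dataclass with the defaults shown above keeps them consistent. This is only an assumption about the shape of `Config` (fields and defaults taken from the example), not its actual definition in pig-llm:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Config:
    provider: str
    model: str
    temperature: float = 0.7   # sampling temperature
    max_tokens: int = 1000     # response length cap
    timeout: int = 30          # request timeout in seconds
    api_key: Optional[str] = None

cfg = Config(provider="openai", model="gpt-4")
print(cfg.temperature)  # → 0.7
```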
## Provider-Specific Examples

### Amazon Bedrock

```python
# Uses AWS credentials from the environment; api_key carries the AWS region
llm = LLM(provider="bedrock", api_key="us-east-1")
response = llm.complete("Hello", model="anthropic.claude-3-sonnet-20240229-v1:0")
```
### xAI (Grok)

```python
llm = LLM(provider="xai", api_key="xai-...")
response = llm.complete("What's happening?", model="grok-beta")
```
### Cerebras

```python
llm = LLM(provider="cerebras", api_key="csk-...")
response = llm.complete("Fast inference!", model="llama3.1-8b")
```
### Cohere

```python
llm = LLM(provider="cohere", api_key="...")
response = llm.complete("Hello", model="command-r-plus")
```
### Perplexity

```python
llm = LLM(provider="perplexity", api_key="pplx-...")
response = llm.complete("What's the latest news?", model="llama-3.1-sonar-large-128k-online")
# Citations available in response.metadata["citations"]
```
### DeepSeek

```python
llm = LLM(provider="deepseek", api_key="...")
response = llm.complete("写一段Python代码", model="deepseek-chat")  # prompt: "Write some Python code"
```
### Together AI

```python
llm = LLM(provider="together", api_key="...")
response = llm.complete("Hello", model="meta-llama/Llama-3-70b-chat-hf")
```
## License

MIT
## Download files

### Source Distribution

- pig_llm-0.0.2.tar.gz (12.0 kB)

### Built Distribution

- pig_llm-0.0.2-py3-none-any.whl (23.3 kB)
## File details

### pig_llm-0.0.2.tar.gz

- Download URL: pig_llm-0.0.2.tar.gz
- Upload date:
- Size: 12.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.5

| Algorithm | Hash digest |
|---|---|
| SHA256 | 544d94694c3949dbb8413b66f9a4d9289aba37a92b0590823217d1e4773a00c8 |
| MD5 | ea3616361da39372018152dff9137ff0 |
| BLAKE2b-256 | 90f0930b59e124a22bb1b6dabf85339f3a70e096a0d1367e321a245fc23905fb |
### pig_llm-0.0.2-py3-none-any.whl

- Download URL: pig_llm-0.0.2-py3-none-any.whl
- Upload date:
- Size: 23.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.5

| Algorithm | Hash digest |
|---|---|
| SHA256 | 52d2538e06c59ec4b6ef19baa3da50d6f2976e45ba4756d76ea864eee7b61e98 |
| MD5 | f246a3a68db55904a056db01123a1c77 |
| BLAKE2b-256 | ca1c081cf982640610373ab7abb490cf82c8a919b265821e0647f1ef20616519 |