# pig-llm

Unified multi-provider LLM API for Python (OpenAI, Anthropic, Google, and more).
## Features

- 🔌 Multi-provider support: OpenAI, Anthropic, Google, and more
- 🎯 Unified interface: Same API across all providers
- 🔄 Streaming support: Real-time token streaming
- 🛡️ Error handling: Automatic retries and fallbacks
- 📊 Usage tracking: Token counting and cost estimation
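
The retry-and-fallback behavior listed above can be sketched in plain Python. Everything here (function and provider names alike) is illustrative, not pig-llm's actual internals:

```python
import time

def complete_with_fallback(providers, prompt, retries=1, delay=0.0):
    """Try each provider in order, retrying transient failures before falling back."""
    errors = []
    for call in providers:
        for attempt in range(retries + 1):
            try:
                return call(prompt)
            except RuntimeError as exc:  # stand-in for provider-specific API errors
                errors.append(exc)
                if attempt < retries:
                    time.sleep(delay)  # back off before retrying the same provider
    raise RuntimeError(f"all providers failed: {errors}")

# Hypothetical provider callables for illustration.
def flaky_provider(prompt):
    raise RuntimeError("rate limited")

def healthy_provider(prompt):
    return f"echo: {prompt}"

print(complete_with_fallback([flaky_provider, healthy_provider], "hi"))  # echo: hi
```

The key design point is ordering: all retries against one provider are exhausted before moving to the next, so a brief rate limit does not immediately switch models.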
## Installation

```bash
pip install pig-llm
```
## Quick Start

```python
from pig_llm import LLM

# Initialize with API key
llm = LLM(provider="openai", api_key="sk-...")

# Simple completion
response = llm.complete("What is the meaning of life?")
print(response.content)

# Streaming
for chunk in llm.stream("Tell me a story"):
    print(chunk.content, end="", flush=True)

# With system message
response = llm.complete(
    "Translate to Spanish",
    system="You are a helpful translator",
)
```
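
Usage tracking ultimately comes down to multiplying token counts by per-token prices. A self-contained sketch of that arithmetic, with placeholder prices (real rates vary by provider and model, so treat the numbers as assumptions):

```python
# Hypothetical price table (USD per million tokens) -- placeholder values
# for illustration only; check your provider's current pricing.
PRICES_PER_MTOK = {"gpt-4": {"input": 30.0, "output": 60.0}}

def estimate_cost(model, input_tokens, output_tokens):
    """Estimate the dollar cost of one request from its token counts."""
    p = PRICES_PER_MTOK[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000

# 1,000 prompt tokens plus 500 completion tokens at the placeholder rates
print(f"${estimate_cost('gpt-4', 1_000, 500):.4f}")  # $0.0600
```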
## Supported Providers

### Core Providers

- OpenAI - GPT-4, GPT-3.5, etc.
- Anthropic - Claude 3, Claude 2
- Google - Gemini Pro, Gemini Ultra
- Azure OpenAI - Azure-hosted OpenAI models
### Additional Providers

- Groq - Ultra-fast LLM inference
- Mistral - Mistral AI models
- OpenRouter - Access to multiple models
- Amazon Bedrock - AWS-hosted foundation models
- xAI (Grok) - xAI's Grok models
- Cerebras - Fastest inference speeds
- Cohere - Command models for enterprise
- Perplexity - Search-augmented LLMs
- DeepSeek - Models with strong coding performance
- Together AI - Open-source model hosting
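
Many of the providers above expose OpenAI-compatible chat endpoints, so a unified client can often reduce provider selection to a name-to-base-URL registry. The sketch below is illustrative, not pig-llm's implementation, and the URLs should be confirmed against each provider's own documentation:

```python
# Illustrative registry of OpenAI-compatible API base URLs.
OPENAI_COMPATIBLE_BASE_URLS = {
    "groq": "https://api.groq.com/openai/v1",
    "mistral": "https://api.mistral.ai/v1",
    "together": "https://api.together.xyz/v1",
    "deepseek": "https://api.deepseek.com",
}

def resolve_base_url(provider: str) -> str:
    """Map a provider name to its OpenAI-compatible API base URL."""
    try:
        return OPENAI_COMPATIBLE_BASE_URLS[provider]
    except KeyError:
        raise ValueError(f"unsupported provider: {provider}") from None
```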
## Configuration

```python
from pig_llm import LLM, Config

config = Config(
    provider="openai",
    model="gpt-4",
    temperature=0.7,
    max_tokens=1000,
    timeout=30,
)

llm = LLM(config=config)
```
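
A config object like this is naturally modeled as a dataclass with defaults and basic validation. The field names below mirror the example above, but the defaults and validation rules are assumptions, not pig-llm's actual definition:

```python
from dataclasses import dataclass

@dataclass
class Config:
    provider: str
    model: str
    temperature: float = 0.7
    max_tokens: int = 1000
    timeout: float = 30.0

    def __post_init__(self):
        # Reject obviously invalid settings at construction time.
        if not 0.0 <= self.temperature <= 2.0:
            raise ValueError("temperature must be in [0.0, 2.0]")
        if self.max_tokens <= 0:
            raise ValueError("max_tokens must be positive")

cfg = Config(provider="openai", model="gpt-4")
print(cfg.temperature)  # 0.7
```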
## Provider-Specific Examples

### Amazon Bedrock

```python
# Uses AWS credentials from the environment
llm = LLM(provider="bedrock", api_key="us-east-1")  # region as api_key
response = llm.complete("Hello", model="anthropic.claude-3-sonnet-20240229-v1:0")
```
### xAI (Grok)

```python
llm = LLM(provider="xai", api_key="xai-...")
response = llm.complete("What's happening?", model="grok-beta")
```
### Cerebras

```python
llm = LLM(provider="cerebras", api_key="csk-...")
response = llm.complete("Fast inference!", model="llama3.1-8b")
```
### Cohere

```python
llm = LLM(provider="cohere", api_key="...")
response = llm.complete("Hello", model="command-r-plus")
```
### Perplexity

```python
llm = LLM(provider="perplexity", api_key="pplx-...")
response = llm.complete("What's the latest news?", model="llama-3.1-sonar-large-128k-online")
# Citations available in response.metadata["citations"]
```
### DeepSeek

```python
llm = LLM(provider="deepseek", api_key="...")
response = llm.complete("Write a short Python program", model="deepseek-chat")
```
### Together AI

```python
llm = LLM(provider="together", api_key="...")
response = llm.complete("Hello", model="meta-llama/Llama-3-70b-chat-hf")
```
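
What makes a unified interface across these providers possible is a normalization layer that maps each provider's response shape onto one common type. A minimal sketch, assuming a hypothetical `Response` type and `normalize` helper (not pig-llm's actual internals), using the real JSON shapes of the OpenAI and Anthropic chat APIs:

```python
from dataclasses import dataclass, field

@dataclass
class Response:
    content: str
    metadata: dict = field(default_factory=dict)

def normalize(provider: str, raw: dict) -> Response:
    """Map a provider-specific response dict onto the common Response shape."""
    if provider == "openai":
        # OpenAI chat completions: choices[0].message.content
        return Response(content=raw["choices"][0]["message"]["content"])
    if provider == "anthropic":
        # Anthropic messages: content[0].text
        return Response(content=raw["content"][0]["text"])
    raise ValueError(f"no adapter for provider: {provider}")

r = normalize("openai", {"choices": [{"message": {"content": "hola"}}]})
print(r.content)  # hola
```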
## License

MIT
## Download Files
Source distribution: pig_llm-0.0.1.tar.gz (12.0 kB)
Built distribution: pig_llm-0.0.1-py3-none-any.whl (23.4 kB)
## File Details

### pig_llm-0.0.1.tar.gz

- Size: 12.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.5

| Algorithm | Hash digest |
|---|---|
| SHA256 | 4c082243d6b62e0cab029b50e34e9ad5cc057ff22e49157ccdb27c366c15ac39 |
| MD5 | a85116fab417fdd848ba5cd3d712308d |
| BLAKE2b-256 | 10294b8629ae2d3e26d618c7b69b6444882c1ca8a94425f43f86715a975524dd |
### pig_llm-0.0.1-py3-none-any.whl

- Size: 23.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.5

| Algorithm | Hash digest |
|---|---|
| SHA256 | a0906a8ff9fc4028a79844013cf985c77d257dd7cfd3b47fb4d862852e68f0d5 |
| MD5 | 772b99903ad22d6713de6c1243704fea |
| BLAKE2b-256 | 075d548bc59e2fb7dde3a647ab28c633ed4a332b374b24626f1e19704532b107 |