# Wrapper4AI
Multi-provider, pluggable LLM wrapper with token counting, history management, streaming, and seamless extensibility.
Unified Python client for OpenAI, Anthropic, Google Gemini, Mistral AI, DeepSeek, AWS Bedrock, Hugging Face, and Perplexity AI.
## Features
- Unified interface for multiple LLM providers
- Chat history with optional tracking and token trimming
- Streaming support for all providers
- Token counting (tiktoken when available)
- Provider registry — register custom backends via `BaseProvider`
- Config from env — API keys and regions read from environment variables
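The registry behind pluggable backends can be sketched in a few lines. This is an illustrative stand-in for the pattern, not wrapper4ai's actual internals; `EchoProvider` is a made-up example backend:

```python
# Minimal sketch of a provider registry (illustrative; not wrapper4ai's code).
_REGISTRY = {}

def register_provider(name, cls):
    """Map a provider name to its backend class."""
    _REGISTRY[name] = cls

def connect(name, model, **kwargs):
    """Look up the backend class for `name` and instantiate it."""
    try:
        cls = _REGISTRY[name]
    except KeyError:
        raise ValueError(f"Unknown provider: {name!r}") from None
    return cls(model, **kwargs)

class EchoProvider:
    """Toy backend that echoes the last user message."""
    def __init__(self, model, **kwargs):
        self.model = model
    def generate(self, messages):
        return messages[-1]["content"]

register_provider("echo", EchoProvider)
client = connect("echo", "echo-1")
print(client.generate([{"role": "user", "content": "hi"}]))  # hi
```

A plain dict lookup like this is what makes `register_provider`-style extension points cheap: new backends plug in without touching the dispatch code.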
## Installation

```bash
pip install wrapper4ai
```

Install with optional extras for the providers you use:

```bash
# One provider
pip install "wrapper4ai[openai]"

# Multiple providers
pip install "wrapper4ai[openai,anthropic,google,mistral]"

# All providers
pip install "wrapper4ai[all]"
```

Optional extras: `openai`, `anthropic`, `google`, `mistral`, `bedrock`, `huggingface`, `all`, `dev`.
## Supported Providers

| Provider | Default model | Optional extra |
|---|---|---|
| OpenAI | `gpt-4o` | `openai` |
| Anthropic | `claude-sonnet-4-20250514` | `anthropic` |
| Google Gemini | `gemini-2.0-flash` | `google` |
| Mistral AI | `mistral-large-latest` | `mistral` |
| DeepSeek | `deepseek-chat` | — |
| AWS Bedrock | Claude / Llama / Titan | `bedrock` |
| Hugging Face | `HuggingFaceH4/zephyr-7b-beta` | `huggingface` |
| Perplexity | `sonar-pro` | — |
## Usage

### Connect and chat

```python
from wrapper4ai import connect

client = connect("openai", "gpt-4o", api_key="sk-...")
response = client.chat("Tell me a joke.")
print(response)
```
API keys can be set via environment variables (e.g. `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GOOGLE_API_KEY`, `MISTRAL_API_KEY`) or passed explicitly as `api_key=`.
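The usual resolution order — an explicit `api_key=` wins, then the provider's environment variable — can be sketched like this. This is the generic pattern, not wrapper4ai's exact code; `resolve_api_key` and `ENV_VARS` are illustrative names:

```python
import os

# Illustrative key lookup: explicit argument first, then the env var.
ENV_VARS = {"openai": "OPENAI_API_KEY", "anthropic": "ANTHROPIC_API_KEY"}

def resolve_api_key(provider, api_key=None):
    key = api_key or os.environ.get(ENV_VARS.get(provider, ""))
    if not key:
        raise RuntimeError(f"No API key found for provider {provider!r}")
    return key

os.environ["OPENAI_API_KEY"] = "sk-from-env"
print(resolve_api_key("openai"))                  # sk-from-env
print(resolve_api_key("openai", "sk-explicit"))   # sk-explicit
```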
### Streaming

```python
client = connect("anthropic", "claude-sonnet-4-20250514")
for chunk in client.stream("Write a short poem."):
    print(chunk, end="")
```
### With history

History is kept by default; you can clear it or turn it off per call:

```python
client = connect("openai", "gpt-4o")
client.chat("My name is Alex.")
client.chat("What is my name?")  # uses history
client.chat("Hello!", use_history=False)  # no history update
client.clear_history()
```
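Token trimming of history (dropping the oldest messages once the conversation exceeds a budget) works roughly as below. The sketch uses a naive word counter as a stand-in; wrapper4ai's trimming uses its own token counts, and the exact policy may differ:

```python
def trim_history(history, max_tokens, count_tokens):
    """Drop oldest messages until the history fits the token budget."""
    trimmed = list(history)
    while trimmed and count_tokens(trimmed) > max_tokens:
        trimmed.pop(0)  # discard the oldest message first
    return trimmed

# Naive stand-in counter: one "token" per whitespace-separated word.
def word_count(messages):
    return sum(len(m["content"].split()) for m in messages)

history = [
    {"role": "user", "content": "My name is Alex."},
    {"role": "assistant", "content": "Nice to meet you, Alex!"},
    {"role": "user", "content": "What is my name?"},
]
print(trim_history(history, max_tokens=10, count_tokens=word_count))
```

Trimming from the front keeps the most recent turns, which is what the model needs to answer follow-up questions; note that it can also drop the very facts an old turn established.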
### Stateless completion

```python
messages = [
    {"role": "user", "content": "What is 2+2?"},
]
reply = client.complete(messages)  # does not change client history
```
### System prompt and token counting

```python
client = connect("openai", "gpt-4o", system_prompt="You are a helpful assistant.")
client.chat("Hi!")

n = client.count_tokens("Some text")
n = client.count_tokens(client.history)
total = client.total_tokens_used
title = client.generate_title("A long discussion about AI")
```
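"tiktoken when available" is typically a try-import with a heuristic fallback. The sketch below illustrates that shape; the ~4-characters-per-token estimate is a common rule of thumb and an assumption here, not wrapper4ai's documented fallback:

```python
def count_tokens(text, model="gpt-4o"):
    """Count tokens with tiktoken if installed, else estimate ~4 chars/token."""
    try:
        import tiktoken
        enc = tiktoken.encoding_for_model(model)
        return len(enc.encode(text))
    except Exception:  # tiktoken missing, or model unknown to it
        return max(1, len(text) // 4)

print(count_tokens("Some text"))
```

The graceful fallback is what lets the package list `tiktoken` as optional: counts stay approximately right without it, and become exact when it is installed.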
### List providers and custom backends

```python
from wrapper4ai import list_providers, connect
from wrapper4ai.providers import BaseProvider, register_provider

print(list_providers())  # ['anthropic', 'bedrock', 'deepseek', 'google', ...]

class MyProvider(BaseProvider):
    def generate(self, messages): ...
    def stream(self, messages): ...
    def count_tokens(self, messages): ...

register_provider("my_backend", MyProvider)
client = connect("my_backend", "my-model")
```
## Testing

```bash
pip install -e ".[dev]"
PYTHONPATH=. pytest tests/ -v
```

Tests use a mock provider; no API keys are required for the test suite.
## License

MIT.
## File details

### wrapper4ai-0.1.3.tar.gz (source distribution)

- Size: 15.6 kB
- Uploaded via: twine/6.2.0 on CPython/3.12.9
- Trusted Publishing: No

| Algorithm | Hash digest |
|---|---|
| SHA256 | `c53da08c93b49503a750504addc210932a8a43377d6dcf70c4009fbec270724e` |
| MD5 | `f9d9c3bc449e53d1df9d043b97c41ef7` |
| BLAKE2b-256 | `3ef90dd258f55e4ce8bc7914120a0d4c5e7f7af4b5f482a5f358422e94eb7307` |
### wrapper4ai-0.1.3-py3-none-any.whl (built distribution)

- Size: 21.9 kB
- Tags: Python 3
- Uploaded via: twine/6.2.0 on CPython/3.12.9
- Trusted Publishing: No

| Algorithm | Hash digest |
|---|---|
| SHA256 | `a89ddc458feb9870e55cc88ad88a913f6f821d1e4b25aa5ff7c4d527a459b36e` |
| MD5 | `8ead495e03c0c9d2415121ec0abed532` |
| BLAKE2b-256 | `9890082e0000afcff9f9831688e091711fc58f96212354c1d6888834fc098f79` |