# DataPizza AI - OpenAI-Like Client

OpenAI client for the datapizza-ai framework.
A versatile client for DataPizza AI that works with any OpenAI-compatible API, including local models served through Ollama, as well as Together AI, OpenRouter, and other hosted services.
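All of these services speak the same chat-completions wire format, which is what lets a single client cover them. As an illustration (independent of this library), the request body that any OpenAI-compatible endpoint accepts at `POST {base_url}/chat/completions` can be sketched as:

```python
# Build the chat-completions payload that any OpenAI-compatible
# endpoint (Ollama, Together AI, OpenRouter, ...) accepts.
def build_chat_payload(model: str, system_prompt: str, user_message: str) -> dict:
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
    }

payload = build_chat_payload(
    model="gemma2:2b",
    system_prompt="You are a helpful assistant.",
    user_message="What is the capital of France?",
)
print(payload["messages"][0]["role"])  # system
```

The client below wraps exactly this exchange; swapping providers only changes `base_url`, `model`, and the API key.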
## Installation

```bash
pip install datapizza-ai-clients-openai-like
```
## Quick Start

### With Ollama (Local Models)

```python
from datapizza.clients.openai_like import OpenAILikeClient

# Create a client pointed at a local Ollama server
client = OpenAILikeClient(
    api_key="",  # Ollama doesn't require an API key
    model="gemma2:2b",
    system_prompt="You are a helpful assistant.",
    base_url="http://localhost:11434/v1",
)

response = client.invoke("What is the capital of France?")
print(response.content)
```
### With Together AI

```python
import os

from datapizza.clients.openai_like import OpenAILikeClient

client = OpenAILikeClient(
    api_key=os.getenv("TOGETHER_API_KEY"),
    model="meta-llama/Llama-2-7b-chat-hf",
    system_prompt="You are a helpful assistant.",
    base_url="https://api.together.xyz/v1",
)

response = client.invoke("Explain quantum computing")
print(response.content)
```
### With OpenRouter

```python
import os

from datapizza.clients.openai_like import OpenAILikeClient

client = OpenAILikeClient(
    api_key=os.getenv("OPENROUTER_API_KEY"),
    model="google/gemma-7b-it",
    system_prompt="You are a helpful assistant.",
    base_url="https://openrouter.ai/api/v1",
)

response = client.invoke("What is OpenRouter?")
print(response.content)
```
### With Other OpenAI-Compatible Services

```python
import os

from datapizza.clients.openai_like import OpenAILikeClient

client = OpenAILikeClient(
    api_key=os.getenv("YOUR_API_KEY"),
    model="your-model-name",
    system_prompt="You are a helpful assistant.",
    base_url="https://your-service-url/v1",
)

response = client.invoke("Your question here")
print(response.content)
```
## Features
- OpenAI-Compatible: Works with any service that implements the OpenAI API standard
- Local Models: Perfect for running with Ollama for privacy and cost control
- Memory Support: Built-in memory adapter for conversation history
- Streaming: Support for real-time streaming responses
- Structured Outputs: Generate structured data with Pydantic models
- Tool Calling: Function calling capabilities where supported
## Supported Services
- Ollama - Local model inference
- Together AI - Cloud-based model hosting
- OpenRouter - Access a variety of models through a single API
- Perplexity AI - Search-augmented models
- Groq - Fast inference API
- Any OpenAI-compatible API
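Since switching between these services only changes the `base_url` and `model`, a small lookup table can keep provider endpoints in one place. This is an illustrative sketch; the dictionary and helper below are this example's own, not part of the library:

```python
# Provider name -> OpenAI-compatible base URL (URLs taken from the
# examples above; the mapping itself is illustrative).
BASE_URLS = {
    "ollama": "http://localhost:11434/v1",
    "together": "https://api.together.xyz/v1",
    "openrouter": "https://openrouter.ai/api/v1",
}

def base_url_for(provider: str) -> str:
    """Return the base URL for a known provider, or raise for an unknown one."""
    try:
        return BASE_URLS[provider]
    except KeyError:
        raise ValueError(f"Unknown provider: {provider!r}")

print(base_url_for("ollama"))  # http://localhost:11434/v1
```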
## Advanced Usage

### With Memory

```python
from datapizza.clients.openai_like import OpenAILikeClient
from datapizza.memory import Memory

client = OpenAILikeClient(
    api_key="",
    model="llama3.1:8b",
    base_url="http://localhost:11434/v1",
)

memory = Memory(client=client)
memory.add("I'm working on a Python project about machine learning.")
response = memory.query("What libraries should I use?")
```
### Streaming Responses

```python
from datapizza.clients.openai_like import OpenAILikeClient

client = OpenAILikeClient(
    api_key="",
    model="gemma2:7b",
    base_url="http://localhost:11434/v1",
)

for chunk in client.stream("Tell me a story about AI"):
    print(chunk.content, end="", flush=True)
```
### Structured Outputs

```python
from pydantic import BaseModel

from datapizza.clients.openai_like import OpenAILikeClient

class Person(BaseModel):
    name: str
    age: int
    occupation: str

client = OpenAILikeClient(
    api_key="",
    model="llama3.1:8b",
    base_url="http://localhost:11434/v1",
)

response = client.invoke(
    "Generate a person profile",
    response_format=Person,
)
print(response.parsed)  # Person object
```
## Configuration Options

| Parameter | Description | Default |
|---|---|---|
| `api_key` | API key for the service | Required (empty string for Ollama) |
| `model` | Model name to use | Required |
| `base_url` | Base URL for the API | Required |
| `system_prompt` | System message for the model | `None` |
| `temperature` | Sampling temperature (0-2) | `0.7` |
| `max_tokens` | Maximum tokens in response | `None` |
| `timeout` | Request timeout in seconds | `30` |
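The table above can be summarized as a plain dataclass. This mirrors the documented parameters and defaults for illustration only; it is not the library's own configuration class:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ClientConfig:
    # Required fields (api_key may be "" for Ollama)
    api_key: str
    model: str
    base_url: str
    # Optional fields, with the defaults from the table above
    system_prompt: Optional[str] = None
    temperature: float = 0.7
    max_tokens: Optional[int] = None
    timeout: int = 30

cfg = ClientConfig(
    api_key="",
    model="gemma2:2b",
    base_url="http://localhost:11434/v1",
)
print(cfg.temperature, cfg.timeout)  # 0.7 30
```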
## Ollama Setup

1. Install Ollama from ollama.ai
2. Pull a model: `ollama pull gemma2:2b`
3. Start Ollama: `ollama serve`
4. Use with DataPizza AI as shown in the examples above
### Popular Ollama Models

- `gemma2:2b` - Lightweight, fast responses
- `gemma2:7b` - Balanced performance
- `llama3.1:8b` - High quality, more resource intensive
- `codellama:7b` - Specialized for coding tasks
- `mistral:7b` - Good general purpose model
## Error Handling

```python
from datapizza.clients.openai_like import OpenAILikeClient
from datapizza.core.clients.exceptions import ClientError

try:
    client = OpenAILikeClient(
        api_key="",
        model="nonexistent-model",
        base_url="http://localhost:11434/v1",
    )
    response = client.invoke("Hello")
except ClientError as e:
    print(f"Client error: {e}")
except Exception as e:
    print(f"Unexpected error: {e}")
```
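For transient failures (a local Ollama server still starting up, a rate-limited cloud endpoint), a small retry wrapper can complement the exception handling above. This is a generic sketch, not part of the library; `invoke_with_retry` and its parameters are this example's own:

```python
import time

def invoke_with_retry(invoke, prompt, retries=3, delay=1.0):
    """Call invoke(prompt), retrying on any exception with a fixed delay."""
    last_error = None
    for attempt in range(retries):
        try:
            return invoke(prompt)
        except Exception as e:
            last_error = e
            # Sleep between attempts, but not after the final failure
            if attempt < retries - 1:
                time.sleep(delay)
    raise last_error

# Usage with a client:
#   response = invoke_with_retry(client.invoke, "Hello")
```

In production you would typically narrow the `except` clause to `ClientError` (or the specific transient errors you expect) and add exponential backoff.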
## Contributing

Contributions are welcome! Please see our Contributing Guide for details.

## License

This project is licensed under the MIT License - see the LICENSE file for details.