OpenAI client for the datapizza-ai framework

DataPizza AI - OpenAI-Like Client

A client for the DataPizza AI framework that works with any OpenAI-compatible API, including local models served by Ollama, hosted models on Together AI, and other compatible services.

Installation

pip install datapizza-ai-clients-openai-like

Quick Start

With Ollama (Local Models)

from datapizza.clients.openai_like import OpenAILikeClient

# Create client for Ollama
client = OpenAILikeClient(
    api_key="",  # Ollama doesn't require an API key
    model="gemma2:2b",
    system_prompt="You are a helpful assistant.",
    base_url="http://localhost:11434/v1",
)

response = client.invoke("What is the capital of France?")
print(response.content)

With Together AI

import os
from datapizza.clients.openai_like import OpenAILikeClient

client = OpenAILikeClient(
    api_key=os.getenv("TOGETHER_API_KEY"),
    model="meta-llama/Llama-2-7b-chat-hf",
    system_prompt="You are a helpful assistant.",
    base_url="https://api.together.xyz/v1",
)

response = client.invoke("Explain quantum computing")
print(response.content)

With Other OpenAI-Compatible Services

import os
from datapizza.clients.openai_like import OpenAILikeClient

client = OpenAILikeClient(
    api_key=os.getenv("YOUR_API_KEY"),
    model="your-model-name",
    system_prompt="You are a helpful assistant.",
    base_url="https://your-service-url/v1",
)

response = client.invoke("Your question here")
print(response.content)

Features

  • OpenAI-Compatible: Works with any service that implements the OpenAI API standard
  • Local Models: Perfect for running with Ollama for privacy and cost control
  • Memory Support: Built-in memory adapter for conversation history
  • Streaming: Support for real-time streaming responses
  • Structured Outputs: Generate structured data with Pydantic models
  • Tool Calling: Function calling capabilities where supported
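Tool calling on OpenAI-compatible endpoints is driven by JSON-schema function descriptions sent alongside the request. A minimal sketch of such a schema is below; the `get_weather` tool and its fields are hypothetical, and how you hand the schema to the client (e.g. a `tools=` argument) depends on the datapizza-ai API and on whether the backend model supports tool calling at all.

```python
# OpenAI-style function tool definition: a plain dict following the
# '"type": "function"' shape that OpenAI-compatible endpoints expect.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical example tool
        "description": "Return the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["city"],
        },
    },
}
```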

Supported Services

  • Ollama - Local model inference
  • Together AI - Cloud-based model hosting
  • Perplexity AI - Search-augmented models
  • Groq - Fast inference API
  • Any OpenAI-compatible API
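Switching between these services is mostly a matter of swapping `base_url` and `api_key`. The Ollama and Together AI endpoints below come from the examples in this README; the Groq and Perplexity URLs are taken from their public documentation and may change, so verify them before use.

```python
# Base URLs for some OpenAI-compatible endpoints.
OPENAI_COMPATIBLE_ENDPOINTS = {
    "ollama": "http://localhost:11434/v1",      # local inference
    "together": "https://api.together.xyz/v1",  # hosted models
    "groq": "https://api.groq.com/openai/v1",   # fast inference
    "perplexity": "https://api.perplexity.ai",  # search-augmented models
}

def base_url_for(service: str) -> str:
    """Look up the base URL for a known service name."""
    return OPENAI_COMPATIBLE_ENDPOINTS[service]
```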

Advanced Usage

With Memory

from datapizza.clients.openai_like import OpenAILikeClient
from datapizza.memory import Memory

client = OpenAILikeClient(
    api_key="",
    model="llama3.1:8b",
    base_url="http://localhost:11434/v1",
)

memory = Memory(client=client)
memory.add("I'm working on a Python project about machine learning.")
response = memory.query("What libraries should I use?")

Streaming Responses

client = OpenAILikeClient(
    api_key="",
    model="gemma2:7b",
    base_url="http://localhost:11434/v1",
)

for chunk in client.stream("Tell me a story about AI"):
    print(chunk.content, end="", flush=True)

Structured Outputs

from pydantic import BaseModel
from datapizza.clients.openai_like import OpenAILikeClient

class Person(BaseModel):
    name: str
    age: int
    occupation: str

client = OpenAILikeClient(
    api_key="",
    model="llama3.1:8b",
    base_url="http://localhost:11434/v1",
)

response = client.invoke(
    "Generate a person profile",
    response_format=Person
)
print(response.parsed)  # Person object
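Structured output on OpenAI-compatible endpoints is typically driven by the model's JSON schema. With Pydantic v2 you can inspect the schema the `Person` model above produces, which is what a structured-output request would usually be built from (whether datapizza-ai forwards exactly this schema is an assumption):

```python
from pydantic import BaseModel

class Person(BaseModel):
    name: str
    age: int
    occupation: str

# Pydantic v2: the JSON schema a structured-output request is built from.
schema = Person.model_json_schema()
print(schema["required"])  # → ['name', 'age', 'occupation']
```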

Configuration Options

Parameter       Description                     Default
api_key         API key for the service         Required (empty string for Ollama)
model           Model name to use               Required
base_url        Base URL for the API            Required
system_prompt   System message for the model    None
temperature     Sampling temperature (0-2)      0.7
max_tokens      Maximum tokens in response      None
timeout         Request timeout in seconds      30

Ollama Setup

  1. Install Ollama from ollama.ai
  2. Pull a model: ollama pull gemma2:2b
  3. Start Ollama: ollama serve
  4. Use with DataPizza AI as shown in the examples above

Popular Ollama Models

  • gemma2:2b - Lightweight, fast responses
  • gemma2:7b - Balanced performance
  • llama3.1:8b - High quality, more resource intensive
  • codellama:7b - Specialized for coding tasks
  • mistral:7b - Good general purpose model

Error Handling

from datapizza.clients.openai_like import OpenAILikeClient
from datapizza.core.clients.exceptions import ClientError

try:
    client = OpenAILikeClient(
        api_key="",
        model="nonexistent-model",
        base_url="http://localhost:11434/v1",
    )
    response = client.invoke("Hello")
except ClientError as e:
    print(f"Client error: {e}")
except Exception as e:
    print(f"Unexpected error: {e}")
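Local servers in particular may not be up yet when your code first runs. A small generic retry helper (plain Python, independent of datapizza-ai) can wrap calls such as `client.invoke`; the helper name and its defaults are illustrative, not part of the library:

```python
import time

def with_retries(fn, attempts=3, delay=0.5):
    """Call fn(); on failure, retry up to `attempts` times with a fixed
    delay between tries, re-raising the last error if all tries fail."""
    last_exc = None
    for _ in range(attempts):
        try:
            return fn()
        except Exception as exc:  # narrow to ClientError in real code
            last_exc = exc
            time.sleep(delay)
    raise last_exc
```

Usage would look like `with_retries(lambda: client.invoke("Hello"))`.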

Contributing

Contributions are welcome! Please see our Contributing Guide for details.

License

This project is licensed under the MIT License - see the LICENSE file for details.
