Minimal, reusable AI service handlers for Gemini and other LLMs

AI Proxy Core

A minimal Python package providing reusable AI service handlers for Gemini and other LLMs. No web framework dependencies - just the core logic.

Installation

pip install ai-proxy-core

Or install from source:

git clone https://github.com/ebowwa/ai-proxy-core.git
cd ai-proxy-core
pip install -e .

Usage

Completions Handler

from ai_proxy_core import CompletionsHandler

# Initialize handler
handler = CompletionsHandler(api_key="your-gemini-api-key")

# Create a completion (async - call inside an event loop, e.g. via asyncio.run)
response = await handler.create_completion(
    messages=[
        {"role": "user", "content": "Hello, how are you?"}
    ],
    model="gemini-1.5-flash",
    temperature=0.7
)

print(response["choices"][0]["message"]["content"])

Gemini Live Session

from ai_proxy_core import GeminiLiveSession

# Example 1: Basic session (no system prompt)
session = GeminiLiveSession(api_key="your-gemini-api-key")

# Example 2: Session with system prompt (simple string format)
session = GeminiLiveSession(
    api_key="your-gemini-api-key",
    system_instruction="You are a helpful voice assistant. Be concise and friendly."
)

# Example 3: Session with system prompt (Content object for more control)
from google.genai import types
session = GeminiLiveSession(
    api_key="your-gemini-api-key",
    system_instruction=types.Content(
        parts=[types.Part.from_text(text="You are a pirate. Speak like a pirate!")],
        role="user"
    )
)

# Set up callbacks
session.on_audio = lambda data: print(f"Received audio: {len(data)} bytes")
session.on_text = lambda text: print(f"Received text: {text}")  # Text transcription of audio

# Start session
await session.start()

# Note: When using audio responses, Gemini automatically provides both:
# - Audio data (PCM16 format) via on_audio callback
# - Text transcription of the audio via on_text callback

# Send audio/text
await session.send_audio(audio_data)
await session.send_text("Hello!")

# Stop when done
await session.stop()
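Since the on_audio callback delivers raw PCM16 bytes, a small helper can decode them into integer samples for inspection or further processing. This is an illustrative sketch, not part of the package; it assumes 16-bit little-endian signed samples (check the Gemini Live audio format docs to confirm the byte order for your setup).

```python
import struct

def pcm16_to_samples(data: bytes) -> list:
    """Decode raw PCM16 bytes into signed 16-bit integer samples.

    Assumes little-endian byte order; a trailing odd byte, if any, is ignored.
    """
    count = len(data) // 2
    return list(struct.unpack(f"<{count}h", data[: count * 2]))

# Three samples: zero, the maximum positive value, the maximum negative value.
samples = pcm16_to_samples(b"\x00\x00\xff\x7f\x00\x80")
print(samples)  # [0, 32767, -32768]
```

A decoder like this could be wired into the callback, e.g. session.on_audio = lambda data: process(pcm16_to_samples(data)).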

Integration with FastAPI

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
from ai_proxy_core import CompletionsHandler

app = FastAPI()
handler = CompletionsHandler()

class CompletionRequest(BaseModel):
    messages: list
    model: str = "gemini-1.5-flash"
    temperature: float = 0.7

@app.post("/api/chat/completions")
async def create_completion(request: CompletionRequest):
    try:
        response = await handler.create_completion(
            messages=request.messages,
            model=request.model,
            temperature=request.temperature
        )
        return response
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))

Features

  • No framework dependencies - Use with FastAPI, Flask, or any Python app
  • Async/await support - Modern async Python
  • Type hints - Full type annotations
  • Minimal surface area - Just the core logic you need
  • Easy testing - Mock the handlers in your tests
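The "easy testing" point can be sketched with the standard library's unittest.mock: because application code depends only on the handler's async create_completion method, the handler can be replaced with an AsyncMock. The ask helper below is a hypothetical example, and the response shape mirrors the completions example earlier in this README.

```python
import asyncio
from unittest.mock import AsyncMock

# Stand-in for CompletionsHandler: create_completion becomes an awaitable mock.
fake_handler = AsyncMock()
fake_handler.create_completion.return_value = {
    "choices": [{"message": {"role": "assistant", "content": "mocked reply"}}]
}

async def ask(handler, prompt: str) -> str:
    # Hypothetical application helper under test - it only touches the
    # handler interface, so it never needs a real API key in tests.
    response = await handler.create_completion(
        messages=[{"role": "user", "content": prompt}],
        model="gemini-1.5-flash",
    )
    return response["choices"][0]["message"]["content"]

result = asyncio.run(ask(fake_handler, "Hello"))
print(result)  # mocked reply
```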

Development

Building the Package

When building the package for distribution, use setup.py directly instead of python -m build to avoid pip isolation issues:

python setup.py sdist bdist_wheel

This will create both source distribution and wheel files in the dist/ directory.

Publishing to PyPI

twine upload dist/*

License

MIT

Project details


Download files

Download the file for your platform.

Source Distribution

ai_proxy_core-0.1.7.tar.gz (8.1 kB)

Built Distribution

ai_proxy_core-0.1.7-py3-none-any.whl (8.4 kB)

File details

Details for the file ai_proxy_core-0.1.7.tar.gz.

File metadata

  • Download URL: ai_proxy_core-0.1.7.tar.gz
  • Size: 8.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.0

File hashes

Hashes for ai_proxy_core-0.1.7.tar.gz

  • SHA256: a46684c956c11b5004a117fa5728b17de9f8c5289d7ba61435b4e1d1634a9c4d
  • MD5: 44c9c7bb46825c296b9bb21350aa045f
  • BLAKE2b-256: b42b9bdfc97fd921dc79bfc84bc5b3de853d1faffe06fc397b931873d213952f

File details

Details for the file ai_proxy_core-0.1.7-py3-none-any.whl.

File metadata

  • Download URL: ai_proxy_core-0.1.7-py3-none-any.whl
  • Size: 8.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.0

File hashes

Hashes for ai_proxy_core-0.1.7-py3-none-any.whl

  • SHA256: c79c7cc0b4af9ddee60c808054ae02fa65419964616c48c5ca950690975cdbe9
  • MD5: dd0f15165d33a6b774f55f95cb25cbb8
  • BLAKE2b-256: c52acde7a7a9c21ed40604078ba4542dfff904bd9f5fc928a61c3f56513d03d6
