
AI Proxy Core

A minimal Python package providing reusable AI service handlers for Gemini and other LLMs. No web framework dependencies - just the core logic.

Installation

pip install ai-proxy-core

Or install from source:

git clone https://github.com/ebowwa/ai-proxy-core.git
cd ai-proxy-core
pip install -e .

Usage

Completions Handler

from ai_proxy_core import CompletionsHandler

# Initialize handler
handler = CompletionsHandler(api_key="your-gemini-api-key")

# Create completion (await must run inside an async function / event loop)
response = await handler.create_completion(
    messages=[
        {"role": "user", "content": "Hello, how are you?"}
    ],
    model="gemini-1.5-flash",
    temperature=0.7
)

print(response["choices"][0]["message"]["content"])
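The handler returns an OpenAI-style response dict, which is why the reply is read from `choices[0].message.content` above. A minimal sketch of that shape (the field values here are illustrative, not real API output):

```python
# Illustrative OpenAI-style response shape; in practice this dict
# comes back from handler.create_completion().
response = {
    "choices": [
        {
            "message": {
                "role": "assistant",
                "content": "I'm doing well, thanks for asking!",
            }
        }
    ],
    "model": "gemini-1.5-flash",
}

# Extract the assistant's reply the same way as in the example above:
reply = response["choices"][0]["message"]["content"]
print(reply)
```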

Gemini Live Session

from ai_proxy_core import GeminiLiveSession

# Create basic session
session = GeminiLiveSession(api_key="your-gemini-api-key")

# Create session with system prompt (string)
session = GeminiLiveSession(
    api_key="your-gemini-api-key",
    system_instruction="You are a helpful voice assistant. Be concise and friendly."
)

# Create session with system prompt (Content object for more control)
from google.genai import types
session = GeminiLiveSession(
    api_key="your-gemini-api-key",
    system_instruction=types.Content(
        parts=[types.Part.from_text(text="You are a pirate. Speak like a pirate!")],
        role="user"
    )
)

# Set up callbacks
session.on_audio = lambda data: print(f"Received audio: {len(data)} bytes")
session.on_text = lambda text: print(f"Received text: {text}")

# Start session
await session.start()

# Send audio/text
await session.send_audio(audio_data)
await session.send_text("Hello!")

# Stop when done
await session.stop()
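The on_audio callback receives raw bytes for each chunk the model streams back. A sketch of buffering those chunks for later playback or saving (the chunk contents below are made up to show the pattern; wiring is the same as the lambda example above):

```python
# Accumulate audio chunks delivered through the on_audio callback.
audio_buffer = bytearray()

def collect_audio(data: bytes) -> None:
    audio_buffer.extend(data)

# Wire it into a live session with: session.on_audio = collect_audio
# Simulate two chunks arriving from the model:
collect_audio(b"\x00\x01")
collect_audio(b"\x02\x03")
print(f"Buffered {len(audio_buffer)} bytes")  # Buffered 4 bytes
```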

Integration with FastAPI

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
from ai_proxy_core import CompletionsHandler

app = FastAPI()
handler = CompletionsHandler(api_key="your-gemini-api-key")

class CompletionRequest(BaseModel):
    messages: list
    model: str = "gemini-1.5-flash"
    temperature: float = 0.7

@app.post("/api/chat/completions")
async def create_completion(request: CompletionRequest):
    try:
        response = await handler.create_completion(
            messages=request.messages,
            model=request.model,
            temperature=request.temperature
        )
        return response
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))

Features

  • No framework dependencies - Use with FastAPI, Flask, or any Python app
  • Async/await support - Modern async Python
  • Type hints - Full type annotations
  • Minimal surface area - Just the core logic you need
  • Easy testing - Mock the handlers in your tests
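On the last point: because the handler is plain Python with an async create_completion method, a test can substitute a stub with the same interface. The stub class below is hypothetical (not part of the package), just a sketch of the pattern:

```python
import asyncio

class FakeCompletionsHandler:
    """Hypothetical stand-in with the same interface as CompletionsHandler."""

    async def create_completion(self, messages, model="gemini-1.5-flash", temperature=0.7):
        # Return a canned OpenAI-style response instead of calling the API.
        return {
            "choices": [
                {"message": {"role": "assistant", "content": "stubbed reply"}}
            ]
        }

async def chat(handler, text):
    # Application code under test; only depends on the handler interface.
    response = await handler.create_completion(
        messages=[{"role": "user", "content": text}]
    )
    return response["choices"][0]["message"]["content"]

result = asyncio.run(chat(FakeCompletionsHandler(), "Hello!"))
print(result)  # stubbed reply
```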

Development

Building the Package

When building the package for distribution, use setup.py directly instead of python -m build to avoid pip isolation issues:

python setup.py sdist bdist_wheel

This will create both source distribution and wheel files in the dist/ directory.

Publishing to PyPI

twine upload dist/*

License

MIT


