
Minimal, reusable AI service handlers for Gemini and other LLMs

Project description

AI Proxy Core

A minimal Python package providing reusable AI service handlers for Gemini and other LLMs. No web framework dependencies - just the core logic.

Installation

pip install ai-proxy-core

Or install from source:

git clone https://github.com/ebowwa/ai-proxy-core.git
cd ai-proxy-core
pip install -e .

Usage

Completions Handler

from ai_proxy_core import CompletionsHandler

# Initialize handler
handler = CompletionsHandler(api_key="your-gemini-api-key")

# Create a completion (call from inside an async function)
response = await handler.create_completion(
    messages=[
        {"role": "user", "content": "Hello, how are you?"}
    ],
    model="gemini-1.5-flash",
    temperature=0.7
)

print(response["choices"][0]["message"]["content"])

Gemini Live Session

from ai_proxy_core import GeminiLiveSession

# Create session
session = GeminiLiveSession(api_key="your-gemini-api-key")

# Set up callbacks
session.on_audio = lambda data: print(f"Received audio: {len(data)} bytes")
session.on_text = lambda text: print(f"Received text: {text}")

# Start session
await session.start()

# Send audio/text (audio_data: raw audio bytes you have captured)
await session.send_audio(audio_data)
await session.send_text("Hello!")

# Stop when done
await session.stop()
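The callback wiring above can be exercised without a live connection by substituting a stub with the same surface. The `FakeLiveSession` class below is hypothetical (not part of ai-proxy-core) and simply echoes sent text back through `on_text`:

```python
import asyncio

class FakeLiveSession:
    """Hypothetical stand-in mimicking the GeminiLiveSession callback
    surface shown above; it is not part of ai-proxy-core."""

    def __init__(self):
        self.on_audio = None
        self.on_text = None

    async def start(self):
        pass  # a real session would open the streaming connection here

    async def send_text(self, text):
        # A real session would stream the model's reply back through the
        # callbacks; this stub simply echoes the input.
        if self.on_text:
            self.on_text(f"echo: {text}")

    async def stop(self):
        pass  # a real session would close the connection here

async def main():
    received = []
    session = FakeLiveSession()
    session.on_text = received.append
    await session.start()
    await session.send_text("Hello!")
    await session.stop()
    return received

print(asyncio.run(main()))  # ['echo: Hello!']
```

The same pattern (assign callbacks, then `start`/`send`/`stop`) applies to the real session; only the transport differs.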

Integration with FastAPI

import os

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
from ai_proxy_core import CompletionsHandler

app = FastAPI()
handler = CompletionsHandler(api_key=os.environ["GEMINI_API_KEY"])

class CompletionRequest(BaseModel):
    messages: list
    model: str = "gemini-1.5-flash"
    temperature: float = 0.7

@app.post("/api/chat/completions")
async def create_completion(request: CompletionRequest):
    try:
        response = await handler.create_completion(
            messages=request.messages,
            model=request.model,
            temperature=request.temperature
        )
        return response
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))

Features

  • No framework dependencies - Use with FastAPI, Flask, or any Python app
  • Async/await support - Modern async Python
  • Type hints - Full type annotations
  • Minimal surface area - Just the core logic you need
  • Easy testing - Mock the handlers in your tests
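The "easy testing" point can be sketched with the standard library alone: because the handler is a plain object with an async method, a test can swap in an `unittest.mock.AsyncMock` returning a canned response in the same shape as the completion example above. The `chat` coroutine below is a hypothetical piece of app code, not part of the package:

```python
import asyncio
from unittest.mock import AsyncMock

# Hypothetical app code under test: any coroutine that takes a handler.
async def chat(handler, prompt):
    response = await handler.create_completion(
        messages=[{"role": "user", "content": prompt}],
        model="gemini-1.5-flash",
    )
    return response["choices"][0]["message"]["content"]

# In a test, replace the real CompletionsHandler with an AsyncMock that
# returns a canned response matching the shape shown earlier.
fake_handler = AsyncMock()
fake_handler.create_completion.return_value = {
    "choices": [{"message": {"role": "assistant", "content": "mocked reply"}}]
}

print(asyncio.run(chat(fake_handler, "Hello")))  # mocked reply
```

No network call is made, and the assertion only depends on the response shape the handler contract promises.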

License

MIT


Download files

Download the file for your platform.

Source Distribution

ai_proxy_core-0.1.1.tar.gz (6.7 kB, source)

Built Distribution

ai_proxy_core-0.1.1-py3-none-any.whl (7.4 kB, Python 3)

File details

Details for the file ai_proxy_core-0.1.1.tar.gz.

File metadata

  • Download URL: ai_proxy_core-0.1.1.tar.gz
  • Size: 6.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.0

File hashes

Hashes for ai_proxy_core-0.1.1.tar.gz:

  • SHA256: 6877dba0bc7139a161026eb81aa6daa9f29089bcfd6225a3550566ff0722a4a8
  • MD5: 022f12b6a70d0e203095ff90fcbc3a94
  • BLAKE2b-256: 340035332137b12e4e7aa580fa20269fc57c368f01b42c25e24aed2df6463b79
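A downloaded file can be checked against the published SHA256 digest with the standard-library `hashlib` module; the helper below is a generic sketch, not something shipped with the package:

```python
import hashlib

def sha256_of(path):
    """Hex SHA256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the result against the digest listed above, e.g.:
# sha256_of("ai_proxy_core-0.1.1.tar.gz") should equal the SHA256 value.
```

Reading in fixed-size chunks keeps memory use constant regardless of file size.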

File details

Details for the file ai_proxy_core-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: ai_proxy_core-0.1.1-py3-none-any.whl
  • Size: 7.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.0

File hashes

Hashes for ai_proxy_core-0.1.1-py3-none-any.whl:

  • SHA256: db058bfdad1b0c235be5c347e2ca28989c2d7bb8eca2b3236011d85fcc8fd2ff
  • MD5: 26fa08427df838bea773607b05596704
  • BLAKE2b-256: 2a30d257647f757c554aa80ce27b6d6222ba90a39013b89c8d03419eee0ae86a
