# AI Proxy Core

A minimal Python package providing reusable AI service handlers for Gemini and other LLMs. No web framework dependencies - just the core logic.
## Installation

```bash
pip install ai-proxy-core
```

Or install from source:

```bash
git clone https://github.com/ebowwa/ai-proxy-core.git
cd ai-proxy-core
pip install -e .
```
## Usage

### Completions Handler

```python
from ai_proxy_core import CompletionsHandler

# Initialize the handler with your Gemini API key
handler = CompletionsHandler(api_key="your-gemini-api-key")

# Create a completion (call from within an async function)
response = await handler.create_completion(
    messages=[
        {"role": "user", "content": "Hello, how are you?"}
    ],
    model="gemini-1.5-flash",
    temperature=0.7
)

print(response["choices"][0]["message"]["content"])
```
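The response follows the OpenAI-style completion shape shown above. As a minimal sketch, the reply text can be pulled out of such a dict with a small helper (the `extract_reply` function and the sample dict are illustrative; only the `choices[0].message.content` path is taken from the example above):

```python
def extract_reply(response: dict) -> str:
    """Pull the assistant's text out of an OpenAI-style completion dict.

    Returns an empty string if the response has no choices.
    """
    choices = response.get("choices", [])
    if not choices:
        return ""
    return choices[0].get("message", {}).get("content", "")


# Example with a response-shaped dict:
sample = {"choices": [{"message": {"role": "assistant", "content": "Hi there!"}}]}
print(extract_reply(sample))  # Hi there!
```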
### Gemini Live Session

```python
from ai_proxy_core import GeminiLiveSession

# Create a session
session = GeminiLiveSession(api_key="your-gemini-api-key")

# Set up callbacks
session.on_audio = lambda data: print(f"Received audio: {len(data)} bytes")
session.on_text = lambda text: print(f"Received text: {text}")

# Start the session
await session.start()

# Send audio or text
await session.send_audio(audio_data)
await session.send_text("Hello!")

# Stop when done
await session.stop()
```
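The callbacks above are plain callables, so they can capture state instead of just printing. A sketch of buffering incoming audio chunks (the `AudioBuffer` class is hypothetical; only the `on_audio(data: bytes)` callback shape is taken from the example above):

```python
class AudioBuffer:
    """Collect raw audio chunks delivered via an on_audio-style callback."""

    def __init__(self) -> None:
        self.chunks: list[bytes] = []

    def on_audio(self, data: bytes) -> None:
        self.chunks.append(data)

    def combined(self) -> bytes:
        return b"".join(self.chunks)


buffer = AudioBuffer()
# session.on_audio = buffer.on_audio  # wire into a real GeminiLiveSession

# Simulate two incoming chunks:
buffer.on_audio(b"\x00\x01")
buffer.on_audio(b"\x02\x03")
print(len(buffer.combined()))  # 4
```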
## Integration with FastAPI

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

from ai_proxy_core import CompletionsHandler

app = FastAPI()
handler = CompletionsHandler()


class CompletionRequest(BaseModel):
    messages: list
    model: str = "gemini-1.5-flash"
    temperature: float = 0.7


@app.post("/api/chat/completions")
async def create_completion(request: CompletionRequest):
    try:
        response = await handler.create_completion(
            messages=request.messages,
            model=request.model,
            temperature=request.temperature
        )
        return response
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))
```
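A request body matching the `CompletionRequest` model above can then be posted to `/api/chat/completions`. A sketch of building that payload (the field names and defaults come from the model; the host and port in the comment are assumptions for a local run):

```python
import json

payload = {
    "messages": [{"role": "user", "content": "Hello, how are you?"}],
    "model": "gemini-1.5-flash",  # optional, this is the default
    "temperature": 0.7,           # optional, this is the default
}

body = json.dumps(payload)
# POST `body` with Content-Type: application/json, e.g.:
#   curl -X POST http://localhost:8000/api/chat/completions \
#        -H "Content-Type: application/json" -d @payload.json
print(json.loads(body)["model"])  # gemini-1.5-flash
```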
## Features

- **No framework dependencies** - Use with FastAPI, Flask, or any Python app
- **Async/await support** - Modern async Python
- **Type hints** - Full type annotations
- **Minimal surface area** - Just the core logic you need
- **Easy testing** - Mock the handlers in your tests
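Because the handler is a plain object with an async method, it is easy to stub out in tests. A sketch using `unittest.mock.AsyncMock` in place of a real `CompletionsHandler` (the mock only mimics the `create_completion` call shown earlier; no API key or network access is needed):

```python
import asyncio
from unittest.mock import AsyncMock

# Stand-in for CompletionsHandler with a canned response
handler = AsyncMock()
handler.create_completion.return_value = {
    "choices": [{"message": {"role": "assistant", "content": "mocked reply"}}]
}


async def main() -> str:
    # Same call shape as the real handler
    response = await handler.create_completion(
        messages=[{"role": "user", "content": "Hello"}],
        model="gemini-1.5-flash",
    )
    return response["choices"][0]["message"]["content"]


print(asyncio.run(main()))  # mocked reply
```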
## License

MIT
## Project details
### Download files
**Source Distribution**

ai_proxy_core-0.1.2.tar.gz (6.7 kB)

**Built Distribution**

ai_proxy_core-0.1.2-py3-none-any.whl (7.4 kB)
### File details: ai_proxy_core-0.1.2.tar.gz

- Size: 6.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.10.0

| Algorithm | Hash digest |
|---|---|
| SHA256 | df367a6f4aa9354c4691cda2bfc99d46a47ccbca7d8902d4e6d346da62628d0f |
| MD5 | 203746de89b676a350242f14c25eef32 |
| BLAKE2b-256 | 3fd5b91bf93d295a7d1cc57a489b0af6f40b34a48c5fbb49986ac2f270c211c5 |
### File details: ai_proxy_core-0.1.2-py3-none-any.whl

- Size: 7.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.10.0

| Algorithm | Hash digest |
|---|---|
| SHA256 | 40a5bca0ead0a1e6154a9a49d7c0db6551bb26d8587c2d66d47736f6fae901b5 |
| MD5 | 0af1d7030525cde70b84fc34a4b8cda0 |
| BLAKE2b-256 | b5d8e3fa8d5c48569247ab99bd9b470f563fa03687a98f37be0a421f3c122adb |