# Siili AI SDK
A Python SDK for building AI agents with multi-LLM support, streaming capabilities, and production-ready infrastructure.
## Features

- **Multi-LLM Support**: OpenAI, Anthropic, Google, and Azure models
- **Streaming**: Real-time response streaming
- **Tools**: Custom tool integration via LangChain
- **Production Ready**: FastAPI server with REST/SSE APIs
- **Type Safe**: Full type annotations
## Quick Start

### Simple Agent
```python
from dotenv import load_dotenv

from siili_ai_sdk.agent.base_agent import BaseAgent


class HelloAgent(BaseAgent):
    def __init__(self):
        super().__init__(
            system_prompt="You're an unhelpful assistant that can't resist constantly talking about cats."
        )


if __name__ == "__main__":
    load_dotenv()
    agent = HelloAgent()
    print(agent.get_response_text("Hello"))
```
### Agent with Tools + FastAPI Server
```python
from typing import List

import uvicorn
from dotenv import load_dotenv
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

from siili_ai_sdk.agent.base_agent import BaseAgent
from siili_ai_sdk.tools.tool_provider import ToolProvider, tool, BaseTool
from siili_ai_sdk.server.api.routers.api_builder import ApiBuilder
from siili_ai_sdk.models.model_registry import ModelRegistry

load_dotenv()

app = FastAPI(title="Agent SDK Backend Example", version="0.0.1")
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)


class DemoTool(ToolProvider):
    def get_tools(self) -> List[BaseTool]:
        @tool
        def get_secret_greeting() -> str:
            """Returns the user's secret greeting."""
            return "kikkelis kokkelis"

        @tool
        def get_user_name() -> str:
            """Returns the user's name."""
            return "Seppo Hovi"

        return [get_secret_greeting, get_user_name]


class DemoAgent(BaseAgent):
    def get_tool_providers(self) -> List[ToolProvider]:
        return [DemoTool()]


@app.on_event("startup")
async def startup_event():
    agent = DemoAgent(llm_model=ModelRegistry.CLAUDE_4_SONNET)
    api_builder = ApiBuilder.local(agent=agent)
    thread_router = api_builder.build_thread_router()
    app.include_router(thread_router)


def run_server():
    uvicorn.run("main:app", host="0.0.0.0", port=8000, reload=True, log_level="info")


if __name__ == "__main__":
    run_server()
```
## Configuration

Set up your environment variables:

```bash
# API Keys
ANTHROPIC_API_KEY=your-anthropic-key
OPENAI_API_KEY=your-openai-key
GOOGLE_API_KEY=your-google-key

# Azure (optional)
AZURE_API_KEY=your-azure-key
AZURE_ENDPOINT=https://your-resource.openai.azure.com/
```
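To fail fast when a key is missing, you can check the environment at startup. The helper below is a minimal sketch and not part of the SDK; the key names match the variables above:

```python
import os

# Keys required by the providers you plan to use (adjust to your setup).
REQUIRED_KEYS = ["ANTHROPIC_API_KEY", "OPENAI_API_KEY", "GOOGLE_API_KEY"]


def missing_api_keys(required: list[str] = REQUIRED_KEYS) -> list[str]:
    """Return the names of keys that are unset or empty in the environment."""
    return [key for key in required if not os.environ.get(key)]
```

Call it after `load_dotenv()` and abort (or log a warning) if the returned list is non-empty.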
## Available Models

```python
from siili_ai_sdk.models.model_registry import ModelRegistry

# Use the model registry for easy access
ModelRegistry.CLAUDE_4_SONNET
ModelRegistry.GPT_4_1
ModelRegistry.GEMINI_2_5_FLASH
ModelRegistry.O4_MINI

# Or use aliases
ModelRegistry.from_name("sonnet")   # -> CLAUDE_4_SONNET
ModelRegistry.from_name("gpt 4.1")  # -> GPT_4_1
```
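As an illustration of how alias lookup like `from_name` can work, here is a standalone sketch. This is hypothetical code, not the SDK's implementation, and the alias table below is an assumption based on the examples above:

```python
# Hypothetical alias table; the SDK's actual mapping may differ.
MODEL_ALIASES = {
    "sonnet": "CLAUDE_4_SONNET",
    "gpt 4.1": "GPT_4_1",
    "gemini flash": "GEMINI_2_5_FLASH",
}


def resolve_alias(name: str) -> str:
    """Normalize a user-supplied model name and look it up in the alias table."""
    key = name.strip().lower()
    try:
        return MODEL_ALIASES[key]
    except KeyError:
        raise ValueError(f"Unknown model alias: {name!r}") from None
```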
## More Usage Examples

### Streaming Response

```python
import asyncio


async def stream_example():
    agent = DemoAgent()
    async for chunk in agent.get_response_stream("Write a story"):
        print(chunk.content, end="", flush=True)


asyncio.run(stream_example())
```
### Structured Response

```python
from pydantic import BaseModel


class Recipe(BaseModel):
    name: str
    ingredients: list[str]
    instructions: list[str]


response = agent.get_structured_response("Create a pasta recipe", Recipe)
print(f"Recipe: {response.name}")
```
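Under the hood, structured responses are plain Pydantic validation: the model's JSON output must satisfy the schema you pass in. The standalone snippet below re-declares the same `Recipe` model so it runs without the SDK, and assumes Pydantic v2 (`model_validate_json`); the JSON string is a made-up example:

```python
from pydantic import BaseModel, ValidationError


class Recipe(BaseModel):
    name: str
    ingredients: list[str]
    instructions: list[str]


# Valid output parses into a typed object.
raw = '{"name": "Carbonara", "ingredients": ["pasta", "eggs"], "instructions": ["boil", "mix"]}'
recipe = Recipe.model_validate_json(raw)

# Output missing required fields raises ValidationError.
try:
    Recipe.model_validate_json('{"name": "incomplete"}')
except ValidationError:
    pass
```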
### Interactive CLI

```python
agent = DemoAgent()
agent.run_cli()  # Starts an interactive chat session
```
## API Endpoints

When running the FastAPI server:

```
# Health check
GET /health

# Create thread
POST /threads

# Send message (streaming)
POST /threads/{thread_id}/messages/stream

# Get messages
GET /threads/{thread_id}/messages
```
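With the server from the example above running on port 8000, the endpoints can be exercised with `curl`. The request body shape for the streaming endpoint is an assumption; check the generated OpenAPI docs at `http://localhost:8000/docs` for the exact schema:

```shell
# Health check
curl http://localhost:8000/health

# Create a thread (returns a thread id)
curl -X POST http://localhost:8000/threads

# Stream a reply; -N disables buffering so SSE chunks print as they arrive.
# Replace <thread_id> with the id returned above; the JSON body is assumed.
curl -N -X POST http://localhost:8000/threads/<thread_id>/messages/stream \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello"}'
```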
## Development

```bash
# Setup
./setup.sh

# Run tests
make test

# Format code
ruff check --fix
ruff format
```
## Project Structure

```
siili_ai_sdk/
├── agent/   # Core agent implementations
├── llm/     # LangChain service and streaming
├── models/  # LLM providers and configuration
├── thread/  # Conversation management
├── tools/   # Tool system
├── server/  # FastAPI server infrastructure
└── utils/   # Shared utilities
```