
Siili AI SDK

A Python SDK for building AI agents with multi-LLM support, streaming capabilities, and production-ready infrastructure.

Features

  • Multi-LLM Support: OpenAI, Anthropic, Google, Azure models
  • Streaming: Real-time response streaming
  • Tools: Custom tool integration with LangChain
  • Production Ready: FastAPI server with REST/SSE APIs
  • Type Safe: Full type annotations

Quick Start

Simple Agent

from dotenv import load_dotenv
from siili_ai_sdk.agent.base_agent import BaseAgent

class HelloAgent(BaseAgent):
    def __init__(self):
        super().__init__(system_prompt="You're an unhelpful assistant that can't resist constantly talking about cats.")

if __name__ == "__main__":
    load_dotenv()
    agent = HelloAgent()
    print(agent.get_response_text("Hello"))

Agent with Tools + FastAPI Server

from typing import List
import uvicorn
from dotenv import load_dotenv
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

from siili_ai_sdk.agent.base_agent import BaseAgent
from siili_ai_sdk.tools.tool_provider import ToolProvider, tool, BaseTool
from siili_ai_sdk.server.api.routers.api_builder import ApiBuilder
from siili_ai_sdk.models.model_registry import ModelRegistry

load_dotenv()

app = FastAPI(title="Agent SDK Backend Example", version="0.0.1")

app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

class DemoTool(ToolProvider):
    def get_tools(self) -> List[BaseTool]:
        @tool
        def get_secret_greeting() -> str:
            """Returns the users secret greeting."""
            return "kikkelis kokkelis"

        @tool
        def get_user_name() -> str:
            """Returns the users name."""
            return "Seppo Hovi"

        return [get_secret_greeting, get_user_name]

class DemoAgent(BaseAgent):
    def get_tool_providers(self) -> List[ToolProvider]:
        return [DemoTool()]

@app.on_event("startup")
async def startup_event():
    agent = DemoAgent(llm_model=ModelRegistry.CLAUDE_4_SONNET)
    api_builder = ApiBuilder.local(agent=agent)
    thread_router = api_builder.build_thread_router()
    app.include_router(thread_router)

def run_server():
    uvicorn.run("main:app", host="0.0.0.0", port=8000, reload=True, log_level="info")

if __name__ == "__main__":
    run_server()
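
The ToolProvider pattern above (a class whose get_tools() returns decorated callables that an agent dispatches by name) can be illustrated without the SDK. Everything here is a plain-Python sketch — the decorator and registry are stand-ins, not SDK or LangChain API:

```python
from typing import Callable, Dict, List

def tool(fn: Callable) -> Callable:
    """Minimal stand-in for a tool decorator: just marks the function."""
    fn.is_tool = True
    return fn

class SimpleToolProvider:
    """Mirrors the get_tools() pattern from the example above."""
    def get_tools(self) -> List[Callable]:
        @tool
        def get_secret_greeting() -> str:
            return "kikkelis kokkelis"

        @tool
        def get_user_name() -> str:
            return "Seppo Hovi"

        return [get_secret_greeting, get_user_name]

provider = SimpleToolProvider()
# An agent runtime would build a name -> callable map and dispatch on it.
registry: Dict[str, Callable] = {t.__name__: t for t in provider.get_tools()}
print(registry["get_user_name"]())  # Seppo Hovi
```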

Configuration

Set up your environment variables:

# API Keys
ANTHROPIC_API_KEY=your-anthropic-key
OPENAI_API_KEY=your-openai-key  
GOOGLE_API_KEY=your-google-key

# Azure (optional)
AZURE_API_KEY=your-azure-key
AZURE_ENDPOINT=https://your-resource.openai.azure.com/
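
load_dotenv() (from python-dotenv, as used in Quick Start) loads these key=value pairs into the process environment, where the SDK's providers presumably read them via os.environ. A stdlib-only sketch of that lookup, with a fail-fast check for missing keys (require_env is an illustrative helper, not SDK API):

```python
import os

# Simulate what load_dotenv() would do for one key (illustrative value).
os.environ.setdefault("ANTHROPIC_API_KEY", "your-anthropic-key")

def require_env(name: str) -> str:
    """Fetch a required setting, failing fast with a clear error."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

api_key = require_env("ANTHROPIC_API_KEY")
```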

Available Models

from siili_ai_sdk.models.model_registry import ModelRegistry

# Use model registry for easy access
ModelRegistry.CLAUDE_4_SONNET
ModelRegistry.GPT_4_1
ModelRegistry.GEMINI_2_5_FLASH
ModelRegistry.O4_MINI

# Or use aliases
ModelRegistry.from_name("sonnet")      # -> CLAUDE_4_SONNET
ModelRegistry.from_name("gpt 4.1")     # -> GPT_4_1

More Usage Examples

Streaming Response

import asyncio

async def stream_example():
    agent = DemoAgent()
    async for chunk in agent.get_response_stream("Write a story"):
        print(chunk.content, end="", flush=True)

asyncio.run(stream_example())

Structured Response

from pydantic import BaseModel

class Recipe(BaseModel):
    name: str
    ingredients: list[str]
    instructions: list[str]

agent = DemoAgent()
response = agent.get_structured_response("Create a pasta recipe", Recipe)
print(f"Recipe: {response.name}")

Interactive CLI

agent = DemoAgent()
agent.run_cli()  # Starts interactive chat

API Endpoints

When running the FastAPI server:

# Health check
GET /health

# Create thread
POST /threads

# Send message (streaming)
POST /threads/{thread_id}/messages/stream

# Get messages
GET /threads/{thread_id}/messages
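
The streaming message endpoint serves Server-Sent Events, where a client accumulates `data:` lines and treats a blank line as the end of an event. A minimal, SDK-independent parser sketch over raw SSE text (the exact event payload format is not documented here, so this only shows the framing):

```python
from typing import Iterator, List

def parse_sse(lines: List[str]) -> Iterator[str]:
    """Yield one payload per SSE event; events are separated by blank lines."""
    buffer: List[str] = []
    for line in lines:
        if line.startswith("data:"):
            buffer.append(line[len("data:"):].strip())
        elif line == "" and buffer:
            yield "\n".join(buffer)
            buffer = []
    if buffer:  # flush a trailing event with no final blank line
        yield "\n".join(buffer)

stream = ["data: Hel", "", "data: lo!", ""]
print(list(parse_sse(stream)))  # ['Hel', 'lo!']
```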

Development

# Setup
./setup.sh

# Run tests
make test

# Format code
ruff check --fix
ruff format

Project Structure

siili_ai_sdk/
├── agent/          # Core agent implementations
├── llm/            # LangChain service and streaming
├── models/         # LLM providers and configuration
├── thread/         # Conversation management
├── tools/          # Tool system
├── server/         # FastAPI server infrastructure
└── utils/          # Shared utilities

