
Project description

Siili AI SDK

A Python SDK for building AI agents with multi-LLM support, streaming capabilities, and production-ready infrastructure.

Features

  • Multi-LLM Support: OpenAI, Anthropic, Google, Azure models
  • Streaming: Real-time response streaming
  • Tools: Custom tool integration with LangChain
  • Production Ready: FastAPI server with REST/SSE APIs
  • Type Safe: Full type annotations

Quick Start

Simple Agent

from dotenv import load_dotenv
from siili_ai_sdk.agent.base_agent import BaseAgent

class HelloAgent(BaseAgent):
    def __init__(self):
        super().__init__(system_prompt="You're an unhelpful assistant that can't resist constantly talking about cats.")

if __name__ == "__main__":
    load_dotenv()
    agent = HelloAgent()
    print(agent.get_response_text("Hello"))

Agent with Tools + FastAPI Server

from typing import List
import uvicorn
from dotenv import load_dotenv
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

from siili_ai_sdk.agent.base_agent import BaseAgent
from siili_ai_sdk.tools.tool_provider import ToolProvider, tool, BaseTool
from siili_ai_sdk.server.api.routers.api_builder import ApiBuilder
from siili_ai_sdk.models.model_registry import ModelRegistry

load_dotenv()

app = FastAPI(title="Agent SDK Backend Example", version="0.0.1")

app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

class DemoTool(ToolProvider):
    def get_tools(self) -> List[BaseTool]:
        @tool
        def get_secret_greeting() -> str:
            """Returns the users secret greeting."""
            return "kikkelis kokkelis"

        @tool
        def get_user_name() -> str:
            """Returns the users name."""
            return "Seppo Hovi"

        return [get_secret_greeting, get_user_name]

class DemoAgent(BaseAgent):
    def get_tool_providers(self) -> List[ToolProvider]:
        return [DemoTool()]

@app.on_event("startup")
async def startup_event():
    agent = DemoAgent(llm_model=ModelRegistry.CLAUDE_4_SONNET)
    api_builder = ApiBuilder.local(agent=agent)
    thread_router = api_builder.build_thread_router()
    app.include_router(thread_router)

def run_server():
    uvicorn.run("main:app", host="0.0.0.0", port=8000, reload=True, log_level="info")

if __name__ == "__main__":
    run_server()

Configuration

Set up your environment variables:

# API Keys
ANTHROPIC_API_KEY=your-anthropic-key
OPENAI_API_KEY=your-openai-key  
GOOGLE_API_KEY=your-google-key

# Azure (optional)
AZURE_API_KEY=your-azure-key
AZURE_ENDPOINT=https://your-resource.openai.azure.com/
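
After `load_dotenv()`, it can be useful to fail fast when a required key is missing. A minimal, illustrative helper using only the standard library — the variable names come from the block above; the helper itself is not part of the SDK:

```python
import os

# Names taken from the configuration block above; the Azure keys are optional.
REQUIRED_KEYS = ["ANTHROPIC_API_KEY", "OPENAI_API_KEY", "GOOGLE_API_KEY"]

def missing_keys(required=REQUIRED_KEYS):
    """Return the names of required environment variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]
```

Call `missing_keys()` at startup and raise or log if the returned list is non-empty.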

Available Models

from siili_ai_sdk.models.model_registry import ModelRegistry

# Use model registry for easy access
ModelRegistry.CLAUDE_4_SONNET
ModelRegistry.GPT_4_1
ModelRegistry.GEMINI_2_5_FLASH
ModelRegistry.O4_MINI

# Or use aliases
ModelRegistry.from_name("sonnet")      # -> CLAUDE_4_SONNET
ModelRegistry.from_name("gpt 4.1")     # -> GPT_4_1

More Usage Examples

Streaming Response

async def stream_example():
    agent = DemoAgent()
    async for chunk in agent.get_response_stream("Write a story"):
        print(chunk.content, end="", flush=True)
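
The consumption pattern above can be exercised without the SDK by substituting a stand-in async generator. The `Chunk` dataclass below mirrors the `chunk.content` access in the example; the real SDK's chunk objects may differ:

```python
import asyncio
from dataclasses import dataclass

@dataclass
class Chunk:
    """Stand-in for the SDK's streamed chunk type (assumption: has .content)."""
    content: str

async def fake_stream(text: str):
    """Yield a response word by word, simulating get_response_stream()."""
    for word in text.split():
        await asyncio.sleep(0)  # yield control, as a network stream would
        yield Chunk(content=word + " ")

async def collect(prompt: str) -> str:
    """Consume the stream with `async for`, as in the example above."""
    parts = []
    async for chunk in fake_stream(prompt):
        parts.append(chunk.content)
    return "".join(parts)

print(asyncio.run(collect("Write a story")))  # → "Write a story "
```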

Structured Response

from pydantic import BaseModel

class Recipe(BaseModel):
    name: str
    ingredients: list[str]
    instructions: list[str]

response = agent.get_structured_response("Create a pasta recipe", Recipe)
print(f"Recipe: {response.name}")

Interactive CLI

agent = DemoAgent()
agent.run_cli()  # Starts interactive chat

API Endpoints

When running the FastAPI server:

# Health check
GET /health

# Create thread
POST /threads

# Send message (streaming)
POST /threads/{thread_id}/messages/stream

# Get messages
GET /threads/{thread_id}/messages
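
Assuming the streaming endpoint emits Server-Sent Events (standard `data:` framing — the payload shape below is an assumption, not documented SDK behavior), a generic client-side helper for extracting JSON payloads looks like this:

```python
import json

def iter_sse_data(lines):
    """Yield parsed JSON payloads from the 'data:' lines of an SSE stream.

    `lines` is any iterable of decoded text lines, e.g. from
    requests.post(..., stream=True).iter_lines(decode_unicode=True).
    """
    for line in lines:
        if line.startswith("data:"):
            payload = line[len("data:"):].strip()
            if payload:
                yield json.loads(payload)

# Example over a canned stream (payload keys are hypothetical):
raw = ['data: {"content": "Hello"}', "", 'data: {"content": " world"}']
print([evt["content"] for evt in iter_sse_data(raw)])  # → ['Hello', ' world']
```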

Development

# Setup
./setup.sh

# Run tests
make test

# Format code
ruff check --fix
ruff format

Project Structure

siili_ai_sdk/
├── agent/          # Core agent implementations
├── llm/            # LangChain service and streaming
├── models/         # LLM providers and configuration
├── thread/         # Conversation management
├── tools/          # Tool system
├── server/         # FastAPI server infrastructure
└── utils/          # Shared utilities

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

siili_ai_sdk-0.4.0.tar.gz (450.5 kB)


Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

siili_ai_sdk-0.4.0-py3-none-any.whl (103.6 kB)


File details

Details for the file siili_ai_sdk-0.4.0.tar.gz.

File metadata

  • Download URL: siili_ai_sdk-0.4.0.tar.gz
  • Upload date:
  • Size: 450.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.21 {"installer":{"name":"uv","version":"0.9.21","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for siili_ai_sdk-0.4.0.tar.gz
  • SHA256: 62e84d330255ca027f5bd63f52368b6ba913036646b025fe98007d9970f29d93
  • MD5: f42f98c296da833e0f3bacf2bc20b622
  • BLAKE2b-256: d7fb03dcaaca993ef0d6c0653fb46abcafcd4c37f6a4b173a382e15eec9dddd0

See more details on using hashes here.
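
To verify a downloaded file against the digests listed above, compute its SHA256 locally with the standard library and compare — this is generic `hashlib` usage, not SDK-specific:

```python
import hashlib

def sha256_hex(path: str, chunk_size: int = 8192) -> str:
    """Compute the SHA256 hex digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            h.update(block)
    return h.hexdigest()

# Compare against the SHA256 value from the table above, e.g.:
# assert sha256_hex("siili_ai_sdk-0.4.0.tar.gz") == "62e84d33..."
```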

File details

Details for the file siili_ai_sdk-0.4.0-py3-none-any.whl.

File metadata

  • Download URL: siili_ai_sdk-0.4.0-py3-none-any.whl
  • Upload date:
  • Size: 103.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.21 {"installer":{"name":"uv","version":"0.9.21","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for siili_ai_sdk-0.4.0-py3-none-any.whl
  • SHA256: 1c8a9361476d25c3eac8ddb6d10eb2e6167264fa1f551e58d9b2700a486a0dfe
  • MD5: 071036441ae74dafb6c70e34ea8ed49f
  • BLAKE2b-256: 708d860e47e63c611d6505e331e6a3c352e9df94d09ad3b64a662a88ccfa0170

See more details on using hashes here.
