
Siili AI SDK

A Python SDK for building AI agents with multi-LLM support, streaming capabilities, and production-ready infrastructure.

Features

  • Multi-LLM Support: OpenAI, Anthropic, Google, Azure models
  • Streaming: Real-time response streaming
  • Tools: Custom tool integration with LangChain
  • Production Ready: FastAPI server with REST/SSE APIs
  • Type Safe: Full type annotations

Quick Start

Simple Agent

from dotenv import load_dotenv
from siili_ai_sdk.agent.base_agent import BaseAgent

class HelloAgent(BaseAgent):
    def __init__(self):
        super().__init__(system_prompt="You're an unhelpful assistant that can't resist constantly talking about cats.")

if __name__ == "__main__":
    load_dotenv()
    agent = HelloAgent()
    print(agent.get_response_text("Hello"))

Agent with Tools + FastAPI Server

from typing import List
import uvicorn
from dotenv import load_dotenv
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

from siili_ai_sdk.agent.base_agent import BaseAgent
from siili_ai_sdk.tools.tool_provider import ToolProvider, tool, BaseTool
from siili_ai_sdk.server.api.routers.api_builder import ApiBuilder
from siili_ai_sdk.models.model_registry import ModelRegistry

load_dotenv()

app = FastAPI(title="Agent SDK Backend Example", version="0.0.1")

app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

class DemoTool(ToolProvider):
    def get_tools(self) -> List[BaseTool]:
        @tool
        def get_secret_greeting() -> str:
            """Returns the users secret greeting."""
            return "kikkelis kokkelis"

        @tool
        def get_user_name() -> str:
            """Returns the users name."""
            return "Seppo Hovi"

        return [get_secret_greeting, get_user_name]

class DemoAgent(BaseAgent):
    def get_tool_providers(self) -> List[ToolProvider]:
        return [DemoTool()]

@app.on_event("startup")
async def startup_event():
    agent = DemoAgent(llm_model=ModelRegistry.CLAUDE_4_SONNET)
    api_builder = ApiBuilder.local(agent=agent)
    thread_router = api_builder.build_thread_router()
    app.include_router(thread_router)

def run_server():
    uvicorn.run("main:app", host="0.0.0.0", port=8000, reload=True, log_level="info")

if __name__ == "__main__":
    run_server()
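The `@tool` decorator used above (re-exported from LangChain) turns a plain function into something the agent can expose to the model: the function name becomes the tool name and the docstring becomes its description. A simplified, hypothetical sketch of that idea — not the SDK's actual implementation, which also derives an argument schema from type hints:

```python
# Simplified, hypothetical sketch of what a @tool-style decorator records.
def tool(fn):
    fn.name = fn.__name__
    fn.description = (fn.__doc__ or "").strip()
    return fn

@tool
def get_user_name() -> str:
    """Returns the user's name."""
    return "Seppo Hovi"

print(get_user_name.name)         # the tool name shown to the model
print(get_user_name.description)  # the docstring becomes the tool description
```

This is why the docstrings in `DemoTool` matter: they are the only description the model sees when deciding whether to call a tool.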

Configuration

Set up your environment variables:

# API Keys
ANTHROPIC_API_KEY=your-anthropic-key
OPENAI_API_KEY=your-openai-key  
GOOGLE_API_KEY=your-google-key

# Azure (optional)
AZURE_API_KEY=your-azure-key
AZURE_ENDPOINT=https://your-resource.openai.azure.com/
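`load_dotenv()` in the examples above reads these variables from a `.env` file into the process environment, where the SDK can look them up. A quick sanity check (key name from the list above; the value is a placeholder):

```python
import os

# Placeholder value for illustration; in practice load_dotenv() sets this
# from your .env file.
os.environ.setdefault("ANTHROPIC_API_KEY", "your-anthropic-key")

key = os.getenv("ANTHROPIC_API_KEY")
assert key is not None, "missing ANTHROPIC_API_KEY"
print("Anthropic key configured:", key[:4] + "...")
```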

Available Models

from siili_ai_sdk.models.model_registry import ModelRegistry

# Use model registry for easy access
ModelRegistry.CLAUDE_4_SONNET
ModelRegistry.GPT_4_1
ModelRegistry.GEMINI_2_5_FLASH
ModelRegistry.O4_MINI

# Or use aliases
ModelRegistry.from_name("sonnet")      # -> CLAUDE_4_SONNET
ModelRegistry.from_name("gpt 4.1")     # -> GPT_4_1

More Usage Examples

Streaming Response

async def stream_example():
    agent = DemoAgent()
    async for chunk in agent.get_response_stream("Write a story"):
        print(chunk.content, end="", flush=True)
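`get_response_stream` returns an async iterator, so `stream_example` must run inside an event loop. A self-contained sketch of the driving code, with a stub generator standing in for the agent:

```python
import asyncio

async def fake_stream(text: str):
    # Stub standing in for agent.get_response_stream(...): yields chunks.
    for word in text.split():
        yield word + " "

async def main() -> str:
    parts = []
    async for chunk in fake_stream("Once upon a time"):
        parts.append(chunk)
    return "".join(parts).strip()

print(asyncio.run(main()))  # Once upon a time
```

With the real SDK, replace `fake_stream(...)` with `agent.get_response_stream("Write a story")` and print `chunk.content` as shown above.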

Structured Response

from pydantic import BaseModel

class Recipe(BaseModel):
    name: str
    ingredients: list[str]
    instructions: list[str]

response = agent.get_structured_response("Create a pasta recipe", Recipe)
print(f"Recipe: {response.name}")

Interactive CLI

agent = DemoAgent()
agent.run_cli()  # Starts interactive chat

API Endpoints

When running the FastAPI server:

# Health check
GET /health

# Create thread
POST /threads

# Send message (streaming)
POST /threads/{thread_id}/messages/stream

# Get messages
GET /threads/{thread_id}/messages
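The streaming endpoint uses Server-Sent Events, where each event arrives as a `data: ...` line terminated by a blank line. A client-side parsing sketch — the JSON payload shape below is an assumption for illustration, not documented above:

```python
import json

def parse_sse(raw: str) -> list:
    """Extract JSON payloads from a decoded SSE stream."""
    events = []
    for block in raw.strip().split("\n\n"):
        for line in block.splitlines():
            if line.startswith("data: "):
                events.append(json.loads(line[len("data: "):]))
    return events

# Example stream with an assumed {"content": ...} payload shape.
raw = 'data: {"content": "Hello"}\n\ndata: {"content": " world"}\n\n'
chunks = parse_sse(raw)
print("".join(c["content"] for c in chunks))  # Hello world
```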

Development

# Setup
./setup.sh

# Run tests
make test

# Format code
ruff check --fix
ruff format

Project Structure

siili_ai_sdk/
├── agent/          # Core agent implementations
├── llm/            # LangChain service and streaming
├── models/         # LLM providers and configuration
├── thread/         # Conversation management
├── tools/          # Tool system
├── server/         # FastAPI server infrastructure
└── utils/          # Shared utilities
