
Project description

llmstitch

A provider-agnostic LLM toolkit with tool calling, skills, and parallel execution.

Stitch together Anthropic, OpenAI, Gemini, Groq, and OpenRouter behind one Agent loop. Define tools with a decorator, compose behaviors as skills, and execute tool calls concurrently — all with a tiny, typed core.

Install

pip install "llmstitch[anthropic]"     # just the Anthropic SDK
pip install "llmstitch[openai]"        # just the OpenAI SDK
pip install "llmstitch[gemini]"        # just the Gemini SDK
pip install "llmstitch[groq]"          # just the Groq SDK
pip install "llmstitch[openrouter]"    # OpenRouter (reuses the openai SDK)
pip install "llmstitch[all]"           # all five

The bare pip install llmstitch has zero runtime dependencies — provider SDKs are opt-in extras.

30-second example

import asyncio
from llmstitch import Agent, tool
from llmstitch.providers.anthropic import AnthropicAdapter

@tool
def get_weather(city: str) -> str:
    """Return a canned weather report for the given city."""
    return f"{city}: 72°F and sunny"

agent = Agent(
    provider=AnthropicAdapter(),
    model="claude-opus-4-7",
    system="You are a helpful weather assistant.",
)
agent.tools.register(get_weather)

messages = asyncio.run(agent.run("What's the weather in Tokyo?"))
print(messages[-1].content)

Features

  • Provider-agnostic — swap AnthropicAdapter for OpenAIAdapter, GeminiAdapter, GroqAdapter, or OpenRouterAdapter without touching your agent code.
  • Typed @tool decorator — JSON Schema generated from type hints (Optional, Literal, defaults, async).
  • Parallel tool execution — when a model returns multiple tool calls in one turn, they run concurrently.
  • StreamingAgent.run_stream() yields provider-neutral events (TextDelta, ToolUseStart / Delta / Stop, MessageStop, terminal StreamDone) and handles tool execution between turns.
  • Skills — bundle a system prompt with a set of tools; compose with .extend().
  • PEP 561 typed — ships with py.typed, fully checked under mypy --strict.
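The schema generation behind the @tool decorator is internal to llmstitch, but the idea is straightforward. The following is a self-contained sketch of how type hints can be mapped to JSON Schema properties — an illustration of the technique, not the library's actual code; `schema_from_signature` is a hypothetical helper:

```python
import inspect
import typing
from typing import Literal, Optional

def schema_from_signature(fn) -> dict:
    """Build a minimal JSON Schema object from a function's type hints.
    Parameters without defaults are marked required."""
    hints = typing.get_type_hints(fn)
    hints.pop("return", None)
    sig = inspect.signature(fn)
    props, required = {}, []
    for name, hint in hints.items():
        origin = typing.get_origin(hint)
        if origin is Literal:
            # Literal["a", "b"] becomes an enum constraint.
            props[name] = {"enum": list(typing.get_args(hint))}
        elif hint is str or (origin is typing.Union and str in typing.get_args(hint)):
            # Covers plain str and Optional[str] (Union[str, None]).
            props[name] = {"type": "string"}
        elif hint is int:
            props[name] = {"type": "integer"}
        else:
            props[name] = {}
        if sig.parameters[name].default is inspect.Parameter.empty:
            required.append(name)
    return {"type": "object", "properties": props, "required": required}

def get_weather(
    city: str,
    units: Literal["metric", "imperial"] = "metric",
    country: Optional[str] = None,
) -> str:
    ...

schema = schema_from_signature(get_weather)
```

Here `city` lands in `required`, `units` becomes an enum, and `country` is an optional string — the same shape a provider's tool-definition API expects.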
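Parallel tool execution is simple to picture with plain asyncio. This standalone sketch (not llmstitch's internals; `run_tool_calls` and `slow_lookup` are illustrative names) dispatches every tool call from one model turn with asyncio.gather, so two 0.2-second tools finish in roughly 0.2 seconds rather than 0.4:

```python
import asyncio
import time

async def run_tool_calls(tools: dict, calls: list[dict]) -> list:
    """Execute all tool calls from a single model turn concurrently."""
    async def one(call: dict):
        fn = tools[call["name"]]
        return await fn(**call["args"])
    return await asyncio.gather(*(one(c) for c in calls))

async def slow_lookup(city: str) -> str:
    await asyncio.sleep(0.2)  # stand-in for a network round trip
    return f"{city}: sunny"

async def main():
    calls = [{"name": "slow_lookup", "args": {"city": c}} for c in ("Tokyo", "Paris")]
    start = time.perf_counter()
    results = await run_tool_calls({"slow_lookup": slow_lookup}, calls)
    elapsed = time.perf_counter() - start
    return results, elapsed

results, elapsed = asyncio.run(main())
```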
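To make the skills idea concrete, here is a minimal sketch of a prompt-plus-tools bundle with an .extend() combinator. llmstitch's actual Skill API may differ — this only illustrates the composition pattern:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Skill:
    """Illustrative only: a system-prompt fragment bundled with tools."""
    system: str
    tools: tuple = ()

    def extend(self, other: "Skill") -> "Skill":
        # Composition concatenates prompts and merges tool sets.
        return Skill(
            system=f"{self.system}\n{other.system}",
            tools=self.tools + other.tools,
        )

def get_weather(city: str) -> str: ...
def convert_units(value: float) -> float: ...

weather = Skill("You can report weather.", (get_weather,))
combined = weather.extend(Skill("You can convert units.", (convert_units,)))
```

The combined skill carries both prompt fragments and both tools, ready to hand to an agent in one piece.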

Streaming example

import asyncio
from llmstitch import Agent, TextDelta, StreamDone
from llmstitch.providers.anthropic import AnthropicAdapter

async def main() -> None:
    agent = Agent(provider=AnthropicAdapter(), model="claude-opus-4-7")
    async for event in agent.run_stream("Tell me a haiku about streams."):
        if isinstance(event, TextDelta):
            print(event.text, end="", flush=True)
        elif isinstance(event, StreamDone):
            print(f"\n[stop_reason={event.response.stop_reason}]")

asyncio.run(main())

More examples

The examples/ directory contains runnable scripts.

Status

Alpha. Retries and MCP support are on the roadmap. See CHANGELOG.md for release history and ARCHITECTURE.md for a walkthrough of how the library is put together.

License

MIT

Download files

Download the file for your platform.

Source Distribution

llmstitch-0.1.2.tar.gz (38.7 kB)


Built Distribution


llmstitch-0.1.2-py3-none-any.whl (19.8 kB)


File details

Details for the file llmstitch-0.1.2.tar.gz.

File metadata

  • Download URL: llmstitch-0.1.2.tar.gz
  • Upload date:
  • Size: 38.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for llmstitch-0.1.2.tar.gz:

  • SHA256: 93ef0ef94ea784ddc231ceb39d993800b041df0193f5b9778037d6e5930f7c55
  • MD5: 6c2eb79fba311ef2e51865e6086cfec1
  • BLAKE2b-256: 848ae3863207840fcabe29a8a69ad669204ed6d5d3249aaecbbbe1fea4c79dff


Provenance

The following attestation bundles were made for llmstitch-0.1.2.tar.gz:

Publisher: release.yml on bengeos/llmstitch

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file llmstitch-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: llmstitch-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 19.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for llmstitch-0.1.2-py3-none-any.whl:

  • SHA256: f354c58c5699026bb205a698ff8fb2e7bab6a8841763088f401055d597715421
  • MD5: 354ff6b9346d7e9b17cf00941ac867cf
  • BLAKE2b-256: 509d4047eaa5ebff286826ddf85588bfdc3790553cd3ae6adac592315c5da75f


Provenance

The following attestation bundles were made for llmstitch-0.1.2-py3-none-any.whl:

Publisher: release.yml on bengeos/llmstitch

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
