FigureOut - a lightweight and resilient multi-LLM orchestrator


FigureOut is a lightweight, modular orchestrator for developers who want to build LLM workflows without the framework bloat. Unlike heavy frameworks that hide logic behind abstractions, FigureOut keeps your code clean, predictable, and easy to debug. It suits both quick prototypes and production-grade apps where you need full control.

Overview

FigureOut has no hardcoded domain knowledge. You supply your own roles, system prompts, output schemas, and classifier guidelines via RoleDefinition. The package classifies incoming queries and dispatches them to the appropriate role, returning a structured JSON response.

Installation

Install the base package plus the extra for your LLM provider:

```bash
pip install figureout[openai]   # OpenAI
pip install figureout[gemini]   # Google Gemini
pip install figureout[claude]   # Anthropic Claude
pip install figureout[meta]     # Meta (Llama)
pip install figureout[mistral]  # Mistral
pip install figureout[groq]     # Groq
pip install figureout[all]      # All providers
```

Supported LLM Providers

| Provider | Enum | API Key Env Var | Install Extra |
|---|---|---|---|
| OpenAI | LLM.OPENAI | OPENAI_API_KEY | pip install figureout[openai] |
| Google Gemini | LLM.GEMINI | GEMINI_API_KEY | pip install figureout[gemini] |
| Anthropic Claude | LLM.CLAUDE | ANTHROPIC_API_KEY | pip install figureout[claude] |
| Meta (Llama) | LLM.META | META_API_KEY | pip install figureout[meta] |
| Mistral | LLM.MISTRAL | MISTRAL_API_KEY | pip install figureout[mistral] |
| Groq | LLM.GROQ | GROQ_API_KEY | pip install figureout[groq] |

Only install the extra for the provider(s) you intend to use.
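A simple pre-flight check can catch a missing API key before you construct the orchestrator. This is a sketch built from the table above; the `API_KEY_ENV` mapping and `check_api_key` helper are illustrative, not part of the package:

```python
import os

# Provider-string -> API key env var, taken from the table above.
API_KEY_ENV = {
    "openai": "OPENAI_API_KEY",
    "gemini": "GEMINI_API_KEY",
    "claude": "ANTHROPIC_API_KEY",
    "meta": "META_API_KEY",
    "mistral": "MISTRAL_API_KEY",
    "groq": "GROQ_API_KEY",
}

def check_api_key(provider: str) -> bool:
    """Return True if the API key env var for this provider is set and non-empty."""
    return bool(os.environ.get(API_KEY_ENV[provider]))
```

Call `check_api_key("openai")` at startup and fail fast with a clear message instead of surfacing an authentication error mid-request.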

Quick Start

```python
import asyncio
from figureout import FigureOut, RoleDefinition

concierge = FigureOut(
    llm="openai",
    llm_version="gpt-4o",
    roles={
        "product_search": RoleDefinition(
            prompt="You are a product search specialist. Help users find products.",
            schema='{"results": [{"id": int, "name": str, "price": float}], "summary": str}',
            guideline="queries about finding or searching for products",
        ),
        "off_topic": RoleDefinition(
            prompt="Politely decline and explain you can only help with product searches.",
            schema='{"message": str}',
            guideline="anything unrelated to products",
        ),
    },
)

result = asyncio.run(concierge.run("Find me a blue running shoe under $100"))
print(result)
# {"results": [...], "summary": "..."}
```

API

FigureOut Constructor

```python
FigureOut(
    llm: str | LLM,
    llm_version: str,
    roles: dict[str, RoleDefinition],
    lite_llm_version: str | None = None,
    verbose: bool = False,
    max_roles: int = 1,
    max_output_tokens: int = 16384,
    max_retries: int = 2,
    max_tool_rounds: int = 3,
    interpret_tool_response: bool | None = None,
    cache_enabled: bool = True,
    cache_size: int = 128,
    inject_date: bool = True,
    api_key: str | None = None,
    mcp_server=None,
)
```

All constructor params can also be set via environment variables:

| Param | Env Var |
|---|---|
| llm | FIGUREOUT_LLM |
| llm_version | FIGUREOUT_LLM_VERSION |
| lite_llm_version | FIGUREOUT_LITE_LLM_VERSION |
| verbose | FIGUREOUT_VERBOSE |
| max_roles | FIGUREOUT_MAX_ROLES |
| max_output_tokens | FIGUREOUT_MAX_OUTPUT_TOKENS |
| max_retries | FIGUREOUT_MAX_RETRIES |
| max_tool_rounds | FIGUREOUT_MAX_TOOL_ROUNDS |
| interpret_tool_response | FIGUREOUT_INTERPRET_TOOL_RESPONSE |
| cache_enabled | FIGUREOUT_CACHE_ENABLED |
| cache_size | FIGUREOUT_CACHE_SIZE |
| inject_date | FIGUREOUT_INJECT_DATE |
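For example, the Quick Start configuration could be supplied through the environment instead of constructor arguments. This sketch assumes that an omitted constructor argument falls back to the matching FIGUREOUT_* variable from the table above:

```python
import os

# Assumption: FigureOut reads these when the corresponding constructor
# argument is omitted (per the param/env-var table above).
os.environ["FIGUREOUT_LLM"] = "openai"
os.environ["FIGUREOUT_LLM_VERSION"] = "gpt-4o"
os.environ["FIGUREOUT_MAX_ROLES"] = "2"

# concierge = FigureOut(roles=ROLES)  # llm, llm_version, max_roles picked up from env
```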

RoleDefinition

```python
RoleDefinition(
    prompt: str,      # System prompt for this role
    schema: str,      # JSON schema string defining the output shape
    guideline: str,   # Short description used by the classifier to select this role
)
```

run() Method

```python
await concierge.run(
    user_query: str,
    role: str | None = None,          # Skip classification, use this role directly
    context: str | None = None,       # Prepended to user_query as additional context
    output_schema: str | None = None, # Override the role's default schema
)
```

Returns a dict matching the role's output schema.
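The three optional parameters can be combined per call. A hypothetical usage sketch, reusing the `concierge` and role names from the Quick Start (the override schema below is invented for illustration):

```python
import asyncio

async def ask(concierge):
    # Force a specific role, skipping classification:
    forced = await concierge.run("blue running shoes", role="product_search")
    # Prepend extra context to the query:
    followed_up = await concierge.run(
        "anything cheaper?",
        context="User previously asked for blue running shoes under $100.",
    )
    # Override the role's default output schema for this call only
    # (hypothetical schema, not from the Quick Start):
    terse = await concierge.run(
        "blue running shoes",
        output_schema='{"top_pick": str, "price": float}',
    )
    return forced, followed_up, terse
```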

Multi-Role Queries

Set max_roles > 1 to allow multiple roles to handle a single query in parallel:

```python
concierge = FigureOut(llm="openai", llm_version="gpt-4o", roles=ROLES, max_roles=3)
result = await concierge.run("Find concerts and sports events this weekend")
# {"concert_discovery": {...}, "sports_discovery": {...}}
```

When a single role is selected, the result is returned directly (not wrapped in a role-keyed dict).
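Because the result shape depends on how many roles fired, downstream code may want a single shape. A minimal sketch; the heuristic that top-level keys matching your role names signal a multi-role result is an assumption of this helper, not documented library behavior:

```python
def normalize_result(result: dict, role_names: set[str]) -> dict:
    """Always return a {key: payload} mapping, whichever shape run() produced.

    Assumption: in a multi-role result every top-level key is a role name;
    a single-role result is the payload itself, so we wrap it under a
    fixed placeholder key.
    """
    if result and set(result) <= role_names:
        return result  # already role-keyed (multi-role case)
    return {"single": result}  # single-role payload, wrapped
```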

off_topic Role

Define an "off_topic" role to control fallback behavior. The classifier never selects it directly; it is applied automatically when no other role matches:

```python
"off_topic": RoleDefinition(
    prompt="Politely decline.",
    schema='{"message": str}',
    guideline="anything unrelated",
)
```

Tool Use (FastMCP)

Pass a FastMCP server to enable tool calling:

```python
from fastmcp import FastMCP

mcp = FastMCP("my-server")

@mcp.tool()
def search_products(query: str) -> list:
    ...

concierge = FigureOut(llm="gemini", llm_version="gemini-2.0-flash", roles=ROLES, mcp_server=mcp)
```

interpret_tool_response

Controls how tool results are handled after execution:

| Value | Behavior |
|---|---|
| None (default) | Normal loop: the LLM responds naturally after tool calls |
| True | Bridge messages are appended after tool calls, plus a forced JSON-mode round; ensures nested fields are populated |
| False | Return the raw tool output without sending it back to the LLM |

Use interpret_tool_response=True for roles that return complex nested JSON structures.

Verbose Mode

With verbose=True, the response is wrapped under a "response" key and a "debug" key is added with token usage and timing information:

```python
concierge = FigureOut(..., verbose=True)
result = await concierge.run("Find flights to Tokyo")
# {"response": {"flights": [...]}, "debug": {"input_tokens": 512, "output_tokens": 256, ...}}
```
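A small helper can keep downstream code independent of the verbose setting. A sketch assuming exactly the two wrapper keys shown above:

```python
def unwrap(result: dict) -> tuple[dict, dict]:
    """Split a run() result into (payload, debug).

    Assumption: a verbose result carries exactly a "response" key plus a
    "debug" key, as in the example above; a non-verbose result is the
    payload itself, so it gets an empty debug dict.
    """
    if set(result) == {"response", "debug"}:
        return result["response"], result["debug"]
    return result, {}
```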

Examples

The examples/ directory contains complete domain-specific backends with React+Vite frontends:

| Example | Domain |
|---|---|
| event-sports-booking/ | Sports event discovery and booking |
| flight-booking/ | Flight search and booking |
| restaurant-reservation/ | Restaurant discovery and reservations |
| things-to-do/ | Activity and places discovery |
| customer-support-chat/ | FAQ-based customer support chat |

To run an example:

```bash
cd examples/<example-name>/mcp-server
python app.py
```

Development

```bash
pip install -e ".[dev]"
pytest
pytest tests/test_figureout.py::test_name  # Run a single test
```
