FigureOut - a lightweight and resilient multi-LLM orchestrator

FigureOut is a lightweight, modular orchestrator for developers who want to build LLM workflows without the framework bloat. Unlike heavy frameworks that hide logic behind abstractions, FigureOut keeps your code clean, predictable, and easy to debug. Perfect for prototyping and production-grade apps where you need full control.

Overview

FigureOut has no hardcoded domain knowledge. You supply your own roles, system prompts, output schemas, and classifier guidelines via RoleDefinition. The package classifies incoming queries and dispatches them to the appropriate role, returning a structured JSON response.

Installation

Install the base package plus the extra for your LLM provider:

```shell
pip install "figureout[openai]"   # OpenAI, Meta (Llama), Mistral, or Groq
pip install "figureout[gemini]"   # Google Gemini
pip install "figureout[claude]"   # Anthropic Claude
pip install "figureout[all]"      # All providers
```

Supported LLM Providers

| Provider | Enum | API Key Env Var | Install Extra |
|---|---|---|---|
| OpenAI | `LLM.OPENAI` | `OPENAI_API_KEY` | `pip install figureout[openai]` |
| Google Gemini | `LLM.GEMINI` | `GEMINI_API_KEY` | `pip install figureout[gemini]` |
| Anthropic Claude | `LLM.CLAUDE` | `ANTHROPIC_API_KEY` | `pip install figureout[claude]` |
| Meta (Llama) | `LLM.META` | `META_API_KEY` | `pip install figureout[openai]` |
| Mistral | `LLM.MISTRAL` | `MISTRAL_API_KEY` | `pip install figureout[openai]` |
| Groq | `LLM.GROQ` | `GROQ_API_KEY` | `pip install figureout[openai]` |

Only install the extra for the provider(s) you intend to use.
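Before constructing the orchestrator, it can help to confirm that the matching API key is actually present in the environment. A minimal sketch — the `required_env_var` and `api_key_is_set` helpers are hypothetical, but the provider-to-variable mapping comes directly from the table above:

```python
import os

# Provider name -> API key environment variable, per the table above.
PROVIDER_ENV_VARS = {
    "openai": "OPENAI_API_KEY",
    "gemini": "GEMINI_API_KEY",
    "claude": "ANTHROPIC_API_KEY",
    "meta": "META_API_KEY",
    "mistral": "MISTRAL_API_KEY",
    "groq": "GROQ_API_KEY",
}

def required_env_var(provider: str) -> str:
    """Return the API key env var a provider expects (hypothetical helper)."""
    return PROVIDER_ENV_VARS[provider.lower()]

def api_key_is_set(provider: str) -> bool:
    """True if the provider's API key environment variable is non-empty."""
    return bool(os.environ.get(required_env_var(provider)))
```

Failing fast on a missing key gives a clearer error than whatever the provider SDK would raise at request time.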

Quick Start

```python
import asyncio
from figureout import FigureOut, RoleDefinition

concierge = FigureOut(
    llm="openai",
    llm_version="gpt-4o",
    roles={
        "product_search": RoleDefinition(
            prompt="You are a product search specialist. Help users find products.",
            schema='{"results": [{"id": int, "name": str, "price": float}], "summary": str}',
            guideline="queries about finding or searching for products",
        ),
        "off_topic": RoleDefinition(
            prompt="Politely decline and explain you can only help with product searches.",
            schema='{"message": str}',
            guideline="anything unrelated to products",
        ),
    },
)

result = asyncio.run(concierge.run("Find me a blue running shoe under $100"))
print(result)
# {"results": [...], "summary": "..."}
```

API

FigureOut Constructor

```python
FigureOut(
    llm: str | LLM,
    llm_version: str,
    roles: dict[str, RoleDefinition],
    lite_llm_version: str | None = None,
    verbose: bool = False,
    max_roles: int = 1,
    max_output_tokens: int = 16384,
    max_retries: int = 2,
    max_tool_rounds: int = 3,
    interpret_tool_response: bool | None = None,
    cache_enabled: bool = True,
    cache_size: int = 128,
    inject_date: bool = True,
    api_key: str | None = None,
    mcp_server=None,
)
```

All constructor params can also be set via environment variables:

| Param | Env Var |
|---|---|
| `llm` | `FIGUREOUT_LLM` |
| `llm_version` | `FIGUREOUT_LLM_VERSION` |
| `lite_llm_version` | `FIGUREOUT_LITE_LLM_VERSION` |
| `verbose` | `FIGUREOUT_VERBOSE` |
| `max_roles` | `FIGUREOUT_MAX_ROLES` |
| `max_output_tokens` | `FIGUREOUT_MAX_OUTPUT_TOKENS` |
| `max_retries` | `FIGUREOUT_MAX_RETRIES` |
| `max_tool_rounds` | `FIGUREOUT_MAX_TOOL_ROUNDS` |
| `interpret_tool_response` | `FIGUREOUT_INTERPRET_TOOL_RESPONSE` |
| `cache_enabled` | `FIGUREOUT_CACHE_ENABLED` |
| `cache_size` | `FIGUREOUT_CACHE_SIZE` |
| `inject_date` | `FIGUREOUT_INJECT_DATE` |
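For example, the Quick Start configuration could be supplied through the environment instead of constructor arguments. The values below are illustrative, and the exact accepted spellings for boolean and integer values are an assumption — check the package's parsing if it matters:

```python
import os

# Illustrative values; per the table above, FigureOut can read these
# in place of the corresponding constructor arguments.
os.environ["FIGUREOUT_LLM"] = "openai"
os.environ["FIGUREOUT_LLM_VERSION"] = "gpt-4o"
os.environ["FIGUREOUT_VERBOSE"] = "true"   # boolean spelling assumed
os.environ["FIGUREOUT_MAX_ROLES"] = "3"    # integers passed as strings
```

This keeps provider and model choices out of source code, which is convenient when the same service runs against different models per deployment.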

RoleDefinition

```python
RoleDefinition(
    prompt: str,      # System prompt for this role
    schema: str,      # JSON schema string defining the output shape
    guideline: str,   # Short description used by the classifier to select this role
)
```

run() Method

```python
await concierge.run(
    user_query: str,
    role: str | None = None,          # Skip classification, use this role directly
    context: str | None = None,       # Prepended to user_query as additional context
    output_schema: str | None = None, # Override the role's default schema
)
```

Returns a dict matching the role's output schema.

Multi-Role Queries

Set max_roles > 1 to allow multiple roles to handle a single query in parallel:

```python
concierge = FigureOut(llm="openai", llm_version="gpt-4o", roles=ROLES, max_roles=3)
result = await concierge.run("Find concerts and sports events this weekend")
# {"concert_discovery": {...}, "sports_discovery": {...}}
```

When a single role is selected, the result is returned directly (not wrapped in a role-keyed dict).

off_topic Role

Define an "off_topic" role to control fallback behavior. The classifier never selects it directly — it is used automatically when no other role matches:

"off_topic": RoleDefinition(
    prompt="Politely decline.",
    schema='{"message": str}',
    guideline="anything unrelated",
)

Tool Use (FastMCP)

Pass a FastMCP server to enable tool calling:

```python
from fastmcp import FastMCP

mcp = FastMCP("my-server")

@mcp.tool()
def search_products(query: str) -> list:
    ...

concierge = FigureOut(llm="gemini", llm_version="gemini-2.0-flash", roles=ROLES, mcp_server=mcp)
```

interpret_tool_response

Controls how tool results are handled after tool execution:

| Value | Behavior |
|---|---|
| `None` (default) | Normal loop — the LLM responds naturally after tool calls |
| `True` | Bridge messages are appended after tool calls, plus a forced JSON-mode round; ensures nested fields are populated |
| `False` | Return raw tool output without sending it back to the LLM |

Use interpret_tool_response=True for roles that return complex nested JSON structures.

Verbose Mode

With verbose=True, the response is wrapped under a "response" key and a "debug" key is added with token usage and timing information:

```python
concierge = FigureOut(..., verbose=True)
result = await concierge.run("Find flights to Tokyo")
# {"response": {"flights": [...]}, "debug": {"input_tokens": 512, "output_tokens": 256, ...}}
```

Examples

The examples/ directory contains complete domain-specific backends with React+Vite frontends:

| Example | Domain |
|---|---|
| `event-sports-booking/` | Sports event discovery and booking |
| `flight-booking/` | Flight search and booking |
| `restaurant-reservation/` | Restaurant discovery and reservations |
| `things-to-do/` | Activity and places discovery |
| `customer-support-chat/` | FAQ-based customer support chat |

To run an example:

```shell
cd examples/<example-name>/mcp-server
python app.py
```

Development

pip install -e ".[dev]"
pytest
pytest tests/test_figureout.py::test_name  # Run a single test
