FigureOut - a lightweight and resilient multi-LLM orchestrator
FigureOut is a lightweight, modular orchestrator for developers who want to build LLM workflows without framework bloat. Unlike heavy frameworks that hide logic behind abstractions, FigureOut keeps your code clean, predictable, and easy to debug. It suits both prototyping and production-grade apps where you need full control.
Overview
FigureOut has no hardcoded domain knowledge. You supply your own roles, system prompts, output schemas, and classifier guidelines via RoleDefinition. The package classifies incoming queries and dispatches them to the appropriate role, returning a structured JSON response.
Installation
Install the base package plus the extra for your LLM provider:
```bash
pip install figureout[openai]   # OpenAI
pip install figureout[gemini]   # Google Gemini
pip install figureout[claude]   # Anthropic Claude
pip install figureout[meta]     # Meta (Llama)
pip install figureout[mistral]  # Mistral
pip install figureout[groq]     # Groq
pip install figureout[all]      # All providers
```
Supported LLM Providers
| Provider | Enum | API Key Env Var | Install Extra |
|---|---|---|---|
| OpenAI | LLM.OPENAI | OPENAI_API_KEY | pip install figureout[openai] |
| Google Gemini | LLM.GEMINI | GEMINI_API_KEY | pip install figureout[gemini] |
| Anthropic Claude | LLM.CLAUDE | ANTHROPIC_API_KEY | pip install figureout[claude] |
| Meta (Llama) | LLM.META | META_API_KEY | pip install figureout[meta] |
| Mistral | LLM.MISTRAL | MISTRAL_API_KEY | pip install figureout[mistral] |
| Groq | LLM.GROQ | GROQ_API_KEY | pip install figureout[groq] |
Only install the extra for the provider(s) you intend to use.
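Before constructing the orchestrator, it can help to fail fast when the selected provider's key is missing. A minimal sketch using the env var names from the table above (`PROVIDER_ENV_VARS` and `check_api_key` are hypothetical helpers written for illustration, not part of the figureout API):

```python
import os

# Maps each provider extra to the environment variable it reads (from the table above).
PROVIDER_ENV_VARS = {
    "openai": "OPENAI_API_KEY",
    "gemini": "GEMINI_API_KEY",
    "claude": "ANTHROPIC_API_KEY",
    "meta": "META_API_KEY",
    "mistral": "MISTRAL_API_KEY",
    "groq": "GROQ_API_KEY",
}

def check_api_key(provider: str) -> str:
    """Return the configured API key for `provider`, or raise with a clear message."""
    var = PROVIDER_ENV_VARS[provider]
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f"Set {var} before using the '{provider}' provider.")
    return key
```

Alternatively, a key can be passed explicitly via the `api_key` constructor parameter shown below.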
Quick Start
```python
import asyncio
from figureout import FigureOut, RoleDefinition

concierge = FigureOut(
    llm="openai",
    llm_version="gpt-4o",
    roles={
        "product_search": RoleDefinition(
            prompt="You are a product search specialist. Help users find products.",
            schema='{"results": [{"id": int, "name": str, "price": float}], "summary": str}',
            guideline="queries about finding or searching for products",
        ),
        "off_topic": RoleDefinition(
            prompt="Politely decline and explain you can only help with product searches.",
            schema='{"message": str}',
            guideline="anything unrelated to products",
        ),
    },
)

result = asyncio.run(concierge.run("Find me a blue running shoe under $100"))
print(result)
# {"results": [...], "summary": "..."}
```
API
FigureOut Constructor
```python
FigureOut(
    llm: str | LLM,
    llm_version: str,
    roles: dict[str, RoleDefinition],
    lite_llm_version: str | None = None,
    verbose: bool = False,
    max_roles: int = 1,
    max_output_tokens: int = 16384,
    max_retries: int = 2,
    max_tool_rounds: int = 3,
    interpret_tool_response: bool | None = None,
    cache_enabled: bool = True,
    cache_size: int = 128,
    inject_date: bool = True,
    api_key: str | None = None,
    mcp_server=None,
)
```
All constructor params can also be set via environment variables:
| Param | Env Var |
|---|---|
| llm | FIGUREOUT_LLM |
| llm_version | FIGUREOUT_LLM_VERSION |
| lite_llm_version | FIGUREOUT_LITE_LLM_VERSION |
| verbose | FIGUREOUT_VERBOSE |
| max_roles | FIGUREOUT_MAX_ROLES |
| max_output_tokens | FIGUREOUT_MAX_OUTPUT_TOKENS |
| max_retries | FIGUREOUT_MAX_RETRIES |
| max_tool_rounds | FIGUREOUT_MAX_TOOL_ROUNDS |
| interpret_tool_response | FIGUREOUT_INTERPRET_TOOL_RESPONSE |
| cache_enabled | FIGUREOUT_CACHE_ENABLED |
| cache_size | FIGUREOUT_CACHE_SIZE |
| inject_date | FIGUREOUT_INJECT_DATE |
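For applying the same convention in your own configuration code, the env-var lookup can be sketched roughly as follows (`env_override` is a hypothetical helper; the exact type coercion figureout applies is an assumption, not documented behavior):

```python
import os

def env_override(name: str, default, cast=None):
    """Read FIGUREOUT_<NAME> from the environment, falling back to `default`."""
    raw = os.environ.get(f"FIGUREOUT_{name.upper()}")
    if raw is None:
        return default
    if cast is bool:
        # Assumed truthy spellings; the package's actual parsing may differ.
        return raw.lower() in ("1", "true", "yes")
    return cast(raw) if cast else raw

max_roles = env_override("max_roles", 1, int)
verbose = env_override("verbose", False, bool)
```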
RoleDefinition
```python
RoleDefinition(
    prompt: str,     # System prompt for this role
    schema: str,     # JSON schema string defining the output shape
    guideline: str,  # Short description used by the classifier to select this role
)
```
run() Method
```python
await concierge.run(
    user_query: str,
    role: str | None = None,           # Skip classification, use this role directly
    context: str | None = None,        # Prepended to user_query as additional context
    output_schema: str | None = None,  # Override the role's default schema
)
```
Returns a dict matching the role's output schema.
Multi-Role Queries
Set max_roles > 1 to allow multiple roles to handle a single query in parallel:
```python
concierge = FigureOut(llm="openai", llm_version="gpt-4o", roles=ROLES, max_roles=3)
result = await concierge.run("Find concerts and sports events this weekend")
# {"concert_discovery": {...}, "sports_discovery": {...}}
```
When a single role is selected, the result is returned directly (not wrapped in a role-keyed dict).
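Because the result shape depends on how many roles were selected, callers may want to normalize it before further processing. A sketch, assuming the role names are known up front (`normalize_result` and the `"_single"` sentinel key are hypothetical, not part of figureout):

```python
def normalize_result(result: dict, role_names: set[str]) -> dict:
    """Return a role-keyed dict regardless of how many roles handled the query.

    When several roles respond, `result` is already keyed by role name;
    when a single role answered, its payload is returned directly,
    so we wrap it under a sentinel key.
    """
    if result and set(result) <= role_names:
        return result  # already keyed by role name
    return {"_single": result}
```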
off_topic Role
Define an "off_topic" role to control fallback behavior. The classifier never selects it directly — it is used automatically when no other role matches:
"off_topic": RoleDefinition(
prompt="Politely decline.",
schema='{"message": str}',
guideline="anything unrelated",
)
Tool Use (FastMCP)
Pass a FastMCP server to enable tool calling:
```python
from fastmcp import FastMCP

mcp = FastMCP("my-server")

@mcp.tool()
def search_products(query: str) -> list:
    ...

concierge = FigureOut(llm="gemini", llm_version="gemini-2.0-flash", roles=ROLES, mcp_server=mcp)
```
interpret_tool_response
Controls how tool results are handled after tool execution:
| Value | Behavior |
|---|---|
| None (default) | Normal loop — LLM responds naturally after tool calls |
| True | Bridge messages appended after tool calls + forced JSON mode round; ensures nested fields are populated |
| False | Return raw tool output without sending it back to the LLM |
Use interpret_tool_response=True for roles that return complex nested JSON structures.
Verbose Mode
With verbose=True, the response is wrapped under a "response" key and a "debug" key is added with token usage and timing information:
```python
concierge = FigureOut(..., verbose=True)
result = await concierge.run("Find flights to Tokyo")
# {"response": {"flights": [...]}, "debug": {"input_tokens": 512, "output_tokens": 256, ...}}
```
Examples
The examples/ directory contains complete domain-specific backends with React+Vite frontends:
| Example | Domain |
|---|---|
| event-sports-booking/ | Sports event discovery and booking |
| flight-booking/ | Flight search and booking |
| restaurant-reservation/ | Restaurant discovery and reservations |
| things-to-do/ | Activity and places discovery |
| customer-support-chat/ | FAQ-based customer support chat |
To run an example:
```bash
cd examples/<example-name>/mcp-server
python app.py
```
Development
pip install -e ".[dev]"
pytest
pytest tests/test_figureout.py::test_name # Run a single test