
FastAIAgent SDK

Build, debug, evaluate, and operate AI agents. The only SDK with Agent Replay — fork-and-rerun debugging for AI agents.

Works standalone or connected to the FastAIAgent Platform for visual editing, production monitoring, and team collaboration.



Debug a failing agent in 30 seconds

from fastaiagent.trace import Replay

# Load a trace from a production failure
replay = Replay.load("trace_abc123")

# Step through to find the problem
replay.step_through()
# Step 3: LLM hallucinated the refund policy ← found it

# Fork at the failing step, fix, rerun
forked = replay.fork_at(step=3)
forked.modify_prompt("Always cite the exact policy section...")
result = forked.rerun()

No other SDK can do this.
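To make the fork-at-step idea concrete, here is a minimal plain-Python sketch of the underlying data model. The `Step` fields and the in-memory `Trace` are illustrative assumptions, not the SDK's real internals:

```python
from dataclasses import dataclass

@dataclass
class Step:
    index: int
    prompt: str
    output: str

class Trace:
    """Illustrative in-memory trace: an ordered list of recorded steps."""
    def __init__(self, steps):
        self.steps = list(steps)

    def fork_at(self, step_index):
        # A fork keeps every step *before* the failing one;
        # everything from step_index onward gets re-executed.
        return Trace(self.steps[:step_index])

    def append(self, step):
        self.steps.append(step)

trace = Trace([
    Step(0, "classify intent", "refund_request"),
    Step(1, "look up order", "order #1234, shipped late"),
    Step(2, "draft reply", "Per policy, refunds take 90 days"),  # hallucinated
])

forked = trace.fork_at(2)  # drop the bad step and everything after it
forked.append(Step(2, "draft reply (cite the exact policy section)",
                   "Per policy 4.2, refunds take 14 days"))
```

The key property: the original trace is untouched, so you can fork the same failure repeatedly with different fixes and compare reruns.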

Evaluate agents systematically

from fastaiagent.eval import evaluate

results = evaluate(
    agent_fn=my_agent.run,
    dataset="test_cases.jsonl",
    scorers=["correctness", "relevance"]
)
print(results.summary())
# correctness: 92% | relevance: 88%
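The dataset-plus-scorers pattern is easy to picture with a hand-rolled version. This is a minimal sketch, not the SDK's `evaluate` — the JSONL record shape (`input`/`expected` keys) and the exact-match scorer are assumptions:

```python
def correctness(output, expected):
    # Exact-match scorer; real scorers are usually fuzzier (LLM-judged, embedding-based).
    return 1.0 if output.strip() == expected.strip() else 0.0

def evaluate(agent_fn, cases, scorers):
    totals = {name: 0.0 for name in scorers}
    for case in cases:
        output = agent_fn(case["input"])
        for name, scorer in scorers.items():
            totals[name] += scorer(output, case["expected"])
    return {name: total / len(cases) for name, total in totals.items()}

# Each line of a test_cases.jsonl file would decode to a dict like these:
cases = [
    {"input": "2+2", "expected": "4"},
    {"input": "capital of France", "expected": "Paris"},
]
results = evaluate(lambda q: {"2+2": "4", "capital of France": "Lyon"}[q],
                   cases, {"correctness": correctness})
print(results)  # {'correctness': 0.5}
```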

Trace any agent — yours or LangChain/CrewAI

import fastaiagent
fastaiagent.integrations.langchain.enable()

# Your existing LangChain agent, now with full tracing
result = langchain_agent.invoke({"input": "..."})
# → Traces stored locally or pushed to FastAIAgent Platform
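What a locally stored trace amounts to can be sketched with a simple decorator. This is illustrative only — the real SDK's trace schema and storage are not shown here:

```python
import functools
import time

TRACE = []  # the SDK would persist this locally or push it to the platform

def traced(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        TRACE.append({
            "name": fn.__name__,
            "args": args,
            "result": result,
            "duration_s": time.perf_counter() - start,
        })
        return result
    return wrapper

@traced
def invoke(query):
    # stand-in for an agent call
    return f"answer to {query!r}"

invoke("where is my order?")
```

Enabling an integration like `langchain` conceptually wraps the framework's entry points with this kind of recorder, which is why no changes to your agent code are needed.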

Build agents with guardrails and cyclic workflows

from fastaiagent import Agent, Chain, LLMClient
from fastaiagent.guardrail import no_pii, json_valid

agent = Agent(
    name="support-bot",
    system_prompt="You are a helpful support agent...",
    llm=LLMClient(provider="openai", model="gpt-4o"),
    tools=[search_tool, refund_tool],
    guardrails=[no_pii(), json_valid()]
)
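A guardrail like `no_pii` is, conceptually, a check run over model output before it reaches the user. Here is a minimal sketch with regex checks — the patterns and the `GuardrailViolation` name are illustrative, and the SDK's `no_pii()` factory is collapsed to a plain function for brevity:

```python
import re

class GuardrailViolation(Exception):
    pass

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def no_pii(text):
    # Reject output containing obvious PII before it reaches the user.
    if EMAIL.search(text) or SSN.search(text):
        raise GuardrailViolation("output contains PII")
    return text

no_pii("Your refund is on the way.")  # passes through unchanged
try:
    no_pii("Contact jane@example.com")
except GuardrailViolation:
    print("blocked")
```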

# Chains with loops (retry until quality is good enough)
# SupportState: the chain's shared state model, defined elsewhere
chain = Chain("support-pipeline", state_schema=SupportState)
chain.add_node("research", agent=researcher)
chain.add_node("evaluate", agent=evaluator)
chain.add_node("respond", agent=responder)
chain.connect("research", "evaluate")
chain.connect("evaluate", "research", max_iterations=3, exit_condition="quality >= 0.8")
chain.connect("evaluate", "respond", condition="quality >= 0.8")

result = chain.execute({"message": "My order is late"}, trace=True)
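The evaluate→research back-edge with `max_iterations` and `exit_condition` is just a bounded retry loop. A plain-Python sketch of the semantics (the quality scores here are made up to show the loop exiting on the third pass):

```python
def run_pipeline(message, max_iterations=3, threshold=0.8):
    qualities = iter([0.5, 0.7, 0.9])  # stand-in for the evaluator agent's scores
    draft, quality = None, 0.0
    for attempt in range(1, max_iterations + 1):
        draft = f"draft {attempt} for {message!r}"  # "research" node
        quality = next(qualities)                   # "evaluate" node
        if quality >= threshold:                    # exit_condition met → "respond"
            break
    return draft, quality, attempt

draft, quality, attempts = run_pipeline("My order is late")
print(attempts, quality)  # 3 0.9
```

If the threshold is never reached, `max_iterations` caps the loop so the chain still terminates and hands its best draft to the next node.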

Multi-agent teams with context

from fastaiagent import Agent, LLMClient, RunContext, Supervisor, TextDelta, Worker, tool

@tool(name="get_tickets")
def get_tickets(ctx: RunContext[AppState], status: str) -> str:
    """Get support tickets for the current user."""
    return ctx.state.db.query("tickets", user_id=ctx.state.user_id, status=status)

# `llm` is an LLMClient as above; `get_billing` is another @tool like get_tickets
support = Agent(name="support", llm=llm, tools=[get_tickets], system_prompt="Handle tickets.")
billing = Agent(name="billing", llm=llm, tools=[get_billing], system_prompt="Handle billing.")

supervisor = Supervisor(
    name="customer-service",
    llm=LLMClient(provider="openai", model="gpt-4o"),
    workers=[
        Worker(agent=support, role="support", description="Manages tickets"),
        Worker(agent=billing, role="billing", description="Handles billing"),
    ],
    system_prompt=lambda ctx: f"You lead support for {ctx.state.company}. Be helpful.",
)

# Context flows to all workers and their tools
ctx = RunContext(state=AppState(db=db, user_id="u-1", company="Acme"))
result = supervisor.run("Show my open tickets and billing", context=ctx)

# Stream the supervisor's response
async for event in supervisor.astream("Help me", context=ctx):
    if isinstance(event, TextDelta):
        print(event.text, end="")
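Streaming consumption follows the standard async-iterator pattern. A self-contained sketch with a mock event stream — the `TextDelta` shape and the token boundaries here are assumptions:

```python
import asyncio
from dataclasses import dataclass

@dataclass
class TextDelta:
    text: str

async def astream(message):
    # Stand-in for supervisor.astream(): yields text deltas as they arrive.
    for token in ["Hello", ", ", "how can I help?"]:
        await asyncio.sleep(0)  # simulate network latency
        yield TextDelta(token)

async def main():
    chunks = []
    async for event in astream("Help me"):
        if isinstance(event, TextDelta):
            chunks.append(event.text)
    return "".join(chunks)

reply = asyncio.run(main())
print(reply)  # Hello, how can I help?
```

The `isinstance` filter matters because a real stream typically interleaves other event types (tool calls, worker handoffs) with text deltas.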

Connect to FastAIAgent Platform (optional)

from fastaiagent import FastAI

fa = FastAI(api_key="sk-...", project="customer-support")

# Push your agent to the platform — see it in the visual editor
fa.push(chain)

# Traces appear in the platform dashboard
# Prompts sync between code and platform
# Eval results visible in the platform

The SDK works standalone. The Platform adds: a visual chain editor, production monitoring, advanced KB intelligence, prompt optimization, team collaboration, and human-in-the-loop (HITL) approval workflows.

Free tier available →


Install

pip install fastaiagent

With optional integrations:

pip install "fastaiagent[openai]"       # OpenAI auto-tracing
pip install "fastaiagent[langchain]"    # LangChain auto-tracing
pip install "fastaiagent[kb]"           # Local knowledge base
pip install "fastaiagent[all]"          # Everything

Documentation

Contributing

We welcome contributions! See CONTRIBUTING.md for guidelines.

License

Apache 2.0 — see LICENSE.

