Batteries-included framework for building durable agentic workflows and business applications.

Planar

Planar is a batteries-included Python framework for building durable workflows, agent automations, and stateful APIs. Built on FastAPI and SQLModel, it combines orchestration, data modeling, and file management into a cohesive developer experience.

Feature Highlights

  • Durable workflow engine with resumable async steps, automatic retries, and suspension points
  • Agent step framework with first-class support for OpenAI, Anthropic, and other providers
  • Human task assignments and rule engine tooling baked into workflow execution
  • SQLModel-powered data layer with Alembic migrations and CRUD scaffolding out of the box
  • Built-in file management and storage adapters for local disk, Amazon S3, and Azure Blob Storage
  • CLI-driven developer workflow with templated scaffolding, hot reload, and environment-aware configuration
  • Agentic CLI that can scaffold or evolve workflows

Installation

Planar is published on PyPI. Add it to an existing project with uv:

uv add planar

To explore the CLI without updating pyproject.toml, use the ephemeral uvx runner:

uvx planar --help

Quickstart

Generate a new service, start up the dev server, and inspect the auto-generated APIs:

uvx planar scaffold --name my_service
cd my_service
uv run planar dev src/main.py

Open http://127.0.0.1:8000/docs to explore your service's routes and workflow endpoints. The scaffold prints the exact app path if it differs from src/main.py.

Define a Durable Workflow

from datetime import timedelta

from planar import PlanarApp
from planar.workflows import step, suspend, workflow

@step
async def charge_customer(order_id: str) -> None:
    ...

@step
async def notify_success(order_id: str) -> None:
    ...

@workflow
async def process_order(order_id: str) -> None:
    await charge_customer(order_id)
    await suspend(interval=timedelta(hours=1))
    await notify_success(order_id)


app = PlanarApp()
app.register_workflow(process_order)

Workflows are async functions composed of resumable steps. Planar persists every step, applies configurable retry policies, and resumes suspended workflows even after process restarts. Check docs/workflows.md for deeper concepts including event-driven waits, human steps, and agent integrations.
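To build intuition for what "persists every step" means, here is a minimal conceptual sketch in plain Python: step results are stored by key, so re-running the workflow after a crash replays completed steps instead of executing them again. This is illustrative only; Planar persists to a database and handles async steps, retries, and suspension, none of which this toy dictionary does.

```python
# Conceptual sketch of resumable steps: results are stored by step key, so a
# re-run (a "resume") replays completed steps rather than executing them again.
completed: dict[str, object] = {}  # stand-in for durable storage


def durable_step(fn):
    def wrapper(*args):
        key = f"{fn.__name__}:{args}"
        if key in completed:            # already ran: replay the stored result
            return completed[key]
        result = fn(*args)
        completed[key] = result         # "persist" before moving on
        return result
    return wrapper


calls = []


@durable_step
def charge_customer(order_id):
    calls.append("charge")
    return f"charged-{order_id}"


@durable_step
def notify_success(order_id):
    calls.append("notify")
    return f"notified-{order_id}"


def process_order(order_id):
    charge_customer(order_id)
    notify_success(order_id)


process_order("o-1")   # first run executes both steps
process_order("o-1")   # simulated resume: both steps replay, nothing re-executes
print(calls)           # → ['charge', 'notify']
```

Because each side effect runs at most once per key, a resumed workflow picks up exactly where it left off.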

Core Capabilities

  • Workflow orchestration: Compose async steps with guaranteed persistence, scheduling, and concurrency control.
  • Agent steps: Run LLM-powered actions durably with provider-agnostic adapters and structured prompts.
  • Human tasks and rules: Build human-in-the-loop approvals and declarative rule evaluations alongside automated logic.
  • Stateful data and files: Model entities with SQLModel, manage migrations through Alembic, and store files using pluggable backends.
  • Observability: Structured logging and OpenTelemetry hooks surface workflow progress and performance metrics.
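The retry behavior mentioned above can be sketched as a plain-Python decorator. The names and parameters here (with_retries, max_attempts, delay) are illustrative assumptions, not Planar's actual retry API; they only show the retry-until-success idea.

```python
import time

# Illustrative retry policy: re-run a failing step up to max_attempts times,
# sleeping between attempts, and surface the error once retries are exhausted.
def with_retries(max_attempts=3, delay=0.0):
    def decorate(fn):
        def wrapper(*args):
            for attempt in range(1, max_attempts + 1):
                try:
                    return fn(*args)
                except Exception:
                    if attempt == max_attempts:
                        raise          # retries exhausted: propagate
                    time.sleep(delay)  # back off before the next attempt
        return wrapper
    return decorate


attempts = 0


@with_retries(max_attempts=3)
def flaky_step():
    global attempts
    attempts += 1
    if attempts < 3:
        raise RuntimeError("transient failure")
    return "ok"


print(flaky_step())  # → ok (succeeds on the third attempt)
```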

Command Line Interface

uvx planar scaffold --help   # generate a new project from the official template
uv run planar dev [PATH]     # run with hot reload and development defaults
uv run planar prod [PATH]    # run with production defaults
uv run planar agent [PROMPT] # scaffold or evolve workflows with Anthropic's Claude Code (requires Claude API key)

[PATH] points to the module that exports a PlanarApp instance (defaults to app.py or main.py). Use --config PATH to load a specific configuration file and --app NAME if your application variable is not named app.
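As a rough mental model of how a CLI resolves [PATH] plus --app to an object, the sketch below loads a module from a file path and fetches a named variable from it. The load_app helper is hypothetical; Planar's actual resolution logic may differ.

```python
import importlib.util
import pathlib
import tempfile

# Illustrative resolution of "[PATH]" + "--app NAME" to an exported object.
def load_app(path: str, app_name: str = "app"):
    spec = importlib.util.spec_from_file_location("user_service", path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)     # import the user's module from its path
    return getattr(module, app_name)    # fetch the exported app instance


# Demo with a throwaway module that exports a variable named `app`.
with tempfile.TemporaryDirectory() as tmp:
    main = pathlib.Path(tmp) / "main.py"
    main.write_text("app = 'my-planar-app'\n")
    loaded = load_app(str(main))

print(loaded)  # → my-planar-app
```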

Configuration

Planar merges environment defaults with an optional YAML override. By convention it looks for planar.dev.yaml, planar.prod.yaml, or planar.yaml in your project directory, but you can supply a path explicitly via --config or the PLANAR_CONFIG environment variable.

Example minimal override:

ai_models:
  default: invoice_llm
  providers:
    public_openai:
      factory: openai_responses
      options:
        api_key: ${OPENAI_API_KEY}
        base_url: ${OPENAI_PROXY_URL}
    azure_llm:
      factory: azure_openai_responses
      options:
        endpoint: ${AZURE_OPENAI_ENDPOINT}
        # optional: static_api_key: ${AZURE_OPENAI_KEY} (omit to use DefaultAzureCredential)
  models:
    invoice_llm:
      provider: public_openai
      options: gpt-4o-mini
    claims_llm:
      provider: azure_llm
      options:
        deployment: gpt-4o-claims

storage:
  directory: .files

Set default to the model key agents should use when they leave model=None. Use ConfiguredModelKey("invoice_llm") to reference a specific entry. Providers let you reuse auth/transport settings across multiple models; factories receive merged provider + model options.
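The YAML above uses ${OPENAI_API_KEY}-style placeholders. A common way such interpolation works is shown below; whether Planar raises on an unset variable or substitutes an empty string is an assumption, and this sketch chooses to raise.

```python
import os
import re

# Illustrative ${VAR} interpolation for config values like those above.
_PATTERN = re.compile(r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}")


def interpolate(value: str) -> str:
    def replace(match: re.Match) -> str:
        name = match.group(1)
        if name not in os.environ:
            # Assumption: fail loudly rather than substitute an empty string.
            raise KeyError(f"environment variable {name} is not set")
        return os.environ[name]
    return _PATTERN.sub(replace, value)


os.environ["OPENAI_API_KEY"] = "sk-test"
expanded = interpolate("${OPENAI_API_KEY}")
print(expanded)  # → sk-test
```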

ai_models:
  providers:
    public_openai:
      factory: openai_responses
      options:
        api_key_env: BILLING_OPENAI_KEY
  models:
    invoice_parsing_model:
      provider: public_openai
      options: gpt-4o-mini

For Azure OpenAI endpoints:

ai_models:
  providers:
    azure_llm:
      factory: azure_openai_responses
      options:
        endpoint_env: AZURE_OPENAI_ENDPOINT
        deployment_env: AZURE_OPENAI_DEPLOYMENT
        # Optional: omit to use DefaultAzureCredential instead
        static_api_key_env: AZURE_OPENAI_KEY
  models:
    invoice_parsing_model:
      provider: azure_llm
      options:
        deployment: gpt-4o-mini

Omit the API key options to authenticate with DefaultAzureCredential (managed identity or user credentials). Use token_scope/token_scope_env if you need to override the default https://cognitiveservices.azure.com/.default scope.

Register custom factories on the app and reference them by key in config:

from planar import PlanarApp

app = PlanarApp(...)
app.register_model_factory("vertex_gemini", vertex_gemini_model_factory)

Factories receive the raw options dict plus a PlanarConfig reference, so you can inject per-environment parameters without touching workflow code. They can be synchronous callables or async coroutines—Planar automatically handles awaiting them and caching the resulting model instances.
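The sync-or-async handling and caching described above can be sketched as follows. The resolve_model name and the cache shape are illustrative assumptions, not Planar's internals; the sketch only shows awaiting factories transparently and caching what they return.

```python
import asyncio
import inspect

# Illustrative handling of sync and async model factories with result caching.
_cache: dict[str, object] = {}


async def resolve_model(key, factory, options):
    if key in _cache:
        return _cache[key]
    result = factory(options)
    if inspect.isawaitable(result):   # async factories are awaited transparently
        result = await result
    _cache[key] = result              # cache the constructed model instance
    return result


def sync_factory(options):
    return f"model({options['name']})"


async def async_factory(options):
    return f"model({options['name']})"


results = []


async def main():
    results.append(await resolve_model("a", sync_factory, {"name": "gpt"}))
    results.append(await resolve_model("b", async_factory, {"name": "claude"}))
    # Second lookup of "a" hits the cache; the factory is not called again.
    results.append(await resolve_model("a", sync_factory, {"name": "ignored"}))


asyncio.run(main())
print(results)  # → ['model(gpt)', 'model(claude)', 'model(gpt)']
```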

Set provider credentials through environment variables (e.g., OPENAI_API_KEY for the OpenAI entries above). For more configuration patterns and workflow design guidance, browse the documents in docs/.

Examples

  • examples/expense_approval_workflow — human approvals with AI agent collaboration
  • examples/event_based_workflow — event-driven orchestration and external wakeups
  • examples/simple_service — CRUD service paired with workflows

Run any example with uv run planar dev path/to/main.py.

Testing

The planar.testing module provides a PlanarTestClient class for exercising your Planar application in tests.

Register the planar.testing.fixtures pytest plugin in your pyproject.toml:

[project.entry-points.pytest11]
planar = "planar.testing.fixtures"

For more information, see docs/testing_workflows.md.

Local Development

Planar is built with uv. Clone the repository and install dev dependencies:

uv sync --extra otel

Useful commands:

  • uv run ruff check --fix and uv run ruff format to lint and format
  • uv run pyright for static type checking
  • uv run pytest to run the test suite (use -n auto for parallel execution)
  • uv run pytest --cov=planar to collect coverage
  • uv tool install pre-commit && uv tool run pre-commit install to enable git hooks

PostgreSQL Test Suite

docker run --restart=always --name planar-postgres \
  -e POSTGRES_PASSWORD=postgres \
  -p 127.0.0.1:5432:5432 \
  -d docker.io/library/postgres

PLANAR_TEST_POSTGRESQL=1 PLANAR_TEST_POSTGRESQL_CONTAINER=planar-postgres \
  uv run pytest -s

Disable SQLite with PLANAR_TEST_SQLITE=0.

Cairo SVG Dependencies

Some AI integration tests convert SVG assets using cairosvg. Install Cairo libraries locally before running those tests:

brew install cairo libffi pkg-config
export DYLD_FALLBACK_LIBRARY_PATH="/opt/homebrew/lib:${DYLD_FALLBACK_LIBRARY_PATH}"

Most Linux distributions ship the required libraries via their package manager.

Documentation

Use uv run planar docs to view the documentation in your terminal; this is particularly useful for equipping coding agents with context about Planar. Alternatively, use docs/llm_prompt.md as a drop-in reference document in whatever tool you are using.

Dive deeper into Planar's design and APIs in the docs/ directory:

  • docs/workflows.md
  • docs/agents.md
  • docs/design/event_based_waiting.md
  • docs/design/human_step.md
