
Trellis

Modular agentic pipeline system for document and data workflows.

Trellis lets you define multi-step AI pipelines as declarative YAML — ingest PDFs, fetch SEC filings, call LLMs, extract structured fields, fan out over lists — and run them from the CLI, a REST API, or directly from Python.

Documentation · Installation · Pipeline DSL Reference · Examples


Install

pip install trellis-pipelines

With uv:

uv add trellis-pipelines

Requires Python 3.12+. Set at least one LLM provider key before running pipelines:

export ANTHROPIC_API_KEY=sk-ant-...   # or OPENAI_API_KEY, etc.

Quickstart

CLI

# Validate a pipeline file
trellis validate examples/pipelines/pdf_summarize.yaml

# Run a pipeline
trellis run examples/pipelines/pdf_summarize.yaml

# Pass runtime parameters
trellis run examples/pipelines/fetch_10k_parametrized.yaml \
  --params '{"ticker": "AAPL", "year": "2024"}'

# Override the LLM model for all tasks
trellis run examples/pipelines/pdf_summarize.yaml --model anthropic/claude-haiku-4-5

A pipeline file

pipeline:
  id: pdf_summarize
  goal: "Load a PDF and summarize key points"
  tasks:
    - id: ingest_pdf
      tool: ingest_document
      inputs:
        path: "https://example.com/report.pdf"

    - id: extract_content
      tool: extract_from_texts
      inputs:
        document: "{{ingest_pdf.output}}"
        prompt: "Extract the main topics and key metrics"

    - id: summarize
      tool: llm_job
      inputs:
        prompt: |
          Summarize in 5 bullet points for a busy executive:
          {{extract_content.output}}
        max_tokens: 256

Dependencies are inferred from {{task_id.output}} references — no explicit wiring needed.
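Trellis's actual resolver isn't shown here, but the idea behind inference can be sketched in a few lines: scan each task's input values for `{{task_id.output}}` references and treat each referenced id as a dependency (the `infer_deps` helper below is illustrative, not part of the Trellis API):

```python
import re

# Matches {{some_task.output}} references inside input values.
REF_PATTERN = re.compile(r"\{\{\s*(\w+)\.output\s*\}\}")

def infer_deps(tasks):
    """Map each task id to the set of task ids it references."""
    deps = {}
    for task in tasks:
        refs = set()
        for value in task.get("inputs", {}).values():
            if isinstance(value, str):
                refs.update(REF_PATTERN.findall(value))
        deps[task["id"]] = refs
    return deps

tasks = [
    {"id": "ingest_pdf", "tool": "ingest_document",
     "inputs": {"path": "https://example.com/report.pdf"}},
    {"id": "extract_content", "tool": "extract_from_texts",
     "inputs": {"document": "{{ingest_pdf.output}}",
                "prompt": "Extract the main topics and key metrics"}},
    {"id": "summarize", "tool": "llm_job",
     "inputs": {"prompt": "Summarize:\n{{extract_content.output}}"}},
]

print(infer_deps(tasks))
# {'ingest_pdf': set(), 'extract_content': {'ingest_pdf'}, 'summarize': {'extract_content'}}
```

From this mapping a topological order falls out directly, which is why the YAML needs no explicit `depends_on` wiring.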

Python

import asyncio
from trellis.models.pipeline import PipelineSpec
from trellis.execution.orchestrator import Orchestrator

spec = PipelineSpec.from_yaml("examples/pipelines/pdf_summarize.yaml")
orchestrator = Orchestrator()
result = asyncio.run(orchestrator.run_pipeline(spec))

print(result.outputs)

REST API

# Install the API extra and start the server
pip install "trellis-pipelines[api]"
uvicorn trellis_api.main:app --reload

# Submit an inline pipeline
curl -X POST http://localhost:8000/pipelines/run \
  -H "Content-Type: application/json" \
  -d '{
    "pipeline": {
      "id": "hello",
      "goal": "Say hello",
      "tasks": [
        {
          "id": "greet",
          "tool": "llm_job",
          "inputs": { "prompt": "Say hello in one sentence." }
        }
      ]
    }
  }'
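The same request can be issued from Python with only the standard library; the endpoint and payload shape simply mirror the curl call above (no Trellis-specific HTTP client is assumed):

```python
import json
from urllib import request

payload = {
    "pipeline": {
        "id": "hello",
        "goal": "Say hello",
        "tasks": [
            {
                "id": "greet",
                "tool": "llm_job",
                "inputs": {"prompt": "Say hello in one sentence."},
            }
        ],
    }
}

req = request.Request(
    "http://localhost:8000/pipelines/run",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Requires the server from the previous step to be running:
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read()))
```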

Key features

  • Declarative YAML DSL — flat task list, dependencies inferred from template references
  • Template resolution — {{task_id.output}}, {{params.key}}, {{session.key}}, {{item}} (fan-out)
  • Fan-out / parallel_over — scatter a task over a list, collect results automatically
  • await barriers — explicit synchronization across parallel branches
  • Retry & backoff — per-task retry with exponential backoff and jitter
  • Structured extraction — extract_fields with JSON Schema, extract_from_texts for free-form text
  • PDF + web ingestion — ingest_document (PDF/HTML), fetch_url, search_web
  • Multi-tenancy — tenant-scoped blackboard (store / {{session.key}}) for stateful workflows
  • CLI, REST API, and Python SDK — three interfaces, one engine
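As a sketch of the fan-out feature, a task scattered over a list with parallel_over and {{item}} might look like the following — field placement here is illustrative; see the Pipeline DSL Reference for the exact schema:

```yaml
pipeline:
  id: summarize_many
  goal: "Summarize each report in a list"
  tasks:
    - id: summarize_each
      tool: llm_job
      parallel_over: "{{params.report_urls}}"     # scatter: one task run per list element
      inputs:
        prompt: "Summarize {{item}} in 3 bullets" # {{item}} = current element

    - id: combine
      tool: llm_job
      inputs:
        prompt: |
          Merge these summaries into one brief:
          {{summarize_each.output}}               # collected results of the fan-out
```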

Project structure

trellis/            # Core: models, execution engine, tool registry
trellis_api/        # FastAPI REST server (optional extra: [api])
trellis_cli/        # Typer CLI
trellis_mcp/        # MCP server adapter (roadmap)
examples/           # Example pipelines and data
docs/               # MkDocs source
tests/              # Test suite

Optional extras

Extra                    What it adds
trellis-pipelines[api]   FastAPI + uvicorn REST server
trellis-pipelines[dev]   pytest, ruff, mypy, black, isort
trellis-pipelines[all]   All extras

License

MIT — see LICENSE or https://opensource.org/license/mit.

Download files

Source Distribution

trellis_pipelines-0.3.0.tar.gz (122.0 kB)

Built Distribution

trellis_pipelines-0.3.0-py3-none-any.whl (144.7 kB)

File details

Details for the file trellis_pipelines-0.3.0.tar.gz.

File metadata

  • Size: 122.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.10

File hashes

Algorithm    Hash digest
SHA256       618c654f60c537fce7e9a2a4c00498859d471e93420e877706d30e7a3aebf9dd
MD5          7a1814616a51889b6dd5340763c82ddb
BLAKE2b-256  4fa691b0615ecd04b1b068a96e9d6fceb4d784b2c98aa183132185ed6bc038a4

File details

Details for the file trellis_pipelines-0.3.0-py3-none-any.whl.

File hashes

Algorithm    Hash digest
SHA256       cb40137c462ef7fa0a18fe17bbe2b985253944ad59f6c623f3c7df70273cea40
MD5          81c1d6783b216bd4de392a407e6fe319
BLAKE2b-256  1789512086e9693271d8dba6b70bf6b521d90f1588e4eeee1543117ab0ee2611
