Osiris Pipeline v0.5.3

LLM-first conversational ETL pipeline generator

The deterministic compiler for AI-native data pipelines. You describe outcomes in plain English; Osiris compiles them into reproducible, production-ready manifests that run with the same behavior everywhere (local or cloud).

🚀 Quick Start

# Setup
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt

# Initialize configuration
osiris init

# Start MCP server for AI integration (Claude Desktop, etc.)
osiris mcp
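
To connect the MCP server to Claude Desktop, register the command in `claude_desktop_config.json`. The entry below is a sketch: the server name `osiris` and the executable path are placeholders for your environment, not values mandated by Osiris.

```json
{
  "mcpServers": {
    "osiris": {
      "command": "/path/to/.venv/bin/osiris",
      "args": ["mcp"]
    }
  }
}
```

Any other MCP-compatible client can launch the server the same way: run the `osiris mcp` command and speak MCP over stdio.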

🎯 What Makes Osiris Different

  • Compiler, not orchestrator - Others schedule what you hand-craft. Osiris generates, validates, and compiles pipelines from plain English.
  • Determinism as a contract - Fingerprinted manifests guarantee reproducibility across environments.
  • Conversational → executable - Describe intent; Osiris interrogates real systems and proposes a feasible plan.
  • Run anywhere, same results - Transparent adapters deliver execution parity (local and E2B today).
  • Boring by design - Predictable, explainable, portable — industrial-grade AI, not magical fragility.
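
The "determinism as a contract" idea can be pictured as hashing a canonical serialization of the compiled plan, so that logically identical manifests always yield the same fingerprint. The sketch below is illustrative only; it is not Osiris's actual fingerprinting code.

```python
import hashlib
import json

def fingerprint(manifest: dict) -> str:
    """Illustrative sketch: fingerprint a manifest by hashing a canonical
    JSON serialization (sorted keys, fixed separators), so key order and
    whitespace cannot change the result."""
    canonical = json.dumps(manifest, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Key order does not affect the fingerprint:
a = fingerprint({"steps": [{"id": "extract"}], "version": "0.5.3"})
b = fingerprint({"version": "0.5.3", "steps": [{"id": "extract"}]})
assert a == b
```

Because the fingerprint depends only on the manifest's content, the same compiled pipeline is recognized as identical on a laptop and in a cloud sandbox.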

📊 Visual Overview

Pipeline Execution Dashboard

Interactive HTML dashboard showing pipeline execution metrics and performance

Run Overview with E2B Integration

Comprehensive run overview showing E2B cloud execution with <1% overhead

Step-by-Step Pipeline Execution

Detailed view of pipeline steps with row counts and execution times

Example Usage via MCP

# Start the MCP server
$ osiris mcp

# Use with Claude Desktop or any MCP-compatible client to:
# - Discover database schemas and sample data
# - Generate SQL queries and transformations
# - Validate and compile pipelines
# - Execute with deterministic, reproducible results

# Or run pipelines directly:
$ osiris run examples/inactive_customers.yaml
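
Pipelines can also be launched from scripts by wrapping the CLI. A minimal sketch, assuming `osiris` is on `PATH` (the helper names are mine, not part of Osiris):

```python
import subprocess

def build_cmd(path: str, e2b: bool = False) -> list[str]:
    """Assemble the osiris run command line for a pipeline file."""
    cmd = ["osiris", "run", path]
    if e2b:
        cmd.append("--e2b")  # cloud-sandbox flag documented in the E2B section
    return cmd

def run_pipeline(path: str, e2b: bool = False) -> int:
    """Invoke the osiris CLI and return its exit code."""
    return subprocess.run(build_cmd(path, e2b)).returncode
```

For example, `run_pipeline("examples/inactive_customers.yaml")` mirrors the shell invocation above.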

✨ Key Features

  • AI-native pipeline generation from plain English descriptions
  • Deterministic compilation with fingerprinted, reproducible manifests
  • Run anywhere with identical behavior (local or E2B cloud)
  • Interactive HTML reports with comprehensive observability
  • AI Operation Package (AIOP) for LLM-friendly debugging and analysis
  • Machine-readable documentation designed for AI assistants

🤖 LLM-Friendly Documentation

Osiris provides machine-readable documentation for AI assistants.

🚀 E2B Cloud Execution

Run pipelines in isolated E2B sandboxes with <1% overhead:

# Run in cloud sandbox
osiris run pipeline.yaml --e2b

# With custom resources
osiris run pipeline.yaml --e2b --e2b-cpu 4 --e2b-mem 8

See the User Guide for complete E2B documentation.

🤖 AI Operation Package (AIOP)

Every pipeline run automatically generates a comprehensive AI Operation Package for LLM analysis:

# View AIOP export after any run
osiris logs aiop --last

# Generate human-readable summary
osiris logs aiop --last --format md

# Configure in osiris.yaml
aiop:
  enabled: true  # Auto-export after each run
  policy: core   # ≤300KB for LLM consumption

AIOP provides four semantic layers for AI understanding:

  • Evidence Layer: Timestamped events, metrics, and artifacts
  • Semantic Layer: DAG structure and component relationships
  • Narrative Layer: Natural language descriptions with citations
  • Metadata Layer: LLM primer and configuration

See AIOP Architecture for details.
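
Since the AIOP export is JSON, it can be inspected programmatically. The sketch below assumes an export whose top-level keys match the four layer names above; the real schema is defined in the AIOP architecture docs, so treat the field names here as assumptions.

```python
import json
from pathlib import Path

# Layer names taken from the four semantic layers listed above; the actual
# AIOP JSON schema may differ (see the AIOP Architecture docs).
LAYERS = ("evidence", "semantic", "narrative", "metadata")

def layer_counts(package: dict) -> dict:
    """Count entries per semantic layer in a parsed AIOP export."""
    return {layer: len(package.get(layer, [])) for layer in LAYERS}

def summarize_aiop(path: str) -> dict:
    """Load an AIOP JSON export from disk and summarize its layers."""
    return layer_counts(json.loads(Path(path).read_text()))
```

A summary like this is a cheap pre-flight check before handing the (size-capped, `policy: core`) package to an LLM.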

📚 Documentation

For comprehensive documentation, visit the Documentation Hub.

🚦 Roadmap

  • v0.2.0 ✅ - Conversational agent, deterministic compiler, E2B parity
  • v0.3.0 ✅ - AI Operation Package (AIOP) for LLM-friendly debugging
  • v0.3.1 ✅ - Fixed validation warnings for ADR-0020 compliant configs
  • v0.3.5 ✅ - GraphQL extractor, DuckDB processor, test infrastructure improvements
  • v0.5.3 (Current) ✅ - Python version requirement fix + CSV extractor runtime bug fix
  • M2 - Production workflows, approvals, orchestrator integration
  • M3 - Streaming, parallelism, enterprise scale
  • M4 - Iceberg tables, intelligent DWH agent

See docs/roadmap/ for details.

🛠️ Contributing

See CONTRIBUTING.md for development workflow, code quality standards, and commit guidelines.

License

Apache-2.0

Project details

Download files

Source Distribution

osiris_pipeline-0.5.3.tar.gz (440.1 kB), uploaded via twine/6.2.0 on CPython/3.13.9

Built Distribution

osiris_pipeline-0.5.3-py3-none-any.whl (486.2 kB), Python 3

File hashes

Hashes for osiris_pipeline-0.5.3.tar.gz

Algorithm   Hash digest
SHA256      329dd20b222dd0020213d0b42b36be959b96999840b8506b2b70d9589df4b69c
MD5         bedd875a998a1dd9bd83523fc22c7f6f
BLAKE2b-256 318157017c415320a4bfcf3db01f87ada4f27f0c80b7f40147ce8f613cdc1025

Hashes for osiris_pipeline-0.5.3-py3-none-any.whl

Algorithm   Hash digest
SHA256      aabbbefce84b82bf6025dcc50890bce4c612e94be802b6cb100b3ab57a8916b0
MD5         d43f607854dbf7b519b2290afa838874
BLAKE2b-256 9f392c8c5696475098f63958dd32f6e14a37be03008697cd6dd1b1b568f29e4b
