
Pipelex is an open-source dev tool based on a simple declarative language that lets you define replicable, structured, composable LLM pipelines.


Open-source language for repeatable AI workflows

Pipelex is an open-source devtool that transforms how you build repeatable AI workflows. Think of it as Docker or SQL for AI operations.

Create modular "pipes", each using a different LLM and guaranteeing structured outputs. Connect them like LEGO blocks (sequentially, in parallel, or conditionally) to build complex knowledge transformations from simple, reusable components.

Stop reinventing AI workflows from scratch. With Pipelex, your proven methods become shareable, versioned artifacts that work across different LLMs. What took weeks to perfect can now be forked, adapted, and scaled instantly.



📜 The Knowledge Pipeline Manifesto

Read why we built Pipelex to transform unreliable AI workflows into deterministic pipelines 🔗



Introduction

Pipelex makes it easy for developers to define and run repeatable AI workflows. At its core is a clear, declarative pipeline language specifically crafted for knowledge-processing tasks.

Build pipelines from modular pipes that snap together. Each pipe can use a different language model (LLM) or software to process knowledge. Pipes consistently deliver structured, predictable outputs at each stage.

Pipelex uses TOML syntax, making workflows readable and shareable. Business professionals, developers, and AI coding agents can all understand and modify the same pipeline definitions.

Example:

[concept]
Buyer = "The person who made the purchase"
PurchaseDocumentText = "Transcript of a receipt, invoice, or order confirmation"

[pipe.extract_buyer]
PipeLLM = "Extract buyer from purchase document"
inputs = { purchase_document_text = "PurchaseDocumentText" }
output = "Buyer"
llm = "llm_to_extract_info"
prompt_template = """
Extract the first and last name of the buyer from this purchase document:
@purchase_document_text
"""

Pipes are modular building blocks that connect sequentially, run in parallel, or call sub-pipes. They work like function calls in traditional programming, but with a clear contract: knowledge in, knowledge out. This modularity makes pipelines perfect for sharing: fork someone's invoice processor, adapt it for receipts, and share it back.
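The knowledge-in, knowledge-out contract can be pictured as a typed function. The sketch below is purely conceptual: the class names mirror the concepts from the TOML example above, but the code is a hypothetical illustration of the contract, not the Pipelex API (in Pipelex, the extraction step would be delegated to an LLM).

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical types for illustration only -- not the Pipelex API.
@dataclass
class PurchaseDocumentText:
    text: str

@dataclass
class Buyer:
    first_name: str
    last_name: str

# A "pipe" is conceptually a function with a typed contract:
# structured knowledge in, structured knowledge out.
Pipe = Callable[[PurchaseDocumentText], Buyer]

def extract_buyer(doc: PurchaseDocumentText) -> Buyer:
    # In Pipelex this step would be performed by an LLM;
    # here we fake it with a trivial parse for illustration.
    first, last = doc.text.split()[:2]
    return Buyer(first_name=first, last_name=last)

result = extract_buyer(PurchaseDocumentText("Jane Doe ordered 3 widgets"))
```

Because each pipe's inputs and output are named concepts rather than free-form text, pipes can be swapped, reused, and composed without re-reading their internals.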

Pipelex is an open-source Python library with a hosted API launching soon. It integrates seamlessly into existing systems and automation frameworks. Plus, it works as an MCP server so AI agents can use pipelines as tools.

🚀 Quick start

📚 Note that you can check out the Pipelex Documentation for more information and clone the Pipelex Cookbook repository for ready-to-run samples.

Follow these steps to get started:

Installation

Prerequisites

Option #1: Run examples

Visit the repository on GitHub: you can clone it, fork it, and play with it

Option #2: Install the package

# Using pip
pip install pipelex

# Using Poetry
poetry add pipelex

# Using uv (Recommended)
uv pip install pipelex

IDE extension

We highly recommend installing an extension for TOML files into your IDE of choice. For VS Code, the Even Better TOML extension does a great job of syntax coloring and checking.

Optional Features

The package supports the following additional features:

  • anthropic: Anthropic/Claude support
  • google: Google models (Vertex) support
  • mistralai: Mistral AI support
  • bedrock: AWS Bedrock support
  • fal: Image generation with Black Forest Labs "FAL" service

Install all extras:

Using pip:

pip install "pipelex[anthropic,google,mistralai,bedrock,fal]"

Using Poetry:

poetry add "pipelex[anthropic,google,mistralai,bedrock,fal]"

Using uv:

uv pip install "pipelex[anthropic,google,mistralai,bedrock,fal]"

Example: optimizing a tweet in 2 steps

1. Define the pipeline in TOML

domain = "tech_tweet"
definition = "A pipeline for optimizing tech tweets using Twitter/X best practices"

[concept]
DraftTweet = "A draft version of a tech tweet that needs optimization"
OptimizedTweet = "A tweet optimized for Twitter/X engagement following best practices"
TweetAnalysis = "Analysis of the tweet's structure and potential improvements"
WritingStyle = "A style of writing"

[pipe]
[pipe.analyze_tweet]
PipeLLM = "Analyze the draft tweet and identify areas for improvement"
inputs = { draft_tweet = "DraftTweet" }
output = "TweetAnalysis"
llm = "llm_for_writing_analysis"
system_prompt = """
You are an expert in social media optimization, particularly for tech content on Twitter/X.
Your role is to analyze tech tweets and check if they display typical startup communication pitfalls.
"""
prompt_template = """
Evaluate the tweet for these key issues:

**Fluffiness** - Overuse of buzzwords without concrete meaning (e.g., "synergizing disruptive paradigms")

**Cringiness** - Content that induces secondhand embarrassment (overly enthusiastic, trying too hard to be cool, excessive emoji use)

**Humblebragginess** - Disguising boasts as casual updates or false modesty ("just happened to close our $10M round 🤷")

**Vagueness** - Failing to clearly communicate what the product/service actually does

For each criterion, provide:
1. A score (1-5) where 1 = not present, 5 = severely present
2. If the problem is not present, no comment. Otherwise, explain the issue and give concise guidance on fixing it, without providing an actual rewrite

@draft_tweet

"""

[pipe.optimize_tweet]
PipeLLM = "Optimize the tweet based on the analysis"
inputs = { draft_tweet = "DraftTweet", tweet_analysis = "TweetAnalysis", writing_style = "WritingStyle" }
output = "OptimizedTweet"
llm = "llm_for_social_post_writing"
system_prompt = """
You are an expert in writing engaging tech tweets that drive meaningful discussions and engagement.
Your goal is to rewrite tweets to be impactful and avoid the pitfalls identified in the analysis.
"""
prompt_template = """
Rewrite this tech tweet to be more engaging and effective, based on the analysis:

Original tweet:
@draft_tweet

Analysis:
@tweet_analysis

Requirements:
- Include a clear call-to-action
- Make it engaging and shareable
- Use clear, concise language

### Reference style example

@writing_style

### Additional style instructions

No hashtags.
Minimal emojis.
Keep the core meaning of the original tweet.
"""

[pipe.optimize_tweet_sequence]
PipeSequence = "Analyze and optimize a tech tweet in sequence"
inputs = { draft_tweet = "DraftTweet", writing_style = "WritingStyle" }
output = "OptimizedTweet"
steps = [
    { pipe = "analyze_tweet", result = "tweet_analysis" },
    { pipe = "optimize_tweet", result = "optimized_tweet" },
]
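The PipeSequence above runs analyze_tweet first, stores its result under the name tweet_analysis, then passes it (together with the original inputs) to optimize_tweet. The sketch below illustrates that sequencing idea in plain Python; the functions and the memory dict are hypothetical stand-ins, not the Pipelex API.

```python
# Conceptual sketch of what a PipeSequence does: run steps in order,
# storing each step's result under its declared name so later steps
# can consume it. Illustration only -- not the Pipelex API.

def analyze_tweet(memory):
    # Stand-in for the analyze_tweet PipeLLM call.
    return f"analysis of: {memory['draft_tweet']}"

def optimize_tweet(memory):
    # Stand-in for the optimize_tweet PipeLLM call; it reads both
    # the original draft and the named intermediate result.
    return f"optimized ({memory['tweet_analysis']})"

# Mirrors the `steps` list in the TOML: (pipe, result name) pairs.
steps = [
    (analyze_tweet, "tweet_analysis"),
    (optimize_tweet, "optimized_tweet"),
]

memory = {
    "draft_tweet": "We are synergizing disruptive paradigms!",
    "writing_style": "plain",
}
for pipe, result_name in steps:
    memory[result_name] = pipe(memory)

final = memory["optimized_tweet"]
```

The key point is that intermediate results are addressable by name, which is what lets later pipes in the sequence (and the generated flowchart below) reference them explicitly.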

2. Run the pipeline

Here is the flowchart generated during this run:

---
config:
  layout: dagre
  theme: base
---
flowchart LR
    subgraph "optimize_tweet_sequence"
    direction LR
        FGunn["draft_tweet:<br>**Draft tweet**"]
        EWhtJ["tweet_analysis:<br>**Tweet analysis**"]
        65Eb2["optimized_tweet:<br>**Optimized tweet**"]
        i34D5["writing_style:<br>**Writing style**"]
    end
class optimize_tweet_sequence sub_a;

    classDef sub_a fill:#e6f5ff,color:#333,stroke:#333;

    classDef sub_b fill:#fff5f7,color:#333,stroke:#333;

    classDef sub_c fill:#f0fff0,color:#333,stroke:#333;
    FGunn -- "Analyze tweet" ----> EWhtJ
    FGunn -- "Optimize tweet" ----> 65Eb2
    EWhtJ -- "Optimize tweet" ----> 65Eb2
    i34D5 -- "Optimize tweet" ----> 65Eb2

3. wait… no, there is no step 3, you're done!


🤝 Contributing

We welcome contributions! Please see our Contributing Guidelines for details on how to get started, including development setup and testing information.

👥 Join the Community

Join our vibrant Discord community to connect with other developers, share your experiences, and get help with your Pipelex projects!

Discord

💬 Support

  • GitHub Issues: For bug reports and feature requests
  • Discussions: For questions and community discussions
  • Documentation

⭐ Star Us!

If you find Pipelex helpful, please consider giving us a star! It helps us reach more developers and continue improving the tool.

📝 License

This project is licensed under the MIT license. Runtime dependencies are distributed under their own licenses via PyPI.


"Pipelex" is a trademark of Evotis S.A.S.

© 2025 Evotis S.A.S.
