Conflux
Build prompt pipelines easily.
A simple Python library for building prompt pipelines and applications with Large Language Models (LLMs). Conflux is designed for flexibility, composability, and ease of use, making it straightforward to create complex LLM workflows.
Key Features:
- Build modular, composable LLM pipelines with minimal code
- Integrate with OpenAI, Gemini, FAISS, and more
- Full control over prompt engineering and execution
- Intuitive, Pythonic API for rapid prototyping and production
Use Cases:
- Prompt chaining and orchestration
- Retrieval-augmented generation (RAG)
- Tool-calling and agent workflows
- Custom LLM-powered applications
Core Concept
Note: The following diagram uses Mermaid syntax. If your Markdown viewer does not support Mermaid, please refer to the docs for a static image.
```mermaid
graph LR
    subgraph "HandlerChain"
        A[Handler 1] -->|Message| B[Handler 2]
        B -->|Message| C[Handler 3]
    end
    Input["Input (Message)"] --> A
    C --> Output["Output (Message)"]
```
A HandlerChain is a sequence of handlers that process messages step by step. This enables you to build complex LLM workflows by composing simple, reusable components.
Conflux has three main components:
- Messages: Entities in an application communicate through Message objects.
- Handlers: Messages are passed through Handlers that can modify, transform, or format the message.
- HandlerChains: Handlers are chained together to form a HandlerChain, which executes handlers in order.
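The flow of a Message through a chain of Handlers can be sketched in plain Python. This is a toy illustration of the pattern, not Conflux's actual implementation; the class and function names here only mirror the concepts above:

```python
import asyncio
from typing import Awaitable, Callable

# Toy sketch: a Message carries a primary payload between handlers.
class Message:
    def __init__(self, primary: str):
        self.primary = primary

# A handler takes a Message and returns a (possibly transformed) Message.
Handler = Callable[[Message], Awaitable[Message]]

# A chain executes its handlers in order, feeding each one's output
# to the next -- the core idea behind a HandlerChain.
class HandlerChain:
    def __init__(self, *handlers: Handler):
        self.handlers = list(handlers)

    async def __call__(self, text: str) -> Message:
        msg = Message(text)
        for h in self.handlers:
            msg = await h(msg)
        return msg

async def shout(msg: Message) -> Message:
    return Message(msg.primary.upper())

async def exclaim(msg: Message) -> Message:
    return Message(msg.primary + "!")

chain = HandlerChain(shout, exclaim)
result = asyncio.run(chain("hello"))
print(result.primary)  # HELLO!
```

In Conflux itself, composition is expressed with the `>>` operator shown in the example below, and handlers can also share state through `chain.variables`.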
Installation
Requirements:
- Python 3.12+
- Windows, macOS, or Linux
Install the core package:
pip install conflux-ai
Or, if you use FAISS for similarity search:
pip install -U "conflux-ai[faiss]"
Example Usage
Below is a simple example that generates a company name and tagline using OpenAI's LLM. (Requires an OpenAI API key set as the OPENAI_API_KEY environment variable.) You can also replace OpenAiLLM with GeminiLLM to use Google's Gemini, which reads the GOOGLE_API_KEY environment variable instead.
```python
import asyncio

from conflux import HandlerChain, Message, handler
from conflux.handlers import OpenAiLLM


@handler
async def company_name(msg: Message, chain: HandlerChain) -> str:
    # Stash the product so a later handler in the chain can use it.
    chain.variables["product"] = msg.primary
    return (
        f"What would be an appropriate name for a business specializing in {msg.primary}?"
        " Only mention the company name and nothing else."
    )


@handler
async def company_tagline(msg: Message, chain: HandlerChain) -> str:
    return (
        f"What would be an appropriate tagline for a business specializing in {chain.variables['product']}"
        f" and with company name {msg.primary}?\nFormat your output in the following"
        f" format:\n{msg.primary}: <tagline>"
    )


def main():
    # Compose handlers into a chain with >>; each LLM step consumes
    # the prompt produced by the preceding handler.
    name_and_tagline_generator = (
        company_name >> OpenAiLLM() >> company_tagline >> OpenAiLLM()
    )
    res = asyncio.run(name_and_tagline_generator("bike"))
    print(res)


if __name__ == "__main__":
    main()  # Example output: <company name>: <tagline>
```
Advanced Examples
Explore more advanced usage patterns and integrations in the examples/ directory:
- MCP Tool Call Example: How to call tools from an MCP (Model Context Protocol) server as part of your handler chain.
- Retrieval-Augmented Generation (RAG) Example: How to build a RAG pipeline using OpenAI embeddings and a FAISS vector index.
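To make the retrieval step of such a RAG pipeline concrete, here is a library-free sketch that ranks documents by cosine similarity. It uses a toy bag-of-words `embed` function as a stand-in for OpenAI embeddings and a linear scan in place of a FAISS index; the function names are illustrative, not Conflux's API:

```python
import math
from collections import Counter

# Toy bag-of-words "embedding" -- a stand-in for a real embedding model.
def embed(text: str) -> Counter:
    return Counter(text.lower().split())

# Cosine similarity between two sparse word-count vectors.
def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Linear scan over the corpus -- the role a FAISS index plays at scale.
def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "Conflux chains handlers into pipelines.",
    "FAISS builds vector indexes for similarity search.",
    "Paris is the capital of France.",
]
print(retrieve("vector similarity search index", docs))
# -> ['FAISS builds vector indexes for similarity search.']
```

In a full RAG pipeline the retrieved documents would be injected into the prompt of a downstream LLM handler.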
For more, see the examples folder in the repository.
Why Conflux?
Applications with LLMs can get complex quickly. Conflux is designed for simplicity and scalability, giving you control over prompts and execution while maintaining an intuitive API. Unlike more rigid frameworks, Conflux lets you customize every step of your pipeline.
Download files
File details
Details for the file conflux_ai-0.7.2.tar.gz.
File metadata
- Download URL: conflux_ai-0.7.2.tar.gz
- Upload date:
- Size: 3.0 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.7.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 1a7d5831c7280dca646f2f270372c04af8633c68a0a79b4ec741c44f70ae1b2b |
| MD5 | f22f97c147b42a06db3197d4505ea6f8 |
| BLAKE2b-256 | c97a067c4d6544d6fc3a0c7d5560567153b0dfa5814c720124cb00d45bbd0fff |
File details
Details for the file conflux_ai-0.7.2-py3-none-any.whl.
File metadata
- Download URL: conflux_ai-0.7.2-py3-none-any.whl
- Upload date:
- Size: 15.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.7.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | c52e8ddd690584f596806f5719c372b043eb925e454cc0d84e6a0ad832a533ba |
| MD5 | 1a9980afb0c7be8e354f5c6cdd572f4f |
| BLAKE2b-256 | cb10e147cd0d76b32159789acde18e5581ebcd1c51b238c781ebb558d862fcea |