
soplex

Compile plain-English SOPs into executable, cost-optimized agent graphs

Transform Standard Operating Procedures into hybrid agent graphs where conversation steps use LLMs, decision steps run as deterministic code, and tool/API steps execute as function calls. The result: 77% cheaper than pure-LLM agents with 99%+ accuracy on branching decisions.

🚀 Key Features

  • Hybrid execution: LLM for conversation, code for logic, APIs for actions
  • Multi-provider support: OpenAI, Anthropic, Google Gemini, Ollama, LiteLLM, or any OpenAI-compatible endpoint
  • Cost optimization: Dramatically reduce LLM costs by running decisions as code
  • High accuracy: Deterministic branching logic eliminates LLM reasoning errors
  • Production ready: Comprehensive testing, type safety, and security best practices

📦 Installation

pip install soplex-ai

# Optional providers
pip install soplex-ai[anthropic]    # Anthropic Claude
pip install soplex-ai[litellm]      # LiteLLM
pip install soplex-ai[all]          # All providers

🔧 Quick Start

1. Create a SOP file

PROCEDURE: Customer Refund Request
TRIGGER: Customer requests refund for order
TOOLS: order_db, payments_api, identity_check

1. Greet the customer and ask for their order number
2. Lookup the order details in order_db using the provided order number
3. Check if the order was placed within the last 30 days
   - YES: Proceed to step 4
   - NO: Inform customer that refunds are only available for orders within 30 days and end
4. Verify customer identity using identity_check with order email
5. Ask customer for the reason for the refund
6. Process the refund using payments_api
7. Confirm with customer that refund has been processed
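A file in this format is straightforward to parse mechanically. As an illustration (a sketch of the format above, not soplex's actual parser), the header fields and numbered steps can be extracted like this:

```python
import re

def parse_sop(text):
    """Parse SOP header fields and numbered steps into a dict.

    Illustrative only -- soplex's real parser also handles sub-bullets
    like the YES/NO branches, which this sketch skips.
    """
    sop = {"tools": [], "steps": []}
    for line in text.splitlines():
        line = line.rstrip()
        if line.startswith("PROCEDURE:"):
            sop["name"] = line.split(":", 1)[1].strip()
        elif line.startswith("TRIGGER:"):
            sop["trigger"] = line.split(":", 1)[1].strip()
        elif line.startswith("TOOLS:"):
            sop["tools"] = [t.strip() for t in line.split(":", 1)[1].split(",")]
        else:
            # Numbered steps like "1. Greet the customer ..."
            m = re.match(r"^(\d+)\.\s+(.*)", line)
            if m:
                sop["steps"].append({"n": int(m.group(1)), "action": m.group(2)})
    return sop
```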

2. Analyze the SOP

soplex analyze refund.sop

Output shows step classification and cost estimates:

📊 SOP Analysis: Customer Refund Request

Step Classification:
🧠 LLM Steps:    4 (conversation)
⚡ CODE Steps:   2 (deterministic logic)
🔀 BRANCH Steps: 1 (conditional)

💰 Cost Estimate:
Pure LLM:    $0.0084
Hybrid:      $0.0019  (77% savings)
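The 77% figure is simply the ratio of the two estimates:

```python
pure_llm = 0.0084   # estimated cost with every step routed through the LLM
hybrid = 0.0019     # estimated cost with decisions run as code
savings = (pure_llm - hybrid) / pure_llm
print(f"savings: {savings:.0%}")  # savings: 77%
```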

3. Compile and run

# Compile SOP to executable graph
soplex compile refund.sop --output ./compiled/

# Interactive chat with the agent
soplex chat ./compiled/refund.json

🚀 Programmatic Python API

For engineers who prefer writing native Python over authoring text SOPs, soplex provides a PythonGraphBuilder. It bypasses LLM code synthesis entirely, so routing logic is defined as plain, deterministic Python.

from soplex import PythonGraphBuilder

builder = PythonGraphBuilder(name="Native Support Flow")

# 1. Add LLM interaction
builder.add_llm_step(id="start", action="Greet customer")

# 2. Add specific explicit Python logic
def process_data(state):
    state["verified"] = True
    return state
    
builder.add_code_step(
    id="verify", 
    action="Verify data", 
    handler_func=process_data
)

# 3. Add a branch decision point
builder.add_branch_step(id="check", action="Check verified")
builder.add_end_step(id="success", action="Finish")
builder.add_end_step(id="fail", action="Fail")

# 4. Wire edges with strict Python conditions
builder.add_edge("start", "verify")
builder.add_edge("verify", "check")
builder.add_edge("check", "success", condition_func=lambda s: s.get("verified"))
builder.add_edge("check", "fail", condition_func=lambda s: not s.get("verified"))

graph = builder.build()

🎯 Step Types

soplex automatically classifies each step based on keywords:

  • LLM (keywords: ask, greet, inform, confirm, explain): executes as conversational AI. Example: "Greet the customer warmly"
  • CODE (keywords: check, lookup, calculate, verify, process): executes as deterministic logic. Example: "Check if order was placed within 30 days"
  • HYBRID (mixed LLM + CODE keywords): executes as LLM + validation. Example: "Ask customer for order number and verify it"
  • BRANCH (keywords: if, when, check:, and other conditional patterns): executes as conditional logic. Example: "Check: Is the payment successful?"
  • END (keywords: end, complete, done, finish): terminal step. Example: "End the process successfully"
  • ESCALATE (keywords: escalate, hand off, transfer): human handoff. Example: "Escalate to supervisor"
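Keyword classification of this kind can be sketched as a small ordered lookup. This is an illustration of the approach, not soplex's actual classifier; the keyword tuples and the precedence order are assumptions drawn from the list above:

```python
# Checked in this order (Python dicts preserve insertion order), so more
# specific types like ESCALATE and END win over generic keyword matches.
KEYWORDS = {
    "ESCALATE": ("escalate", "hand off", "transfer"),
    "END": ("end", "complete", "done", "finish"),
    "BRANCH": ("if ", "when ", "check:"),
    "CODE": ("check", "lookup", "calculate", "verify", "process"),
    "LLM": ("ask", "greet", "inform", "confirm", "explain"),
}

def classify(action: str) -> str:
    text = action.lower()
    hits = [t for t, words in KEYWORDS.items() if any(w in text for w in words)]
    if "LLM" in hits and "CODE" in hits:
        return "HYBRID"                    # mixed conversational + logic keywords
    return hits[0] if hits else "LLM"      # default: let the LLM handle it
```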

⚙️ Configuration

Configure via environment variables (.env) or CLI flags:

# .env file
OPENAI_API_KEY=sk-...
SOPLEX_PROVIDER=openai
SOPLEX_MODEL=gpt-4o-mini
SOPLEX_TEMPERATURE=0.3

Supported providers:

  • openai - OpenAI GPT models
  • anthropic - Anthropic Claude models
  • gemini - Google Gemini models
  • ollama - Local Ollama models
  • litellm - Any LiteLLM-supported provider
  • custom - Custom OpenAI-compatible endpoint
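For illustration, here is one way the documented `SOPLEX_*` variables could be read into a config dict once `python-dotenv` has populated the environment. The variable names and provider list come from the docs above; the defaults and validation are assumptions:

```python
import os

VALID_PROVIDERS = {"openai", "anthropic", "gemini", "ollama", "litellm", "custom"}

def load_config():
    """Read soplex settings from the environment, with assumed defaults."""
    provider = os.environ.get("SOPLEX_PROVIDER", "openai")
    if provider not in VALID_PROVIDERS:
        raise ValueError(f"Unknown provider: {provider}")
    return {
        "provider": provider,
        "model": os.environ.get("SOPLEX_MODEL", "gpt-4o-mini"),
        "temperature": float(os.environ.get("SOPLEX_TEMPERATURE", "0.3")),
    }
```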

📊 CLI Commands

# Analyze SOP structure and costs
soplex analyze refund.sop --provider anthropic --model claude-sonnet-4-20250514

# Compile SOP to executable graph
soplex compile refund.sop --output ./compiled/

# Interactive agent chat
soplex chat ./compiled/refund.json

# Generate flowchart visualization
soplex visualize ./compiled/refund.json --output refund.svg

# Run test scenarios
soplex test ./compiled/refund.json --scenarios test_cases.yaml

# View execution statistics
soplex stats

🏗️ Architecture

Plain Text SOP → Parser → Classifier → Graph Builder → Executor
                    ↓         ↓            ↓           ↓
                 Structure  LLM/CODE    Execution    Runtime
                            Types       Graph
  • Parser: Converts plain text to structured data
  • Classifier: Determines execution type (LLM/CODE/HYBRID) via keywords
  • Graph Builder: Creates executable node graph with conditional edges
  • Executor: Runs graph step-by-step, calling LLM only when needed
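The cost savings come from the Executor invoking the LLM only on LLM-typed nodes. A hypothetical sketch of such a loop (the node and edge shapes here are assumptions, not soplex's internal representation):

```python
def run_graph(nodes, edges, start, state, call_llm):
    """Walk the graph from `start`, calling the LLM only for LLM steps."""
    current = start
    while current is not None:
        node = nodes[current]
        if node["type"] == "LLM":
            state = call_llm(node["action"], state)   # the only LLM call site
        elif node["type"] == "CODE":
            state = node["handler"](state)            # deterministic Python
        elif node["type"] == "END":
            return state
        # BRANCH nodes do nothing themselves; routing happens via edge
        # conditions. Follow the first edge whose condition holds
        # (unconditional edges always hold); stop if none matches.
        current = next(
            (dst for src, dst, cond in edges
             if src == current and (cond is None or cond(state))),
            None,
        )
    return state
```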

🧪 Testing

# Run all tests (without API calls)
pytest tests/ -v

# Run with real API integration tests
pytest tests/test_e2e.py -v -m e2e

# Run specific test categories
pytest tests/test_parser.py -v
pytest tests/test_classifier.py -v

🔐 Security

  • Environment variables loaded securely via python-dotenv
  • API keys never logged or exposed in output
  • Production-grade error handling and validation
  • Comprehensive input sanitization

📈 Cost Savings

Traditional pure-LLM agents call the LLM for every step. soplex calls an LLM only for conversation steps, running logic and decisions as deterministic code:

Traditional:  🧠🧠🧠🧠🧠🧠🧠  (7 LLM calls)
soplex:       🧠⚡🧠⚡⚡🧠⚡  (3 LLM calls, 4 code calls)
Savings:      ~57-77% cost reduction

🤝 Contributing

git clone https://github.com/pratikbhande/soplex
cd soplex
python -m venv venv
source venv/bin/activate  # or venv\Scripts\activate on Windows
pip install -e ".[dev]"
pytest

📄 License

MIT License - see LICENSE file for details.
