
The Convergence

API Optimization Framework powered by evolutionary algorithms, multi-armed bandits, and agent societies

Requires Python 3.11+ · Apache 2.0 License

The Convergence automatically finds optimal API configurations through intelligent experimentation. Instead of manually tuning parameters (model, temperature, tokens, etc.), it runs automated experiments, evaluates results, and evolves better configurations over multiple generations.

🚀 Quick Start

Installation

pip install the-convergence

2-Minute Example

# Interactive setup wizard
convergence init

# Run optimization
convergence optimize optimization.yaml

Or use the SDK:

import asyncio

from convergence import run_optimization
from convergence.types import (
    ConvergenceConfig,
    ApiConfig,
    SearchSpaceConfig,
    EvaluationConfig,
    RunnerConfig,
)

config = ConvergenceConfig(
    api=ApiConfig(name="my_api", endpoint="https://api.example.com/v1/chat"),
    search_space=SearchSpaceConfig(parameters={
        "temperature": {"type": "float", "min": 0.1, "max": 1.5},
        "model": {"type": "categorical", "choices": ["gpt-4o-mini", "gpt-4o"]}
    }),
    evaluation=EvaluationConfig(required_metrics=["quality"], weights={"quality": 1.0}),
    runner=RunnerConfig(generations=10, population=20)
)

result = asyncio.run(run_optimization(config))
print(f"Best config: {result['best_config']}")
print(f"Best score: {result['best_score']}")

🎯 What It Does

The Convergence optimizes API parameters to maximize performance metrics:

  • Quality - Response quality (LLM judge, similarity, exact match)
  • Latency - Response time (milliseconds)
  • Cost - Price per API call (USD)

Example: Find the best temperature and model combination that maximizes quality while minimizing cost.
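A common way to fold several metrics into one fitness value is a weighted sum of normalized scores. The sketch below is illustrative, not The Convergence's exact scoring formula; it assumes each metric has already been normalized to [0, 1] with higher meaning better (so latency and cost must be inverted first):

```python
def weighted_score(metrics: dict, weights: dict) -> float:
    """Combine normalized metric scores (each in [0, 1], higher is better)
    into a single fitness value via a weighted sum. Illustrative only."""
    return sum(weights[name] * metrics[name] for name in weights)

# Example: strong quality, middling latency, decent cost
score = weighted_score(
    {"quality": 0.9, "latency": 0.7, "cost": 0.8},
    {"quality": 0.6, "latency": 0.3, "cost": 0.1},
)
# 0.6*0.9 + 0.3*0.7 + 0.1*0.8 = 0.83
```

Raising a metric's weight tilts the search toward configs that win on that axis.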

🧬 How It Works

The Convergence combines three optimization strategies:

  1. Multi-Armed Bandits (MAB) - Intelligent exploration vs exploitation

    • Thompson Sampling balances trying new configs vs exploiting known good ones
    • Bayesian probability guides selection
  2. Evolutionary Algorithms - Genetic mutation and crossover

    • Mutation: Random parameter changes
    • Crossover: Combine two successful configs
    • Selection: Keep top performers (elitism)
  3. Reinforcement Learning (RL) - Meta-learning from history

    • Learns which parameter ranges work best
    • Adjusts evolution parameters dynamically
    • Hierarchical learning across runs

Optional: Agent Society (RLP + SAO) for advanced reasoning and self-improvement.
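The core Thompson Sampling idea from step 1 fits in a few lines: model each config ("arm") as a Beta distribution over its success rate, sample from every posterior, and play the arm with the highest draw. This is a generic sketch of the technique, not The Convergence's internal implementation:

```python
import random

def thompson_select(arms: dict) -> str:
    """Thompson Sampling: `arms` maps config name -> (successes, failures).

    Sample each arm's Beta(successes+1, failures+1) posterior and pick the
    arm with the highest draw. Arms with strong track records usually win,
    but uncertain arms still get explored. Illustrative sketch only.
    """
    return max(
        arms,
        key=lambda a: random.betavariate(arms[a][0] + 1, arms[a][1] + 1),
    )

arms = {"temp_0.3": (8, 2), "temp_0.9": (3, 7)}
choice = thompson_select(arms)  # usually "temp_0.3", occasionally the underdog
```

The randomness is the point: exploitation and exploration emerge from posterior sampling rather than a hand-tuned epsilon.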

📖 Documentation

🏗️ Architecture

Core Components

  • Optimization Engine - Coordinates MAB, Evolution, RL
  • API Caller - Makes HTTP requests with config parameters
  • Evaluator - Scores responses against test cases
  • Storage - Persists results (SQLite, File, Convex, Memory)
  • Adapters - Provider-specific request/response transformations

Entry Points

  1. CLI - convergence optimize config.yaml
  2. SDK - from convergence import run_optimization
  3. Runtime - Per-request bandit selection for production

🔧 Features

Optimization Modes

  • Batch Optimization - Full optimization runs (CLI/SDK)
  • Runtime Selection - Per-request config selection (production)
  • Continuous Evolution - Arms evolve during production use

Evaluation

  • Custom Evaluators - Write Python functions for domain-specific scoring
  • Built-in Metrics - Quality, latency, cost, exact match, similarity
  • Multi-Objective - Optimize multiple metrics simultaneously
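A custom evaluator is typically just a Python function that scores a response against a test case. The signature below is a hypothetical example of the shape such a function takes (consult the configuration reference for the exact contract The Convergence expects):

```python
def keyword_coverage(response: str, expected: dict) -> float:
    """Hypothetical domain-specific evaluator: returns the fraction of
    required keywords present in the response, in [0, 1].

    The (response, expected) -> float signature is illustrative, not
    The Convergence's documented evaluator interface.
    """
    keywords = expected.get("keywords", [])
    if not keywords:
        return 1.0
    hits = sum(1 for kw in keywords if kw.lower() in response.lower())
    return hits / len(keywords)

score = keyword_coverage(
    "The capital of France is Paris.",
    {"keywords": ["France", "Paris"]},
)
# → 1.0
```

Returning a float in [0, 1] keeps custom scores directly composable with the built-in weighted metrics.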

Storage

  • Multi-Backend - SQLite (default), File, Convex, Memory
  • Legacy System - Tracks optimization history across runs
  • Warm-Start - Resume from previous winners

Provider Support

  • LLM APIs - OpenAI, Azure OpenAI, Groq, Google Gemini
  • Web Automation - BrowserBase
  • Agno Agents - Discord, Gmail, Reddit agents
  • Local Functions - Optimize internal Python functions

📦 Installation Options

Basic Installation

pip install the-convergence

With Agent Society (RLP + SAO)

pip install "the-convergence[agents]"

With All Features

pip install "the-convergence[all]"

Development Mode

git clone https://github.com/persist-os/the-convergence.git
cd the-convergence
pip install -e ".[dev]"

🎓 Example Use Cases

1. LLM API Optimization

Optimize ChatGPT parameters for your use case:

api:
  name: "openai_chat"
  endpoint: "https://api.openai.com/v1/chat/completions"
  auth:
    type: "bearer"
    token_env: "OPENAI_API_KEY"

search_space:
  parameters:
    model: ["gpt-4o-mini", "gpt-4o"]
    temperature: [0.3, 0.5, 0.7, 0.9]
    max_tokens: [500, 1000, 2000]

evaluation:
  test_cases:
    path: "test_cases.json"
  metrics:
    quality: {weight: 0.6, type: "llm_judge"}
    latency_ms: {weight: 0.3}
    cost_usd: {weight: 0.1}

2. Context Enrichment Optimization

Optimize MAB parameters for context enrichment:

from convergence import run_optimization
from convergence.types import ConvergenceConfig, ApiConfig, SearchSpaceConfig

config = ConvergenceConfig(
    api=ApiConfig(name="context_enrichment", endpoint="http://backend:8000/api/enrich"),
    search_space=SearchSpaceConfig(parameters={
        "threshold": {"type": "float", "min": 0.1, "max": 0.5},
        "limit": {"type": "int", "min": 5, "max": 20}
    }),
    # ... evaluation config
)

result = await run_optimization(config)

3. Runtime Per-Request Selection

Use optimized configs in production:

from convergence import configure_runtime, runtime_select, runtime_update

# Configure once
await configure_runtime("context_enrichment", config=config, storage=storage)

# Per request
selection = await runtime_select("context_enrichment", user_id="user_123")
# Use selection.params in your application

# After request
await runtime_update("context_enrichment", user_id="user_123", 
                     decision_id=selection.decision_id, reward=0.8)

🔍 How It Works (Detailed)

Optimization Flow

  1. Initialization - Load config, validate, initialize storage
  2. Generation Loop (for each generation):
    • Population Generation - Create configs (random, mutation, crossover)
    • MAB Selection - Thompson Sampling selects configs to test
    • Parallel Execution - Test configs against test cases
    • Evaluation - Score responses (quality, latency, cost)
    • Evolution - Generate next generation (elite + mutation + crossover)
    • RL Meta-Optimization - Adjust evolution parameters
    • Early Stopping - Stop if converged or max generations reached
  3. Results Export - Save best config, all results, reports
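Stripped of MAB selection, parallel execution, and RL meta-optimization, the generation loop above reduces to an elitist evolutionary skeleton. This is an illustrative sketch of that shape, not the engine's actual code:

```python
import random

def optimize(initial_population, generations, evaluate, mutate, crossover, elite_k=2):
    """Skeleton of the elitist generation loop described above (illustrative)."""
    population = list(initial_population)
    for _ in range(generations):
        # Evaluation: score every config, best first
        scored = sorted(population, key=evaluate, reverse=True)
        # Selection: elitism keeps the top performers unchanged
        next_gen = scored[:elite_k]
        # Evolution: refill the population via crossover + mutation of top configs
        while len(next_gen) < len(population):
            a, b = random.sample(scored[: max(elite_k, 2)], 2)
            next_gen.append(mutate(crossover(a, b)))
        population = next_gen
    return max(population, key=evaluate)
```

For example, maximizing `-abs(x - 5)` over floats with averaging crossover and small random mutations converges on values near 5 within a few generations.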

Runtime Flow

  1. Selection - Thompson Sampling selects arm (config) for request
  2. Execution - Application uses selected config
  3. Update - Record reward (quality, latency, cost)
  4. Evolution - Periodically evolve arms (mutation, crossover)

🛠️ Configuration

Minimal Config

api:
  name: "my_api"
  endpoint: "https://api.example.com/v1/endpoint"
  auth:
    type: "bearer"
    token_env: "API_KEY"

search_space:
  parameters:
    param1: {type: "float", min: 0, max: 1}

evaluation:
  test_cases:
    inline:
      - {id: "test_1", input: {}, expected: {}}
  metrics:
    accuracy: {weight: 1.0}

optimization:
  algorithm: "mab_evolution"
  evolution:
    population_size: 10
    generations: 5

See YAML_CONFIGURATION_REFERENCE.md for the complete reference.

📊 Results

Results are saved in results/ directory:

  • best_config.json - Best configuration found
  • detailed_results.json - All configs tested with scores
  • detailed_results.csv - CSV export
  • report.md - Markdown report with analysis
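The exported winner can be loaded straight back into your application. The helper below assumes best_config.json is a flat JSON object of parameter values (which matches the export described above, but verify against your own output):

```python
import json
from pathlib import Path

def load_best_config(results_dir: str = "results") -> dict:
    """Load the winning configuration exported by an optimization run.

    Assumes best_config.json is a flat JSON object of parameter values,
    e.g. {"temperature": 0.7, "model": "gpt-4o-mini"}.
    """
    return json.loads((Path(results_dir) / "best_config.json").read_text())

# best = load_best_config()
# best["temperature"], best["model"], ...
```

From there the values can be passed directly as request parameters, or fed back in as a warm start for the next run.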

🧪 Testing

# Run tests
pytest

# Test SDK import
python -c "from convergence import run_optimization; print('✅ Ready!')"

🤝 Contributing

See CONTRIBUTING.md for guidelines.

📝 License

Apache 2.0 - See LICENSE file.

🆘 Support

🗺️ Roadmap

  • Automated test suite with pytest
  • Performance benchmarking suite
  • Plugin development tutorial
  • Video documentation
  • Integration examples for popular APIs

🙏 Acknowledgments

Built by the PersistOS team.


Happy optimizing! 🚀
