coreason-optimizer

Automated Prompt Engineering / LLM Compilation / DSPy Integration for CoReason-AI

License: Prosperity 3.0 | CI Status | Code Style: Ruff | Documentation

coreason-optimizer is the "Compiler" for the CoReason Agentic Platform. It automates prompt engineering by treating prompts as trainable weights, optimizing them against ground-truth datasets to maximize performance metrics.


Installation

pip install coreason-optimizer

Features

  • Automated Optimization: Rewrites instructions and selects examples to maximize a measured score, rather than relying on human intuition.
  • Optimization-as-a-Service: Run as a microservice API to compile prompts on-demand.
  • Model-Specific Compilation: Generates optimized prompts specifically tuned for target models (e.g., GPT-4, Claude 3.5).
  • Continuous Learning: Re-runs optimization on recent logs to patch prompts against data drift.
  • Mutate-Evaluate Loop: Systematic cycle of drafting, evaluating, diagnosing, mutating, and selecting prompts.
  • Strategies: Includes BootstrapFewShot (mining successful traces) and MIPRO (Multi-prompt Instruction PRoposal Optimizer).
  • Integration: Works seamlessly with coreason-construct, coreason-archive, and coreason-assay.
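
The mutate-evaluate loop listed above can be sketched as plain Python. This is a conceptual illustration, not the library's actual implementation; `mutate` and `evaluate` are placeholder callables you would supply (e.g. an LLM-based rewriter and a metric over a validation set):

```python
def mutate_evaluate_loop(seed_prompt, mutate, evaluate, rounds=5, pool_size=4):
    """Conceptual sketch: draft, evaluate, mutate, and select prompts."""
    best_prompt, best_score = seed_prompt, evaluate(seed_prompt)
    for _ in range(rounds):
        # Mutate: propose variations of the current best prompt.
        candidates = [mutate(best_prompt) for _ in range(pool_size)]
        # Evaluate: score every candidate against the metric.
        scored = [(evaluate(p), p) for p in candidates]
        # Select: keep the highest-scoring candidate if it improves on the best.
        top_score, top_prompt = max(scored, key=lambda t: t[0])
        if top_score > best_score:
            best_score, best_prompt = top_score, top_prompt
    return best_prompt, best_score
```

In practice the evaluate step dominates the cost, since each candidate prompt must be run against the validation set.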

For full product requirements, see docs/product_requirements.md.
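
The BootstrapFewShot strategy named in the features list can likewise be sketched: run the agent over labeled examples, keep the traces the metric judges successful, and reuse them as few-shot demonstrations. The function and field names below are illustrative assumptions, not the library's API:

```python
def bootstrap_few_shot(traces, metric, k=4):
    """Sketch of BootstrapFewShot: mine successful traces as demonstrations.

    `traces` is a list of (inputs, prediction, gold) tuples and `metric`
    scores a prediction against its gold answer. Hypothetical names only.
    """
    # Keep only traces that the metric judges fully successful.
    successes = [t for t in traces if metric(t[1], t[2]) >= 1.0]
    # Use the first k successes as few-shot examples in the compiled prompt.
    return [{"inputs": inp, "output": pred} for inp, pred, _ in successes[:k]]
```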

Usage

You can use coreason-optimizer as a Python library, a CLI tool, or a microservice.

1. Python Library

from coreason_optimizer import OptimizerConfig, PromptOptimizer
from coreason_optimizer.core.interfaces import Construct
from coreason_optimizer.data import Dataset

# Define Agent
class MockAgent(Construct):
    inputs = ["question"]
    outputs = ["answer"]
    system_prompt = "You are a helpful assistant."

agent = MockAgent()

# Compile
dataset = Dataset.from_csv("data/gold_set.csv")
train_set, val_set = dataset.split(train_ratio=0.8)

optimizer = PromptOptimizer(config=OptimizerConfig(target_model="gpt-4o"))
manifest = optimizer.compile(agent, train_set, val_set)

print(f"Optimized Score: {manifest.performance_metric}")

2. Server Mode (Microservice)

Run the optimizer as a standalone service using Docker:

docker run -p 8000:8000 -e OPENAI_API_KEY=$OPENAI_API_KEY coreason-optimizer:latest

Then call the API:

curl -X POST http://localhost:8000/optimize -d @request.json
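
The same call can be made from Python using only the standard library. The payload fields below are illustrative assumptions, not the documented request schema; consult docs/usage.md for the actual format:

```python
import json
from urllib.request import Request, urlopen

def optimize_via_api(payload, url="http://localhost:8000/optimize"):
    """POST an optimization request to a running optimizer service."""
    req = Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urlopen(req) as resp:  # blocks until the service responds
        return json.load(resp)

# Hypothetical payload; the real request schema lives in docs/usage.md.
request_body = {"target_model": "gpt-4o", "dataset": "data/gold_set.csv"}
```

With the Docker container from the previous step running, `optimize_via_api(request_body)` would return the service's JSON response.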

For detailed instructions, see docs/usage.md.
