# coreason-optimizer

Automated Prompt Engineering / LLM Compilation / DSPy Integration for CoReason-AI
coreason-optimizer is the "Compiler" for the CoReason Agentic Platform. It automates prompt engineering by treating prompts as trainable weights, optimizing them against ground-truth datasets to maximize performance metrics.
## Installation

```shell
pip install coreason-optimizer
```
## Features
- Automated Optimization: Rewrites instructions and selects examples to maximize a score, not human intuition.
- Optimization-as-a-Service: Run as a microservice API to compile prompts on-demand.
- Model-Specific Compilation: Generates optimized prompts specifically tuned for target models (e.g., GPT-4, Claude 3.5).
- Continuous Learning: Re-runs optimization on recent logs to patch prompts against data drift.
- Mutate-Evaluate Loop: Systematic cycle of drafting, evaluating, diagnosing, mutating, and selecting prompts.
- Strategies: Includes BootstrapFewShot (mining successful traces) and MIPRO (Multi-prompt Instruction PRoposal Optimizer).
- Integration: Works seamlessly with coreason-construct, coreason-archive, and coreason-assay.
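The mutate-evaluate loop described above can be sketched in plain Python. This is a minimal illustration of the cycle (draft, evaluate, mutate, select), not the library's implementation: `mutate`, `score`, and the tweak strings are hypothetical stand-ins for the real strategies.

```python
import random

random.seed(0)

def mutate(prompt: str) -> str:
    """Hypothetical mutation step: append one of a few instruction tweaks."""
    tweaks = [" Be concise.", " Think step by step.", " Cite your sources."]
    return prompt + random.choice(tweaks)

def score(prompt: str) -> float:
    """Stand-in evaluator; a real one would run the agent on a gold set."""
    return sum(kw in prompt for kw in ("concise", "step by step")) / 2

def optimize(seed_prompt: str, rounds: int = 10) -> tuple[str, float]:
    best, best_score = seed_prompt, score(seed_prompt)
    for _ in range(rounds):
        candidate = mutate(best)   # draft a variant
        s = score(candidate)       # evaluate against the metric
        if s > best_score:         # select only improvements
            best, best_score = candidate, s
    return best, best_score

prompt, metric = optimize("You are a helpful assistant.")
print(metric)
```

A real run would replace `score` with an LLM evaluation over the validation split, which is what makes the loop expensive and worth caching.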
For full product requirements, see docs/product_requirements.md.
## Usage

You can use coreason-optimizer as a Python library, a CLI tool, or a microservice.

### 1. Python Library
```python
from coreason_optimizer import OptimizerConfig, PromptOptimizer
from coreason_optimizer.core.interfaces import Construct
from coreason_optimizer.data import Dataset

# Define an agent
class MockAgent(Construct):
    inputs = ["question"]
    outputs = ["answer"]
    system_prompt = "You are a helpful assistant."

agent = MockAgent()

# Compile
dataset = Dataset.from_csv("data/gold_set.csv")
train_set, val_set = dataset.split(train_ratio=0.8)

optimizer = PromptOptimizer(config=OptimizerConfig(target_model="gpt-4o"))
manifest = optimizer.compile(agent, train_set, val_set)

print(f"Optimized Score: {manifest.performance_metric}")
```
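`Dataset.from_csv` above reads a gold-standard file. The column names used here (`question`, `answer`) are an assumption chosen to match `MockAgent`'s declared inputs and outputs, not a documented schema; a minimal file of that shape can be generated with the standard library:

```python
import csv
import os

os.makedirs("data", exist_ok=True)

# Hypothetical gold set: one column per agent input and output.
rows = [
    {"question": "What is 2 + 2?", "answer": "4"},
    {"question": "What is the capital of France?", "answer": "Paris"},
]

with open("data/gold_set.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["question", "answer"])
    writer.writeheader()      # header row: question,answer
    writer.writerows(rows)
```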
### 2. Server Mode (Microservice)

Run the optimizer as a standalone service using Docker:

```shell
docker run -p 8000:8000 -e OPENAI_API_KEY=$OPENAI_API_KEY coreason-optimizer:latest
```
Then call the API:

```shell
curl -X POST http://localhost:8000/optimize -d @request.json
```
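The shape of `request.json` is not shown here, so the fields below (`agent`, `config`, `dataset`) are illustrative guesses mirroring the Python example, not the service's documented schema; consult docs/usage.md for the real one. Building the payload from Python might look like:

```python
import json

# Illustrative payload; every field name here is an assumption.
payload = {
    "agent": {
        "inputs": ["question"],
        "outputs": ["answer"],
        "system_prompt": "You are a helpful assistant.",
    },
    "config": {"target_model": "gpt-4o", "train_ratio": 0.8},
    "dataset": "data/gold_set.csv",
}

# Write the file the curl command above posts with -d @request.json.
with open("request.json", "w") as f:
    json.dump(payload, f, indent=2)
```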
For detailed instructions, see docs/usage.md.
## Download files
### File details: coreason_optimizer-0.3.0.tar.gz

#### File metadata
- Download URL: coreason_optimizer-0.3.0.tar.gz
- Upload date:
- Size: 27.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
#### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | aa664a7335c0fb5e00e7df70901bd06cc72f155f65d0d3319daaa231db83aa64 |
| MD5 | f698015c9594f409a75a8f617d01c973 |
| BLAKE2b-256 | 020ea05685293cd3eb86062cc41917f9329467e8b4db214993d823a59476050e |
#### Provenance

The following attestation bundles were made for coreason_optimizer-0.3.0.tar.gz:

Publisher: publish.yml on CoReason-AI/coreason-optimizer

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: coreason_optimizer-0.3.0.tar.gz
- Subject digest: aa664a7335c0fb5e00e7df70901bd06cc72f155f65d0d3319daaa231db83aa64
- Sigstore transparency entry: 877364106
- Sigstore integration time:
- Permalink: CoReason-AI/coreason-optimizer@ec5af0817bc6856080237396683505a7aaf945a8
- Branch / Tag: refs/tags/v0.3.0
- Owner: https://github.com/CoReason-AI
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@ec5af0817bc6856080237396683505a7aaf945a8
- Trigger Event: release
### File details: coreason_optimizer-0.3.0-py3-none-any.whl

#### File metadata
- Download URL: coreason_optimizer-0.3.0-py3-none-any.whl
- Upload date:
- Size: 41.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
#### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | bc506f24b69b7cf17e938c9ad71c0819a6a223ebf14d1249a14288521dcecf11 |
| MD5 | 5f51a4ebf4a1f8c71a14b358e38acd93 |
| BLAKE2b-256 | 3bdcae3f39ef28871ea1dcfe5a79c7c8bbdedbd1fb04252c54a949a567a14d26 |
#### Provenance

The following attestation bundles were made for coreason_optimizer-0.3.0-py3-none-any.whl:

Publisher: publish.yml on CoReason-AI/coreason-optimizer

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: coreason_optimizer-0.3.0-py3-none-any.whl
- Subject digest: bc506f24b69b7cf17e938c9ad71c0819a6a223ebf14d1249a14288521dcecf11
- Sigstore transparency entry: 877364145
- Sigstore integration time:
- Permalink: CoReason-AI/coreason-optimizer@ec5af0817bc6856080237396683505a7aaf945a8
- Branch / Tag: refs/tags/v0.3.0
- Owner: https://github.com/CoReason-AI
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@ec5af0817bc6856080237396683505a7aaf945a8
- Trigger Event: release