A Python library to optimize prompt drafts using LLMs
🧠 leo-prompt-optimizer
leo-prompt-optimizer is a Python library that helps developers optimize raw LLM prompts into structured, high-performance instructions using real LLM intelligence.
It leverages open-source models via the Groq API (such as Mixtral or LLaMA 3) and also supports OpenAI, making it fast, flexible, and production-ready.
🚀 Features
- 🛠️ Refines vague, messy, or unstructured prompts
- 🧠 Follows a 9-step prompt engineering framework
- 🧩 Supports contextual optimization (with user input & LLM output)
- 🔁 Works with both Groq and OpenAI
- ⚡ Blazing-fast open models via Groq
- 🔐 Secure API key management via `.env` or helper functions
- 🎛️ Lets you choose the model (`gpt-3.5-turbo`, `mixtral-8x7b`, `llama3`, etc.)
📦 Installation
```bash
pip install leo-prompt-optimizer
```
🔧 Setup: API Keys
You can provide your API key in two ways:
✅ Option A: .env file (recommended)
At the root of your project, create a .env file:
```env
GROQ_API_KEY=sk-your-groq-key
```
or
```env
OPENAI_API_KEY=sk-your-openai-key
```
Then, in your Python script:
```python
from dotenv import load_dotenv
load_dotenv()  # 👈 Required to load the API keys from .env
```
✅ Option B: Set programmatically
```python
from leo_prompt_optimizer import set_groq_api_key, set_openai_api_key

set_groq_api_key("sk-your-groq-key")
set_openai_api_key("sk-your-openai-key")
```
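If you mix both approaches, it can help to resolve the key explicitly before calling the library. The helper below is a hypothetical sketch (not part of leo-prompt-optimizer) that looks up the key for a provider in the environment, which is where `load_dotenv()` places values from `.env`:

```python
import os

# Hypothetical helper (not part of leo-prompt-optimizer): look up the API key
# for a provider in the environment, where load_dotenv() puts .env values.
def resolve_api_key(provider):
    env_var = {"groq": "GROQ_API_KEY", "openai": "OPENAI_API_KEY"}[provider]
    return os.environ.get(env_var)  # None if the variable is not set
```

You could then pass the result to `set_groq_api_key()` / `set_openai_api_key()`, or fail early with a clear message when it is `None`.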
✍️ Usage Example
```python
from dotenv import load_dotenv
load_dotenv()  # Only needed if using .env for API keys

from leo_prompt_optimizer import optimize_prompt, set_groq_api_key, set_openai_api_key

# Optional: Set API key manually (Groq or OpenAI)
# set_openai_api_key("sk-...")
# set_groq_api_key("sk-...")

optimized = optimize_prompt(
    prompt_draft="[YOUR PROMPT]",
    user_input="[POTENTIAL INPUT EXAMPLE]",   # Optional
    llm_output="[POTENTIAL OUTPUT EXAMPLE]",  # Optional
    provider="[YOUR PROVIDER]",               # "groq" (default) or "openai"
    model="[YOUR MODEL]",                     # Optional: model choice based on your provider (e.g. "gpt-4", "llama3-70b")
    base_url="[YOUR BASE_URL]"                # Optional: if you have a specific base URL
)

print(optimized)
```
🧠 `user_input` and `llm_output` are optional but helpful when refining an existing prompt flow.
🎛️ You can also specify the provider (`groq` or `openai`) and the exact model you want.
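Since most of the arguments are optional, one convenient pattern is to build the keyword arguments first and forward only the values you actually supply. The helper below is a hypothetical convenience wrapper, not part of the library:

```python
# Hypothetical convenience wrapper (not part of leo-prompt-optimizer):
# build the kwargs for optimize_prompt, dropping optional values left unset.
def build_optimize_kwargs(prompt_draft, user_input=None, llm_output=None,
                          provider="groq", model=None, base_url=None):
    kwargs = {"prompt_draft": prompt_draft, "provider": provider}
    optional = {"user_input": user_input, "llm_output": llm_output,
                "model": model, "base_url": base_url}
    kwargs.update({k: v for k, v in optional.items() if v is not None})
    return kwargs
```

You could then call `optimize_prompt(**build_optimize_kwargs("Summarize this article", model="llama3-70b"))`.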
📘 Output Format
The returned optimized prompt follows a structured format:
```
Role:
[Define the LLM's persona]

Task:
[Clearly state the specific objective]

Instructions:
* Step-by-step subtasks

Context:
[Any relevant background, constraints, domain]

Output Format:
[e.g., bullet list, JSON, summary]

User Input:
[Original user input or example]
```
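Because the output follows these labeled sections, it is easy to post-process. The helper below is an illustrative sketch (an assumption on our part, not shipped with the library) that splits an optimized prompt into a dict keyed by section name:

```python
# Illustrative helper (not part of leo-prompt-optimizer): split a structured
# optimized prompt into its labeled sections for programmatic use.
def parse_sections(optimized):
    headers = ("Role:", "Task:", "Instructions:", "Context:",
               "Output Format:", "User Input:")
    sections, current = {}, None
    for line in optimized.splitlines():
        stripped = line.strip()
        if stripped in headers:
            current = stripped.rstrip(":")   # start a new section
            sections[current] = []
        elif current is not None:
            sections[current].append(line)   # body line of current section
    return {k: "\n".join(v).strip() for k, v in sections.items()}
```

For example, `parse_sections(optimized)["Role"]` would return just the persona definition.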
🧪 Quick Test (Optional)
```bash
python3 test_import.py
```
This will check:
- ✅ Import works
- ✅ API keys are detected
- ✅ LLM returns optimized result
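A minimal self-check along these lines can be sketched as follows (the bundled `test_import.py` may differ; this version only covers the import and API-key checks, not the live LLM call):

```python
import os

def run_checks():
    """Report import status and whether API keys are present in the environment."""
    results = []
    try:
        import leo_prompt_optimizer  # noqa: F401
        results.append("import: ok")
    except ImportError:
        results.append("import: FAILED")
    for var in ("GROQ_API_KEY", "OPENAI_API_KEY"):
        status = "set" if os.environ.get(var) else "missing"
        results.append(f"{var}: {status}")
    return results
```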
🧯 Common Errors & Fixes
| Error | Solution |
|---|---|
| Missing `GROQ_API_KEY` | Ensure it's in `.env` and loaded with `load_dotenv()`, or passed via `set_groq_api_key()` |
| Missing `OPENAI_API_KEY` | Same as above, but with `set_openai_api_key()` |
| Invalid model or 403 | The model may be deprecated or restricted. Try another model or check the Groq models list |
| `ModuleNotFoundError` | Ensure `leo-prompt-optimizer` is installed in the right Python environment |
💡 Why Use It?
Prompt quality is critical when building with LLMs.
leo-prompt-optimizer helps you:
- ✅ Make prompts explicit and structured
- 🚫 Reduce hallucinations
- 🔁 Increase consistency and reuse
- 🧱 Standardize prompt formats across your stack
File details
Details for the file leo_prompt_optimizer-0.1.6.tar.gz.
File metadata
- Download URL: leo_prompt_optimizer-0.1.6.tar.gz
- Upload date:
- Size: 6.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `0750d29bfae439d0348acee4d4070a83a7be01e0745ebb3e22acdf788b6cf184` |
| MD5 | `24b9aa2d09fc6b71d632eefebbd59ecf` |
| BLAKE2b-256 | `ccd07adf22d040c8917d1dc0b185029f2e11b388eda638d68f47c78d6b053270` |
File details
Details for the file leo_prompt_optimizer-0.1.6-py3-none-any.whl.
File metadata
- Download URL: leo_prompt_optimizer-0.1.6-py3-none-any.whl
- Upload date:
- Size: 6.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `f1ca94fe06f2b37d2369c73cf0a4e7991701e7f7b08294484163d9511d332384` |
| MD5 | `649075c275ed11a55aee21d8e66b6d71` |
| BLAKE2b-256 | `f62437b693814da9352a627b32fd3a44aacc14d7491190cceae2a7eb2596bee7` |
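The published digests above can be checked locally after downloading a file. A minimal sketch using the standard library's `hashlib` (the file path is a placeholder for wherever you saved the download):

```python
import hashlib

def sha256_of(path):
    """Compute the SHA256 hex digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Example (placeholder path): compare the result against the SHA256 values
# in the tables above before installing from a downloaded file.
# sha256_of("leo_prompt_optimizer-0.1.6.tar.gz")
```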