A Python library to optimize prompt drafts using LLMs
🧠 leo-prompt-optimizer
leo-prompt-optimizer is a Python library that helps developers optimize raw LLM prompts into structured, high-performance instructions using real LLM intelligence.
It leverages open-source models (such as Mixtral or LLaMA 3) via the Groq API, making it fast, affordable, and production-ready.
🚀 Features
- 🛠️ Refines vague, messy, or unstructured prompts
- 🧠 Follows a 9-step prompt engineering framework
- 🧩 Supports contextual optimization (with user input & LLM output)
- ⚡ Runs on blazing-fast open-source LLMs via Groq
- 🔐 Secure API key management with `.env` or a helper function
📦 Installation
```bash
pip install leo-prompt-optimizer
```
🔧 Setup: API Key
You can provide your Groq API key in two ways:
✅ Option A: .env file (recommended)
At the root of your project, create a .env file:
```
GROQ_API_KEY=sk-your-api-key-here
```
Then, in your Python script (before calling the optimizer), load the .env file:
```python
from dotenv import load_dotenv
load_dotenv()  # 👈 Required to make the .env key visible

from leo_prompt_optimizer import optimize_prompt

optimized = optimize_prompt("Your draft prompt here")
```
⚠️ Important: `load_dotenv()` must be called before importing or using the optimizer, or the key won't be found.
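To see why the ordering matters, here is a simplified sketch of what `load_dotenv()` does: it reads `KEY=VALUE` lines from a file and exports them into `os.environ`, which is where the optimizer later looks for the key. The `load_env_file` helper below is only an illustration, not part of the library, and the real python-dotenv additionally handles quoting, comments, and variable interpolation.

```python
import os
import tempfile

def load_env_file(path):
    # Parse KEY=VALUE lines and export them, without overriding
    # variables that are already set -- the same default behaviour
    # as python-dotenv's load_dotenv().
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

# Demo with a throwaway .env file.
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as fh:
    fh.write("GROQ_API_KEY=sk-your-api-key-here\n")
    env_path = fh.name

os.environ.pop("GROQ_API_KEY", None)  # clean slate for the demo
load_env_file(env_path)
print(os.environ["GROQ_API_KEY"])  # sk-your-api-key-here
os.remove(env_path)
```

Because the key only exists in `os.environ` after the file is parsed, anything that reads the variable before that point sees nothing.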
✅ Option B: Set programmatically
Useful in notebooks, scripts, or cloud functions:
```python
from leo_prompt_optimizer import set_groq_api_key, optimize_prompt

set_groq_api_key("sk-your-api-key")
optimized = optimize_prompt("Your draft prompt here")
```
✍️ Usage Example
🔹 Minimal use (just the prompt draft)
```python
from dotenv import load_dotenv
load_dotenv()

from leo_prompt_optimizer import optimize_prompt

draft = "I want to generate a structured planning for a GenAI course with communication adapted to enrolled members."
optimized = optimize_prompt(draft)
print(optimized)
```
🔹 With optional user_input and llm_output
You can provide additional context to help the optimizer refine the prompt even better:
```python
from leo_prompt_optimizer import set_groq_api_key, optimize_prompt

set_groq_api_key("sk-your-api-key")

draft = "Help me create a better prompt to extract user feedback from a UI test session."
user_input = "The user said the interface was 'cluttered and confusing' but didn't explain why."
llm_output = "You could ask: 'Which parts feel confusing?' or 'What would make it feel more intuitive?'"

optimized = optimize_prompt(draft, user_input=user_input, llm_output=llm_output)
print(optimized)
```
🧠 `user_input` and `llm_output` are optional, but highly recommended when refining an existing prompt flow.
📘 Output Format
The returned optimized prompt follows a structured format:
```
Role:
[Define the LLM's persona]

Task:
[Clearly state the specific objective]

Instructions:
* Step-by-step subtasks

Context:
[Any relevant background, constraints, domain]

Output Format:
[e.g., bullet list, JSON, summary]

User Input:
[Original user input or example]
```
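If you want to post-process the result programmatically, the sections can be split back out with the standard library. The sketch below assumes the headings appear exactly as listed above; `split_sections` is an illustrative helper, not a library function.

```python
import re

SECTION_NAMES = ["Role", "Task", "Instructions", "Context",
                 "Output Format", "User Input"]

def split_sections(optimized: str) -> dict:
    """Split an optimized prompt into a {section: body} dict,
    keyed on the 'Role:' / 'Task:' / ... headings."""
    pattern = r"^(%s):\s*$" % "|".join(re.escape(n) for n in SECTION_NAMES)
    sections, current = {}, None
    for line in optimized.splitlines():
        match = re.match(pattern, line.strip())
        if match:
            current = match.group(1)
            sections[current] = []
        elif current is not None:
            sections[current].append(line)
    return {name: "\n".join(body).strip() for name, body in sections.items()}

example = """Role:
A helpful planning assistant
Task:
Draft a course plan
"""
print(split_sections(example)["Role"])  # A helpful planning assistant
```

A heading that does not appear in the LLM's answer simply produces no key, so check membership before indexing.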
🧪 Quick Test (Optional)
You can run a built-in test script after installation to validate that everything works:
```bash
python3 test_import.py
```
This will check:
- ✅ Import is working
- ✅ API key is detected
- ✅ LLM call returns a result
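The first two checks can also be done by hand with the standard library, without spending an LLM call. `sanity_check` below is an illustrative helper, not part of the package:

```python
import importlib.util
import os

def sanity_check(module="leo_prompt_optimizer", env_var="GROQ_API_KEY"):
    """Report whether the package is importable and the key is visible."""
    return {
        "import_ok": importlib.util.find_spec(module) is not None,
        "key_ok": bool(os.getenv(env_var)),
    }

print(sanity_check())
```

If `import_ok` is `False`, you are likely in a different virtual environment from the one you installed into; if `key_ok` is `False`, revisit the setup section above.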
🧯 Common Errors & Fixes
| Error | Solution |
|---|---|
| Missing `GROQ_API_KEY` | Ensure it's either in `.env` and loaded with `load_dotenv()`, or passed via `set_groq_api_key()` |
| Invalid model or 403 | The default model might be deprecated. Check Groq's model list |
| `ModuleNotFoundError` | Ensure `leo-prompt-optimizer` is installed in the current environment |
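The missing-key error typically only surfaces once the first LLM call is made. A defensive pattern is to check the key up front in your own code; `require_groq_key` below is a hypothetical helper, not part of the library:

```python
import os

def require_groq_key() -> str:
    # Hypothetical helper: fail fast with an actionable message instead
    # of a deep stack trace from inside the LLM client.
    key = os.getenv("GROQ_API_KEY")
    if not key:
        raise RuntimeError(
            "GROQ_API_KEY not set: put it in .env and call load_dotenv(), "
            "or call set_groq_api_key() before optimize_prompt()."
        )
    return key
```

Calling this once at startup moves the failure to the earliest possible point, where the fix is obvious.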
💡 Why Use It?
Prompt quality is critical when building with LLMs.
leo-prompt-optimizer helps you:
- ✅ Make prompts explicit and structured
- 🚫 Reduce hallucinations
- 🔁 Increase consistency and repeatability
- 🧱 Standardize prompt formats across your stack
📄 License
MIT © 2025 Léonard Baesen-Wagner
File details
Details for the file leo_prompt_optimizer-0.1.4.tar.gz.
File metadata
- Download URL: leo_prompt_optimizer-0.1.4.tar.gz
- Upload date:
- Size: 5.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `a99eb981659b554721e6c21520f04bc50474fbc596c0b57d3994fcf92f41ee13` |
| MD5 | `0a4a68301466df91d50e3b7c3c1d91ee` |
| BLAKE2b-256 | `e99f48e6050203a3f42b3074efb8260e0765fc51e3f9b17cf16884dbd460c114` |
File details
Details for the file leo_prompt_optimizer-0.1.4-py3-none-any.whl.
File metadata
- Download URL: leo_prompt_optimizer-0.1.4-py3-none-any.whl
- Upload date:
- Size: 6.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `3de8910031dabb62a8c5f6bd853becbea1e4be78458da9eb83ec7b7a30d2fae7` |
| MD5 | `bdc74eb4076fc2dcab21568c1470a147` |
| BLAKE2b-256 | `4abec0799dbf4e9bc94a946464e4e7d779b0af8441388a878619cce438cba7b8` |