
A Python library to optimize prompt drafts using LLMs


🧠 leo-prompt-optimizer

leo-prompt-optimizer is a Python library that helps developers optimize raw LLM prompts into structured, high-performance instructions using real LLM intelligence.

It leverages open-source models served through the Groq API (such as Mixtral or Llama 3), making it fast, affordable, and production-ready.


🚀 Features

  • 🛠️ Refines vague, messy, or unstructured prompts
  • 🧠 Follows a 9-step prompt engineering framework
  • 🧩 Supports contextual optimization (with user input & LLM output)
  • ⚡ Runs on blazing-fast open-source LLMs via Groq
  • 🔐 Secure API key management with .env or helper function

📦 Installation

pip install leo-prompt-optimizer

🔧 Setup: API Key

You can provide your Groq API key in two ways:

✅ Option A: .env file (recommended)

At the root of your project:

GROQ_API_KEY=sk-your-api-key-here

Then import and use it directly; the key is picked up from the environment:

from leo_prompt_optimizer import optimize_prompt
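For reference, loading a .env file boils down to reading KEY=VALUE lines into the process environment. The library presumably relies on python-dotenv or similar; this stand-alone parser is only an illustration of the mechanism, not its actual loader:

```python
import os
import tempfile

def load_env_file(path):
    """Parse simple KEY=VALUE lines from a .env file into os.environ."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            # Skip blanks, comments, and malformed lines
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Like python-dotenv's default, don't override existing variables
            os.environ.setdefault(key.strip(), value.strip())

# Demo with a throwaway .env file:
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as fh:
    fh.write("GROQ_API_KEY=sk-your-api-key-here\n")
    env_path = fh.name

os.environ.pop("GROQ_API_KEY", None)  # start clean for the demo
load_env_file(env_path)
print(os.environ["GROQ_API_KEY"])  # sk-your-api-key-here
```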

✅ Option B: Set programmatically

Useful in notebooks, scripts, or cloud functions:

from leo_prompt_optimizer import set_groq_api_key, optimize_prompt

set_groq_api_key("sk-your-api-key")

✍️ Usage Example

from leo_prompt_optimizer import optimize_prompt, set_groq_api_key

set_groq_api_key("sk-your-api-key")

draft = "I want to generate a structured planning for a GenAI course with communication adapted to enrolled members."

optimized = optimize_prompt(draft)
print(optimized)
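Any network-backed call can fail transiently. A generic retry wrapper is a common pattern here; nothing below is library-specific, and `optimize_prompt` is just one callable you might pass in (wrapped in a lambda to bind its argument):

```python
import time

def with_retries(fn, attempts=3, delay=0.0):
    """Call fn(), retrying up to `attempts` times on any exception."""
    last_exc = None
    for _ in range(attempts):
        try:
            return fn()
        except Exception as exc:
            last_exc = exc
            time.sleep(delay)
    raise last_exc

# Demo with a callable that fails twice before succeeding:
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(with_retries(flaky))  # ok
```

In real use you would call something like `with_retries(lambda: optimize_prompt(draft), delay=1.0)`.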

📘 Output Format

The returned optimized prompt follows a structured format:

Role:
[Define the LLM's persona]

Task:
[Clearly state the specific objective]

Instructions:
* Step-by-step subtasks

Context:
[Any relevant background, constraints, domain]

Output Format:
[e.g., bullet list, JSON, summary]

User Input:
[Original user input or example]
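Because the sections carry fixed labels, the returned string is straightforward to post-process. A minimal sketch, assuming the headings appear verbatim, one per line, exactly as shown above (the library itself returns plain text; this parser is not part of its API):

```python
import re

SECTIONS = ["Role", "Task", "Instructions", "Context", "Output Format", "User Input"]

def parse_optimized(text):
    """Split an optimized prompt into a dict keyed by section heading."""
    pattern = r"^(%s):\s*$" % "|".join(re.escape(s) for s in SECTIONS)
    parts = {}
    current = None
    for line in text.splitlines():
        m = re.match(pattern, line.strip())
        if m:
            current = m.group(1)
            parts[current] = []
        elif current is not None:
            parts[current].append(line)
    return {k: "\n".join(v).strip() for k, v in parts.items()}

sample = """Role:
A senior curriculum designer.

Task:
Draft a GenAI course plan.
"""
print(parse_optimized(sample)["Task"])  # Draft a GenAI course plan.
```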

🧪 Quick Test (Optional)

You can run a built-in test script after installation to validate that everything works:

python3 test_import.py

This will check:

  • Import is working ✅
  • API key is detected ✅
  • LLM call returns a result ✅
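The first two checks are easy to reproduce offline. A stand-alone sketch (the real test_import.py may differ, and the third check requires a live API call, omitted here; `check_api_key` takes a mapping rather than reading os.environ so it is trivial to test):

```python
import importlib.util

def check_import(module_name):
    """True if the module can be found in the current environment."""
    return importlib.util.find_spec(module_name) is not None

def check_api_key(env):
    """True if a non-empty GROQ_API_KEY is present in the given mapping."""
    return bool(env.get("GROQ_API_KEY", "").strip())

print(check_import("json"))                       # True (stdlib stand-in)
print(check_api_key({"GROQ_API_KEY": "sk-x"}))    # True
print(check_api_key({}))                          # False
```

In practice you would call `check_import("leo_prompt_optimizer")` and `check_api_key(os.environ)`.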

🧯 Common Errors & Fixes

  • Missing GROQ_API_KEY: make sure it's in .env or set with set_groq_api_key()
  • Invalid model or 403: the default model may be deprecated; check Groq's model list
  • ModuleNotFoundError: ensure leo-prompt-optimizer is installed in the current environment

💡 Why Use It?

Prompt quality is critical when building with LLMs.

leo-prompt-optimizer helps you:

  • ✅ Make prompts explicit and structured
  • 🚫 Reduce hallucinations
  • 🔁 Increase consistency and repeatability
  • 🧱 Standardize prompt formats across your stack

📄 License

MIT © 2025 Léonard Baesen-Wagner



Download files


Source Distribution

leo_prompt_optimizer-0.1.2.tar.gz (5.4 kB)


Built Distribution


leo_prompt_optimizer-0.1.2-py3-none-any.whl (5.9 kB)


File details

Details for the file leo_prompt_optimizer-0.1.2.tar.gz.

File metadata

  • Download URL: leo_prompt_optimizer-0.1.2.tar.gz
  • Size: 5.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.3

File hashes

Hashes for leo_prompt_optimizer-0.1.2.tar.gz:

  • SHA256: 0d22e9fae63ccef3de445323707a3fe8a75a65b39ca3e04c560572299d4a0acd
  • MD5: d08fe50d66e746e0d313a65c0c4ee29e
  • BLAKE2b-256: 26f9a6fbc9664f53faa2ce1a30ecb8c582e88ac0327bcc944655d906f69972fb


File details

Details for the file leo_prompt_optimizer-0.1.2-py3-none-any.whl.

File hashes

Hashes for leo_prompt_optimizer-0.1.2-py3-none-any.whl:

  • SHA256: 6f73c43cbb287e2d5ef4101520d69e665b1b612cf1b934c3f48ad9cb41fa6f03
  • MD5: 784b149b5404f77a1d540de1a3e3412f
  • BLAKE2b-256: f649bf2a88f2c5856d6c2a1ef3dde601a72c4fb82ca7b81027d1197c7a263fa0

