A Python library to optimize prompt drafts using LLMs

Project description

🧠 leo-prompt-optimizer

leo-prompt-optimizer is a Python library that helps developers optimize raw prompt drafts into structured, high-performance prompts for large language models (LLMs).

It leverages open-source models (such as LLaMA 3 or Mixtral) via the Groq API, making it fast, affordable, and production-ready.


🚀 Features

  • 🛠️ Refines messy or vague prompts into structured, effective ones
  • 🧠 Follows a 9-step prompt engineering framework
  • 📦 Supports contextual optimization (with user input & LLM output)
  • ⚡ Uses blazing-fast open-source LLMs via Groq
  • 🔐 Secure API key handling with .env

📦 Installation

pip install leo-prompt-optimizer

🔧 Setup

  1. Create a .env file at the root of your project:
GROQ_API_KEY=your_groq_api_key_here
  2. Install the package (if you haven't already):
pip install leo-prompt-optimizer
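The .env step above can be sketched with the standard library alone. This is a minimal, hypothetical loader for illustration; the package itself may use python-dotenv or similar internally:

```python
import os
from pathlib import Path

def load_env(path: str = ".env") -> None:
    """Minimal .env loader: export KEY=VALUE lines as environment variables."""
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        # Skip blanks and comments; split only on the first '='
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

# Usage: load_env()  -> os.getenv("GROQ_API_KEY") now returns your key
```

`setdefault` means an already-exported GROQ_API_KEY in your shell wins over the .env file, which is the usual convention.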

✍️ Usage Example

from leo_prompt_optimizer.optimizer import optimize_prompt

draft = "I want to understand user feedback better. Can you help me ask the right questions?"

user_input = "Users say the interface feels confusing. I want to understand what they mean exactly."

llm_output = "You could ask: 'Which parts feel confusing?' or 'How could it be more intuitive?'"

optimized = optimize_prompt(draft, user_input, llm_output)

print(optimized)

📘 Output Format

The optimized prompt follows a structured format like:

Role:
[Define the LLM's persona]

Task:
[Clearly state the specific objective]

Instructions:
* Step-by-step subtasks

Context:
[Any relevant background, constraints, domain]

Output Format:
[e.g., bullet list, JSON, summary]

User Input:
[Original user input or example]
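Because the optimized prompt uses labeled sections, it is easy to split apart programmatically. The helper below is hypothetical (not part of the library's API), assuming the section headers shown above appear on their own lines:

```python
SECTIONS = ["Role", "Task", "Instructions", "Context", "Output Format", "User Input"]

def parse_optimized(text: str) -> dict:
    """Split an optimized prompt into a dict keyed by section name."""
    result = {}
    current = None
    for line in text.splitlines():
        stripped = line.strip()
        # A line like "Task:" starts a new section
        if stripped.endswith(":") and stripped.rstrip(":") in SECTIONS:
            current = stripped.rstrip(":")
            result[current] = []
        elif current is not None:
            result[current].append(line)
    return {k: "\n".join(v).strip() for k, v in result.items()}
```

This lets you, for example, feed only the "Instructions" section into a downstream template.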

💡 Why Use It?

Prompt quality is critical when building with LLMs. leo-prompt-optimizer helps you:

  • Make prompts explicit and usable across apps
  • Reduce hallucination
  • Increase repeatability and reliability

📄 License

MIT © 2025 Leonard Baesen

Project details


Download files

Download the file for your platform.

Source Distribution

leo_prompt_optimizer-0.1.1.tar.gz (2.7 kB)

Uploaded Source

Built Distribution

leo_prompt_optimizer-0.1.1-py3-none-any.whl (2.3 kB)

Uploaded Python 3

File details

Details for the file leo_prompt_optimizer-0.1.1.tar.gz.

File metadata

  • Download URL: leo_prompt_optimizer-0.1.1.tar.gz
  • Upload date:
  • Size: 2.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.3

File hashes

Hashes for leo_prompt_optimizer-0.1.1.tar.gz
Algorithm Hash digest
SHA256 1e1620a275e19f24a49b9b29db0080119e6ce8eb4050459c2488bb038a0ae749
MD5 3702c71ce10f1f58e0a227766f6a28fb
BLAKE2b-256 e2217abf4f1ea1ddf0cb76975ace4b5b59ffcf158470adeb152bcb3cb6519315
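To check a downloaded file against the published digests above, the standard-library hashlib suffices (a generic sketch, not specific to this package):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large files never load fully into memory
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the result with the SHA256 value listed above, e.g.:
# sha256_of("leo_prompt_optimizer-0.1.1.tar.gz") == "1e1620a2..."
```

pip can also enforce this automatically via `--require-hashes` in a requirements file.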

File details

Details for the file leo_prompt_optimizer-0.1.1-py3-none-any.whl.

File hashes

Hashes for leo_prompt_optimizer-0.1.1-py3-none-any.whl
Algorithm Hash digest
SHA256 8ba79c57ec427cb95be452dc7f12f9735c1feb07c62628ade0e101e3e9df18b6
MD5 0545f995eec9cdfac001283c3a1f54c4
BLAKE2b-256 854166f0296f96169a391b899d0f43b622985f6715ac2e168a299c2e9495e706
