
A Python library to optimize prompt drafts using LLMs


🧠 leo-prompt-optimizer

leo-prompt-optimizer is a Python library that helps developers optimize raw LLM prompts into structured, high-performance instructions using real LLM intelligence.

It leverages open-source models via the Groq API (such as Mixtral or LLaMA 3) and also supports OpenAI, making it fast, flexible, and production-ready.


🚀 Features

  • 🛠️ Refines vague, messy, or unstructured prompts
  • 🧠 Follows a 9-step prompt engineering framework
  • 🧩 Supports contextual optimization (with user input & LLM output)
  • 🔁 Works with both Groq and OpenAI
  • ⚡ Blazing-fast open models via Groq
  • 🔐 Secure API key management with .env or helper function
  • 🎛️ Lets you choose the model (gpt-3.5-turbo, mixtral-8x7b, llama3, etc.)

📦 Installation

pip install leo-prompt-optimizer

🔧 Setup: API Keys

You can provide your API key in two ways:

✅ Option A: .env file (recommended)

At the root of your project, create a .env file:

GROQ_API_KEY=sk-your-groq-key
or
OPENAI_API_KEY=sk-your-openai-key

Then, in your Python script:

from dotenv import load_dotenv
load_dotenv()  # 👈 Required to load the API keys from .env
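The library reads these variables from the process environment. Its actual lookup logic isn't shown here, but a minimal sketch of how such detection could work, using only the standard library (detect_provider is a hypothetical helper name, not part of the package's API), looks like this:

```python
import os

def detect_provider():
    """Return which provider has an API key set, preferring Groq.

    Illustrative sketch only; leo-prompt-optimizer's internals may differ.
    """
    if os.getenv("GROQ_API_KEY"):
        return "groq"
    if os.getenv("OPENAI_API_KEY"):
        return "openai"
    raise RuntimeError(
        "No API key found: set GROQ_API_KEY or OPENAI_API_KEY in .env "
        "or via the set_*_api_key helpers."
    )
```

Either key on its own is enough; if both are set, a Groq-first preference matches the library's default provider.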

✅ Option B: Set programmatically

from leo_prompt_optimizer import set_groq_api_key, set_openai_api_key

set_groq_api_key("sk-your-groq-key")
set_openai_api_key("sk-your-openai-key")

✍️ Usage Example

from dotenv import load_dotenv
load_dotenv()  # Only needed if using .env for API keys

from leo_prompt_optimizer import optimize_prompt, set_groq_api_key, set_openai_api_key

# Optional: Set API key manually (Groq or OpenAI)
# set_openai_api_key("sk-...")
# set_groq_api_key("sk-...")

optimized = optimize_prompt(
    prompt_draft="[YOUR PROMPT]",
    user_input="[POTENTIAL INPUT EXAMPLE]", # Optional
    llm_output="[POTENTIAL OUTPUT EXAMPLE]", # Optional
    provider="[YOUR PROVIDER]",               # "groq" (default) or "openai"
    model="[YOUR MODEL]",            # Optional: model choice based on your provider (e.g. "gpt-4", "llama3-70b", etc.)
    base_url="[YOUR BASE_URL]"       # Optional: if you have a specific base URL
)
)

print(optimized)

🧠 user_input and llm_output are optional but helpful when refining an existing prompt flow. 🎛️ You can also specify the provider (groq or openai) and the exact model you want.
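Each optimization is itself an LLM call, so re-optimizing the same draft repeatedly costs tokens. A generic memoization wrapper (a sketch, not part of the library's API; `cached` is a hypothetical name) can be layered on top of optimize_prompt:

```python
import functools
import hashlib
import json

def cached(optimizer):
    """Memoize an optimizer function by its keyword arguments,
    so identical drafts are only sent to the LLM once."""
    cache = {}

    @functools.wraps(optimizer)
    def wrapper(**kwargs):
        # Hash the full argument set so provider/model changes miss the cache.
        key = hashlib.sha256(
            json.dumps(kwargs, sort_keys=True).encode()
        ).hexdigest()
        if key not in cache:
            cache[key] = optimizer(**kwargs)
        return cache[key]

    return wrapper

# Usage (assuming optimize_prompt is imported as shown above):
# optimize = cached(optimize_prompt)
# optimize(prompt_draft="...", provider="groq")
```

Because the cache key covers every argument, changing the provider or model still triggers a fresh optimization.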


📘 Output Format

The returned optimized prompt follows a structured format:

Role:
[Define the LLM's persona]

Task:
[Clearly state the specific objective]

Instructions:
* Step-by-step subtasks

Context:
[Any relevant background, constraints, domain]

Output Format:
[e.g., bullet list, JSON, summary]

User Input:
[Original user input or example]
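For instance, an optimized version of a draft like "summarize this article" might come back filled in as follows (an illustrative example, not actual library output):

```text
Role:
You are a concise technical summarizer.

Task:
Summarize the provided article in plain language.

Instructions:
* Identify the 3-5 main points.
* Remove jargon; define any unavoidable terms.
* Keep the summary under 150 words.

Context:
The reader is a busy engineer skimming for relevance.

Output Format:
Bullet list, one point per bullet.

User Input:
[The article text]
```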

🧪 Quick Test (Optional)

python3 test_import.py

This will check:

  • ✅ Import works
  • ✅ API keys are detected
  • ✅ LLM returns optimized result

🧯 Common Errors & Fixes

  • Missing GROQ_API_KEY: Ensure it's in .env and loaded with load_dotenv(), or passed via set_groq_api_key()
  • Missing OPENAI_API_KEY: Same as above, but with set_openai_api_key()
  • Invalid model or 403: The model may be deprecated or restricted; try another model or check the Groq models list
  • ModuleNotFoundError: Ensure leo-prompt-optimizer is installed in the right Python environment

💡 Why Use It?

Prompt quality is critical when building with LLMs.

leo-prompt-optimizer helps you:

  • ✅ Make prompts explicit and structured
  • 🚫 Reduce hallucinations
  • 🔁 Increase consistency and reuse
  • 🧱 Standardize prompt formats across your stack

