
A Python library and CLI tool for refining prompts using various enhancement strategies.

Project description


Promptrefiner 🚀

Enhancing prompts with intelligent strategies for LLMs


🚀 Welcome to Promptrefiner

Helping you craft the perfect prompt for better LLM responses!

PromptRefiner is a lightweight Python library that helps you write better prompts for Large Language Models (LLMs) with minimal configuration. Many users struggle to craft prompts that yield the results they want.

PromptRefiner takes your input, applies a selected strategy, and returns an improved prompt that draws more specific and effective responses from LLMs. It does this by leveraging an LLM to refine your prompt according to predefined strategies, making high-quality responses easier to obtain.

Whether you're using a prompt for GPT-4, Claude, Mistral, or any other LLM, PromptRefiner ensures your input is well-structured for the best possible output.


✨ Key Features

Supports 100+ LLM Clients – Works with OpenAI, Anthropic, Hugging Face, and more!
Highly Customizable – Use different LLM clients per strategy or a single client for all.
Command-Line First – Quickly refine prompts from the CLI for rapid experimentation.
Extensible – Developers can create their own custom prompt refinement strategies.
Seamless Integration – Works effortlessly in Python applications or scripts.
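For a sense of what "extensible" means here, the sketch below shows a custom strategy conceptually: a named system prompt template paired with the user's raw prompt. The class name and methods are purely illustrative, not PromptRefiner's actual extension API.

```python
# Purely illustrative sketch: SocraticStrategy and build_messages are
# hypothetical names, not part of PromptRefiner's documented API.

class SocraticStrategy:
    """Refines a prompt so the LLM answers with guiding questions."""

    name = "socratic"
    system_prompt = (
        "Rewrite the user's prompt so the model responds with guiding "
        "questions instead of a direct answer."
    )

    def build_messages(self, user_prompt: str) -> list:
        # A strategy is essentially a system prompt template paired
        # with the raw user prompt.
        return [
            {"role": "system", "content": self.system_prompt},
            {"role": "user", "content": user_prompt},
        ]

messages = SocraticStrategy().build_messages("Tell me about AI")
```

Consult the project documentation for the real extension interface.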


📥 Installation

Install PromptRefiner using pip:

pip install promptrefiner

⚡ Quick Start

🔧 Using from the Command Line

Before using promptrefiner, set the following environment variables (Windows users should use set instead of export):

export PREFINER_API_KEY="your-api-key-here"
export PREFINER_MODEL="openai/gpt-4"  # Change based on your LLM model
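On Windows (cmd.exe), the equivalent of the export lines above is:

```shell
set PREFINER_API_KEY=your-api-key-here
set PREFINER_MODEL=openai/gpt-4
```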

...and you're ready to go:

promptrefiner --strategy fewshot "Tell me about AI"

🐍 Using in a Python Script

Set the PREFINER_API_KEY and PREFINER_MODEL environment variables before using PromptRefiner in your Python script.

from promptrefiner import PromptRefiner

prompt_refiner = PromptRefiner(strategies=["persona"])
refined_prompt = prompt_refiner.refine("Explain quantum mechanics.")
print(refined_prompt)
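If you prefer not to export variables in the shell, you can also set them from Python before constructing the refiner (replace the placeholder values with your real key and model):

```python
import os

# Set credentials programmatically; setdefault leaves any values already
# exported in the shell untouched.
os.environ.setdefault("PREFINER_API_KEY", "your-api-key-here")
os.environ.setdefault("PREFINER_MODEL", "openai/gpt-4")
```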

❓ Getting Help

List the available strategies, their aliases, and full usage details with the --help option.

(env) $ promptrefiner --help


🔍 How It Works

  1. You provide a prompt (e.g., "Tell me about AI").
  2. You select a strategy (e.g., "verbose" for a more detailed response).
  3. PromptRefiner applies that strategy's system prompt template.
  4. The result is sent to an LLM for refinement.
  5. The improved prompt is returned to you.

🚀 Under the hood: Each strategy is backed by a system prompt template that guides the LLM to refine the user’s input for better results.
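The five-step flow above can be sketched in a few lines. The strategy templates and the fake_llm stub below are stand-ins for illustration only; the real library sends the templated prompt to an actual LLM client.

```python
# Minimal sketch of the refine flow, with a stubbed LLM call.
# Template wording and fake_llm are illustrative, not the library's own.

STRATEGY_TEMPLATES = {
    "verbose": "Expand the following prompt with explicit detail requests:",
    "persona": "Rewrite the following prompt to assign the model a persona:",
}

def fake_llm(system: str, user: str) -> str:
    # Stand-in for a real LLM client: tags the prompt with the template's
    # first word instead of actually refining it.
    return f"[{system.split()[0].lower()}] {user}"

def refine(prompt: str, strategy: str) -> str:
    template = STRATEGY_TEMPLATES[strategy]   # step 3: pick the template
    return fake_llm(template, prompt)         # steps 4-5: call LLM, return

print(refine("Tell me about AI", "verbose"))
```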


🤔 Why Use PromptRefiner?

🔹 Improve prompt clarity & effectiveness – Get sharper, more relevant responses.
🔹 Save time – No need to manually tweak prompts for better results.
🔹 Optimized for developers & researchers – Quickly test different prompting strategies.
🔹 Fine-tune for different LLMs – Customize strategies for specific AI models.
🔹 Works for various use cases:

  • Chatbots & AI assistants
  • Content generation & summarization
  • Data extraction from LLMs
  • Code generation improvements

🚀 Join Us & Contribute!

We welcome contributors, feedback, and feature suggestions! 🚀

📌 GitHub Repo: darshit7/promptrefiner
📌 Documentation: Promptrefiner
📌 Report Issues & Ideas: Coming Soon

👥 Want to improve PromptRefiner? Open a GitHub issue or contribute a pull request! 🛠️


🚀 Refine your prompts. Supercharge your AI interactions. Try PromptRefiner today!

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

promptrefiner-1.0.0.tar.gz (11.6 kB)

Uploaded Source

Built Distribution


promptrefiner-1.0.0-py3-none-any.whl (16.0 kB)

Uploaded Python 3

File details

Details for the file promptrefiner-1.0.0.tar.gz.

File metadata

  • Download URL: promptrefiner-1.0.0.tar.gz
  • Upload date:
  • Size: 11.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for promptrefiner-1.0.0.tar.gz

  Algorithm    Hash digest
  SHA256       44d542063e78bdec33eabd9d8e1a0e9cb91b0cdd200ddc499d1bdcf10d032aaa
  MD5          f2be3558ff8a5dc598e4f47c2ac606da
  BLAKE2b-256  06bcc2ae8cae07df9076cf3966939fb148c62811bcd2e0fbf1b6aec5b7b41e93

See more details on using hashes here.
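To check a downloaded archive against the SHA256 digest listed above, you can use Python's standard hashlib module (the filename is assumed to be in your current directory):

```python
import hashlib

def sha256_of(path: str) -> str:
    # Stream the file in chunks so large archives aren't loaded into memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

expected = "44d542063e78bdec33eabd9d8e1a0e9cb91b0cdd200ddc499d1bdcf10d032aaa"
# After downloading, verify with:
# assert sha256_of("promptrefiner-1.0.0.tar.gz") == expected
```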

Provenance

The following attestation bundles were made for promptrefiner-1.0.0.tar.gz:

Publisher: pypi-publish.yml on darshit7/promptrefiner

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file promptrefiner-1.0.0-py3-none-any.whl.

File metadata

  • Download URL: promptrefiner-1.0.0-py3-none-any.whl
  • Upload date:
  • Size: 16.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for promptrefiner-1.0.0-py3-none-any.whl

  Algorithm    Hash digest
  SHA256       2173604a5c3e7dc167127eabdea3306310bcdbab6ffa4a40ed080057e8ea1716
  MD5          603c8abfba9732bbb1352cf4bca3f4f7
  BLAKE2b-256  b23671fb2d4c824a3beeabf8d7891fa806ca54254a0bac66e33e59e43ccb6f74

See more details on using hashes here.

Provenance

The following attestation bundles were made for promptrefiner-1.0.0-py3-none-any.whl:

Publisher: pypi-publish.yml on darshit7/promptrefiner

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
