
Project description

ezprompt

An easy-to-use library for creating and sending prompts to various LLMs.

Features (Planned)

  • Simple prompt definition using Markdown and Jinja templating.
  • Automatic input validation against prompt templates.
  • Model selection from a wide range of providers.
  • Context length validation with suggestions for suitable models.
  • Cost estimation before sending prompts.
  • Up-to-date model information (context size, pricing).
  • Asynchronous support for concurrent prompt execution.
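The planned cost-estimation feature comes down to simple token arithmetic. A minimal sketch of the idea (the function name and per-million-token prices are illustrative, not part of the ezprompt API):

```python
def estimate_cost(prompt_tokens: int, completion_tokens: int,
                  input_price_per_1m: float, output_price_per_1m: float) -> float:
    """Estimate request cost in USD from token counts and per-1M-token prices."""
    return (prompt_tokens * input_price_per_1m
            + completion_tokens * output_price_per_1m) / 1_000_000

# e.g. 1,000 prompt tokens and 500 completion tokens
# at $0.50 / $1.50 per million tokens
cost = estimate_cost(1000, 500, 0.50, 1.50)  # 0.00125
```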

Installation

pip install ez-prompt

Basic Usage

import asyncio
from ezprompt import Prompt

async def main():
    # Define a prompt template (details TBD)
    prompt_template = """
    Translate the following text from {{ source_lang }} to {{ target_lang }}:

    {{ text }}
    """

    # Initialize the prompt
    my_prompt = Prompt(
        template=prompt_template,
        inputs={"source_lang": "English", "target_lang": "French", "text": "Hello, world!"},
        model="gpt-3.5-turbo"
    )

    # Check for potential issues and estimate cost
    issues, cost = await my_prompt.check()
    if issues:
        print(f"Issues found: {issues}")
    else:
        print(f"Estimated cost: ${cost:.6f}")

        # Send the prompt
        response = await my_prompt.send()
        print(f"Model response: {response}")

if __name__ == "__main__":
    # Required environment variables (e.g., OPENAI_API_KEY)
    # Set them according to the model provider you use.
    asyncio.run(main())
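The planned asynchronous support means several prompts can be awaited concurrently with `asyncio.gather`. A sketch under the assumption that each prompt exposes an awaitable `send()` as in the example above; the `FakePrompt` stub here is purely illustrative and not part of the library:

```python
import asyncio

class FakePrompt:
    """Illustrative stand-in for an object with an awaitable send()."""
    def __init__(self, text: str):
        self.text = text

    async def send(self) -> str:
        await asyncio.sleep(0)  # yield control, as a real network call would
        return f"response to: {self.text}"

async def send_all(prompts):
    # Run all sends concurrently; results come back in input order.
    return await asyncio.gather(*(p.send() for p in prompts))

responses = asyncio.run(send_all([FakePrompt("hi"), FakePrompt("bye")]))
```

With real network-bound prompts, `gather` overlaps the waiting time instead of sending them one after another.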

Contributing

Contributions are welcome! Please open an issue or submit a pull request.

License

This project is licensed under the MIT License.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ez_prompt-0.1.2.tar.gz (9.1 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

ez_prompt-0.1.2-py3-none-any.whl (11.0 kB)

Uploaded Python 3

File details

Details for the file ez_prompt-0.1.2.tar.gz.

File metadata

  • Download URL: ez_prompt-0.1.2.tar.gz
  • Upload date:
  • Size: 9.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.2 CPython/3.12.3 Windows/10

File hashes

Hashes for ez_prompt-0.1.2.tar.gz
Algorithm Hash digest
SHA256 9e9e1f88dcb1a0162256346b317b4d796236db63055bca1c34e29974ce58d008
MD5 18ead61710dfcc1c0bbb7851e08f3a44
BLAKE2b-256 e19b9e77c632260cbdc1e22a286bd576f0eb641a6a8e87501c27944c099235d7

See more details on using hashes here.
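To check a downloaded archive against the SHA256 digest above, `hashlib` from the standard library is enough (the file path is illustrative):

```python
import hashlib

def sha256_hexdigest(path: str) -> str:
    """Stream a file through SHA256 and return its hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the published digest, e.g.:
# sha256_hexdigest("ez_prompt-0.1.2.tar.gz") ==
#     "9e9e1f88dcb1a0162256346b317b4d796236db63055bca1c34e29974ce58d008"
```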

File details

Details for the file ez_prompt-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: ez_prompt-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 11.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.2 CPython/3.12.3 Windows/10

File hashes

Hashes for ez_prompt-0.1.2-py3-none-any.whl
Algorithm Hash digest
SHA256 704318f452a037cd7bcde2b31dc5c0f6bab805d615f2a6bab6871b4f89a0811e
MD5 c3178dbae1d18d94647edfb6fc3491d3
BLAKE2b-256 8b216e065723875e0a333e11493cae101b1add4ce679fa7404237884e7cd217e

See more details on using hashes here.
