
PromptWeaver streamlines prompt development and management in Generative AI workflows

Project description

PromptWeaver is a Python library that streamlines prompt development and management in Generative AI workflows. It decouples prompts from Python scripts, enhancing portability, maintainability, and scalability for developers working with multiple LLMs or complex prompting workflows.

This project is currently under active development and may undergo significant changes. We welcome your feedback and contributions!

Features

  • Separate prompts from code for better modularity.
  • Supports multimodal input (text, images, audio, video).
  • Integration with large language models such as Gemini.
  • YAML-based prompt configuration with sample and default values.
  • Extensible to support other LLM APIs.

Installation

PromptWeaver can be easily installed using pip:

pip install promptweaver

Note: PromptWeaver requires Python 3.8 or higher.

Usage

Start by creating a .yml.j2 PromptWeaver template.

You can find template examples in our promptweaver gallery.

name: Hello World
description: A quickstart prompt showcasing how to answer a simple user message using gemini.
model:
  model_name: gemini-1.5-flash-001
  generation_config:
    temperature: 0.3
    max_output_tokens: 250
  system_instruction: You are an AI model trained to answer questions. Be kind and objective in your answers.
# ---
variables:
  user_message:
    sample: Hi!
# ---
user:
  - text: {{ user_message }}
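
The .yml.j2 extension indicates a Jinja2-templated YAML file: variable placeholders are substituted (here using the sample values defined above) before the YAML is parsed. As a rough illustration of that idea only, not PromptWeaver's actual loading code, the same template could be rendered and parsed with plain jinja2 and PyYAML:

import yaml
from jinja2 import Template

# Illustrative sketch: render the Jinja2 template with the sample value,
# then parse the result as YAML. PromptWeaver's own loader may work differently.
with open("samples/example.yml.j2") as f:
    template_source = f.read()

rendered = Template(template_source).render(user_message="Hi!")  # sample value
config = yaml.safe_load(rendered)

print(config["model"]["model_name"])  # gemini-1.5-flash-001
print(config["user"][0]["text"])      # Hi!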

Now you can call one of the supported LLM clients using the PromptWeaver template.

from promptweaver.core.prompt_template import PromptConfig
from promptweaver.clients.gemini.gemini_client import GeminiClient

# Initialize the Gemini client with your Google Cloud project and location
gemini_client = GeminiClient(project="your_project", location="your_location")

# Load the prompt configuration, filling variables with their sample values
example_prompt = PromptConfig.from_file_with_sample_values("samples/example.yml.j2")

# Generate content and print the model's reply
response = gemini_client.generate_content(example_prompt)
print(response.text)
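
PromptWeaver is also designed to be extensible to other LLM APIs. The usage above implies a simple client contract: a generate_content method that takes a PromptConfig and returns an object exposing a text attribute. The sketch below is purely hypothetical (the EchoClient and EchoResponse names are illustrative assumptions, not part of the documented API) and only shows what a minimal custom client following that contract could look like:

from dataclasses import dataclass

from promptweaver.core.prompt_template import PromptConfig


@dataclass
class EchoResponse:
    text: str


class EchoClient:
    """Toy client that mirrors the generate_content interface used above."""

    def generate_content(self, prompt: PromptConfig) -> EchoResponse:
        # A real client would send the rendered prompt parts to its LLM API
        # and wrap the model's reply; this stub just returns a placeholder.
        return EchoResponse(text="[echo] prompt received")


echo_prompt = PromptConfig.from_file_with_sample_values("samples/example.yml.j2")
print(EchoClient().generate_content(echo_prompt).text)  # [echo] prompt received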

Contributing

We welcome contributions! Please read our contributing guide for details on how to get started. The project can be found on GitHub.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

promptweaver-0.1.12.tar.gz (11.5 kB)

Built Distribution

promptweaver-0.1.12-py3-none-any.whl (17.7 kB)

File details

Details for the file promptweaver-0.1.12.tar.gz.

File metadata

  • Download URL: promptweaver-0.1.12.tar.gz
  • Upload date:
  • Size: 11.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: pdm/2.18.2 CPython/3.10.12 Linux/6.8.0-1014-azure

File hashes

Hashes for promptweaver-0.1.12.tar.gz

  • SHA256: cba09b3a2f71122b14c2c60435e7cdd8111310c4f6ebeecfca03d9cf076f4ea2
  • MD5: 4fb51f69a9daad0dff45042138b33e4d
  • BLAKE2b-256: 109ec0abe65fdb20d5727fba222216d2051f1366e1e195498e04d77d5a46b5d7

See more details on using hashes here.

File details

Details for the file promptweaver-0.1.12-py3-none-any.whl.

File metadata

  • Download URL: promptweaver-0.1.12-py3-none-any.whl
  • Upload date:
  • Size: 17.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: pdm/2.18.2 CPython/3.10.12 Linux/6.8.0-1014-azure

File hashes

Hashes for promptweaver-0.1.12-py3-none-any.whl

  • SHA256: 3575403a70a5c88b35edb9cd44e4da415ca32942357bdb82b9b80b12df3b304d
  • MD5: b8e23ebeeb402b8a6ac7f329865c4454
  • BLAKE2b-256: add50df97ffa24c187ad3542954ac53e53be455c3132ea212a4a30ba99f88824

See more details on using hashes here.
