PromptWeaver streamlines prompt development and management in Generative AI workflows

Project description

PromptWeaver is a Python library that streamlines prompt development and management in Generative AI workflows. It decouples prompts from Python scripts, enhancing portability, maintainability, and scalability for developers working with multiple LLMs or complex prompting workflows.

This project is currently under active development and may undergo significant changes. We welcome your feedback and contributions!

Features

  • Separate prompts from code for better modularity.
  • Supports multimodal input (text, images, audio, video).
  • Integration with large language models such as Gemini.
  • YAML-based prompt configuration with sample and default values.
  • Extensible to support other LLM APIs.

Installation

PromptWeaver can be easily installed using pip:

pip install promptweaver

Note: PromptWeaver requires Python 3.8 or higher.
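
The distribution files listed on this page correspond to version 0.1.2, so if you want to follow along with exactly that release you can pin it:

pip install "promptweaver==0.1.2"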

Usage

First, create a .yml.j2 promptweaver template.

You will find template examples in our promptweaver gallery.

name: Hello World
description: A quickstart prompt showcasing how to answer a simple user message using gemini.
model:
  model_name: gemini-1.5-flash-001
  generation_config:
    temperature: 0.3
    max_output_tokens: 250
  system_instruction: You are an AI model trained to answer questions. Be kind and objective in your answers.
# ---
variables:
  user_message:
    sample: Hi!
# ---
user:
  - text: {{ user_message }}
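
Because the file is a Jinja2 template, the {{ user_message }} placeholder is substituted before the YAML is read. For illustration only (this is not PromptWeaver's own loader), the short script below renders the placeholder with the sample value declared above and parses the result with PyYAML, so you can see the configuration the template produces. It assumes the jinja2 and PyYAML packages are installed and that the template is saved as samples/example.yml.j2:

import yaml
from jinja2 import Template

# Illustration only: substitute the {{ user_message }} placeholder with the
# sample value declared in the template, then parse the rendered text as YAML.
raw = open("samples/example.yml.j2", encoding="utf-8").read()
rendered = Template(raw).render(user_message="Hi!")
config = yaml.safe_load(rendered)

print(config["model"]["model_name"])  # gemini-1.5-flash-001
print(config["user"])                 # [{'text': 'Hi!'}]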

Now you can call one of the supported LLM clients using the promptweaver template.

from promptweaver.core.prompt_template import PromptConfig
from promptweaver.clients.gemini.gemini_client import GeminiClient

# Initialize the Gemini client
gemini_client = GeminiClient(project="your_project", location="your_location")

# Load the prompt configuration
example_prompt = PromptConfig.from_file_with_sample_values("samples/example.yml.j2")

# Generate content
response = gemini_client.generate_content(example_prompt)
print(response.text)
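
The feature list above notes that PromptWeaver is extensible to other LLM APIs. As a hedged sketch (not the library's actual extension interface), any object that exposes a generate_content method accepting a PromptConfig and returning something with a .text attribute could be swapped in for the Gemini client above. EchoClient and EchoResponse below are hypothetical names used purely for illustration:

from dataclasses import dataclass

from promptweaver.core.prompt_template import PromptConfig


@dataclass
class EchoResponse:
    # Mirrors the .text attribute read from the Gemini response above.
    text: str


class EchoClient:
    # Hypothetical stand-in: a real adapter would translate the loaded prompt
    # configuration (model settings, system instruction, user parts) into the
    # target LLM API's request format instead of returning a fixed string.
    def generate_content(self, prompt_config: PromptConfig) -> EchoResponse:
        return EchoResponse(text="(echo) no model was called")


example_prompt = PromptConfig.from_file_with_sample_values("samples/example.yml.j2")
print(EchoClient().generate_content(example_prompt).text)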

Contributing

We welcome contributions! Please read our contributing guide for details on how to get started. The project can be found on GitHub.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

promptweaver-0.1.2.tar.gz (12.0 kB)

Uploaded Source

Built Distribution

promptweaver-0.1.2-py3-none-any.whl (19.3 kB)

Uploaded Python 3

File details

Details for the file promptweaver-0.1.2.tar.gz.

File metadata

  • Download URL: promptweaver-0.1.2.tar.gz
  • Upload date:
  • Size: 12.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: pdm/2.18.2 CPython/3.12.6 Darwin/23.6.0

File hashes

Hashes for promptweaver-0.1.2.tar.gz

  • SHA256: a6f499e7e1f5db635e4eb02c4cd45469cc1b8844a7ba5fe536d274e998d8de98
  • MD5: 4cf4f6f7eab1a4934c14fa86dd34dc34
  • BLAKE2b-256: 08664a5ba3c699d1666f8ede7feed593d6794018b52299327764d5c51b2c7aaf

File details

Details for the file promptweaver-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: promptweaver-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 19.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: pdm/2.18.2 CPython/3.12.6 Darwin/23.6.0

File hashes

Hashes for promptweaver-0.1.2-py3-none-any.whl

  • SHA256: b6179ba40a71b15c8dc9eb2482c2ea5f860dd04d871c546296c7aecb86a4d8fd
  • MD5: f6abea56897efece4c3d14d18f4c38a6
  • BLAKE2b-256: decb02b9820a32da3aee126da6cc3ac6d9dd05497b9355df677e98e8b2f4ddc0
