
yapr (Yet Another Prompt)

Installation

pip install yapr

Usage

A complete chat example

from yapr import LLMProvider, RoleThread

# Create an LLM provider from the API keys found in the current system env vars
llm_provider = LLMProvider.from_env()

# Create a new role-based chat thread
thread = RoleThread()
thread.post(role="user", msg="How are you?")

# Chat with the LLM, store the prompt data in the namespace "foo"
response = llm_provider.chat(thread, namespace="foo")

# Add the response message to the thread
thread.add_msg(response.msg)

# Ask for a structured response
from pydantic import BaseModel

class Foo(BaseModel):
    bar: str
    baz: int

thread.post(role="user", msg="Given the {...} can you return that in JSON?")

# Chat with the LLM, requiring that the output be parsable into a Foo object
response = llm_provider.chat(thread, namespace="foo", response_schema=Foo)

# Get the parsed response
foo_parsed = response.parsed
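The structured-response step above hinges on the model's JSON reply validating against the pydantic schema. A minimal sketch of what that parsing amounts to, independent of yapr (the raw JSON string here is a made-up example payload, not real model output):

```python
import json

from pydantic import BaseModel

class Foo(BaseModel):
    bar: str
    baz: int

# Simulate a raw JSON reply from the LLM (made-up example payload)
raw_reply = '{"bar": "hello", "baz": 42}'

# Validate the reply against the schema; pydantic raises a validation
# error if fields are missing or have the wrong types
foo = Foo(**json.loads(raw_reply))
```

If the reply does not match the schema, validation fails rather than silently yielding a malformed object, which is what makes `response_schema` useful for downstream code.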

