yapr (Yet Another Prompt)

Installation

pip install yapr

Usage

Create an LLM provider from the API keys found in the current system environment variables

from yapr import LLMProvider, RoleThread

llm_provider = LLMProvider.from_env()
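Which environment variable names `from_env()` actually reads is not documented here; as a sketch only, with `OPENAI_API_KEY` and friends as illustrative assumptions, you can check what is visible to the process before constructing the provider:

```python
import os

# Hypothetical variable names -- the exact keys yapr looks for may differ.
CANDIDATE_KEYS = ["OPENAI_API_KEY", "ANTHROPIC_API_KEY", "GEMINI_API_KEY"]

def available_keys(env=os.environ):
    """Return the candidate key names that are set and non-empty."""
    return [k for k in CANDIDATE_KEYS if env.get(k)]

print(available_keys({"OPENAI_API_KEY": "sk-test"}))  # ['OPENAI_API_KEY']
```

A check like this gives a clearer error message than letting provider construction fail deep inside the library.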

Create a new role-based chat thread

thread = RoleThread()
thread.post(role="user", msg="How are you?")
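Conceptually (independent of yapr's internals), a role-based thread is just an ordered list of role/content messages, which is also the wire format OpenAI-style chat APIs expect. A minimal sketch:

```python
# A thread as a plain list of {"role", "content"} dicts.
thread_messages = []

def post(role, msg):
    """Append a message to the thread, preserving order."""
    thread_messages.append({"role": role, "content": msg})

post("user", "How are you?")
post("assistant", "Doing well!")
print(thread_messages)
```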

Chat with the LLM, storing the prompt data under the namespace "foo"

response = llm_provider.chat(thread, namespace="foo")

thread.add_msg(response.msg)

Ask for a structured response

from pydantic import BaseModel

class Foo(BaseModel):
    bar: str
    baz: int

thread.post(role="user", msg="Given the {...}, can you return that in JSON?")

response = llm_provider.chat(thread, namespace="foo", response_schema=Foo)
foo_parsed = response.parsed

assert isinstance(foo_parsed, Foo)
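Independent of yapr, this is the kind of validation a `response_schema` enables: the model's raw JSON text is parsed and type-checked against the pydantic model, so a malformed payload fails loudly instead of propagating bad data. A sketch using only pydantic and the standard library:

```python
import json
from pydantic import BaseModel

class Foo(BaseModel):
    bar: str
    baz: int

# Given the raw JSON text a model might return...
raw = '{"bar": "hello", "baz": 3}'
foo = Foo(**json.loads(raw))  # raises a ValidationError on a bad payload
print(foo.bar, foo.baz)  # hello 3
```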

Multimodal


Find a saved thread


Find a saved prompt


Just store prompts

from yapr import Prompt, RoleMessage, RoleThread

thread = RoleThread()

msg = {
    "role": "user",
    "content": [
        {
            "type": "text",
            "text": "What's in this image?",
        },
        {
            "type": "image_url",
            "image_url": {"url": "data:image/jpeg;base64,..."},
        }
    ]
}
role_message = RoleMessage.from_openai(msg)
thread.add_msg(role_message)

# call_openai is a placeholder for your own OpenAI chat-completion call
response = call_openai(thread.to_openai())
response_msg = RoleMessage.from_openai(response["choices"][0]["message"])

saved_prompt = Prompt(thread, response_msg, namespace="foo")
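The `image_url` content part above expects a base64 data URL. A minimal sketch of building one with only the standard library (the placeholder bytes stand in for a real JPEG read from disk):

```python
import base64

image_bytes = b"\xff\xd8\xff"  # placeholder: the JPEG magic bytes, not a full image
b64 = base64.b64encode(image_bytes).decode("ascii")
data_url = f"data:image/jpeg;base64,{b64}"
print(data_url)  # data:image/jpeg;base64,/9j/
```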


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mllm-0.1.1.tar.gz (7.8 kB view details)

Uploaded Source

Built Distribution

mllm-0.1.1-py3-none-any.whl (9.1 kB view details)

Uploaded Python 3

File details

Details for the file mllm-0.1.1.tar.gz.

File metadata

  • Download URL: mllm-0.1.1.tar.gz
  • Upload date:
  • Size: 7.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.10.1 Darwin/22.6.0

File hashes

Hashes for mllm-0.1.1.tar.gz
  • SHA256: 98e9a60621a1f407d6073d40ceba4861e539825279d65f42b62a9f9ce3af12de
  • MD5: a9abd061cffe9346e32d6e720256f8e4
  • BLAKE2b-256: 3c030c8b2c6a8041b84f861da911b8b1e994a54d9d9b12e0158f1e531a1c05c9

See more details on using hashes here.

File details

Details for the file mllm-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: mllm-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 9.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.10.1 Darwin/22.6.0

File hashes

Hashes for mllm-0.1.1-py3-none-any.whl
  • SHA256: 0282aad55e82c0ae77a0b0bfdb8869286271a2140ca42a453ef204eea48b7c57
  • MD5: 50ad982ca9eb474833c9a44ae8ffac87
  • BLAKE2b-256: 326add78a02240fb623dee75dc97b4d9ed59383121a901d66db67ce7a371813e

See more details on using hashes here.
