A simple yet powerful abstraction for litellm and pydantic
promptic

promptic is a lightweight, decorator-based Python library that simplifies interacting with large language models (LLMs) via litellm. With promptic, you can effortlessly create prompts, handle input arguments, and receive structured outputs from LLMs, all in under 100 lines of code.
Installation
pip install promptic
Usage
Simple Prompt
from promptic import promptic

@promptic
def us_president(year):
    """Who was the President of the United States in {year}?"""

print(us_president(2000))
# The President of the United States in 2000 was Bill Clinton until January 20th, when George W. Bush was inaugurated as the 43rd President.
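The docstring-as-prompt mechanic can be sketched in plain Python. This is a hypothetical re-implementation of the interpolation step only (simple_promptic and fake_completion are illustrative names, not promptic's API), with a stub in place of the real litellm call:

```python
import functools
import inspect

def fake_completion(prompt):
    # Stub standing in for a real LLM call (promptic itself uses litellm.completion)
    return f"[model response to]: {prompt}"

def simple_promptic(fn):
    """Hypothetical sketch: turn a function's docstring into a prompt."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        # Bind the call's arguments to parameter names so {name} placeholders resolve
        bound = inspect.signature(fn).bind(*args, **kwargs)
        bound.apply_defaults()
        prompt = fn.__doc__.format(**bound.arguments)
        return fake_completion(prompt)
    return wrapper

@simple_promptic
def us_president(year):
    """Who was the President of the United States in {year}?"""

print(us_president(2000))
# [model response to]: Who was the President of the United States in 2000?
```

The sketch binds positional and keyword arguments by name before formatting, which is why {year} resolves even though the caller passed 2000 positionally.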
Structured Output with Pydantic
from pydantic import BaseModel
from promptic import promptic

class Capital(BaseModel):
    country: str
    capital: str

@promptic
def get_capital(country) -> Capital:
    """What's the capital of {country}?"""

print(get_capital("France"))
# country='France' capital='Paris'
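Conceptually, the structured-output step comes down to validating the model's JSON reply against the annotated return type. A minimal sketch of just that validation step, assuming a pydantic v2 environment and a hard-coded stand-in for the LLM's reply:

```python
from pydantic import BaseModel

class Capital(BaseModel):
    country: str
    capital: str

# Stand-in for the LLM's reply; a real run would return JSON matching the schema
raw_response = '{"country": "France", "capital": "Paris"}'

# Validate the JSON against the declared schema (pydantic v2 API);
# a malformed or mistyped reply would raise a ValidationError here
capital = Capital.model_validate_json(raw_response)
print(capital)
# country='France' capital='Paris'
```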
Streaming Response (and litellm integration)
from promptic import promptic

@promptic(
    # keyword args are passed to litellm.completion
    stream=True,
    model="claude-3-haiku-20240307",
)
def haiku(subject, adjective, verb) -> str:
    """Write a haiku about {subject} that is {adjective} and {verb}."""

print("".join(haiku("programming", "witty", "delights")))
# Bugs in the code taunt,
# Syntax errors abound, yet
# Caffeine fuels the fix.
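Because a streamed call yields chunks, you can also print them incrementally rather than joining at the end. A sketch with a hypothetical generator standing in for the decorated function:

```python
def fake_stream():
    # Stand-in for a function decorated with stream=True,
    # which yields response chunks as the model produces them
    for chunk in ["Bugs in ", "the code ", "taunt"]:
        yield chunk

collected = []
for chunk in fake_stream():
    collected.append(chunk)  # or: print(chunk, end="", flush=True) for live output

print("".join(collected))
# Bugs in the code taunt
```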
Features
- Decorator-based API: Easily define prompts using function docstrings and decorate them with @promptic.
- Argument interpolation: Automatically interpolate function arguments into the prompt using {argument_name} placeholders within docstrings.
- Pydantic model support: Specify the expected output structure using Pydantic models, and promptic will ensure the LLM's response conforms to the defined schema.
- Streaming support: Receive LLM responses in real-time by setting stream=True in the decorator.
- Simplified LLM interaction: No need to remember the exact shape of the OpenAI response object or other LLM-specific details. promptic abstracts away the complexities, allowing you to focus on defining prompts and receiving structured outputs.
Why promptic?
promptic is designed to be simple, functional, and robust, providing exactly what you need 90% of the time when working with LLMs. It eliminates the need to remember the specific shapes of OpenAI response objects or other LLM-specific details, allowing you to focus on creating prompts and receiving structured outputs.
With its legible and concise codebase, promptic is easy to understand and extend. It leverages the power of litellm under the hood, ensuring compatibility with a wide range of LLMs.
License
promptic is open-source software licensed under the Apache License 2.0.