magentic

Seamlessly integrate LLMs as Python functions

Easily integrate Large Language Models into your Python code. Simply use the @prompt decorator to create functions that return structured output from the LLM. Mix LLM queries and function calling with regular Python code to create complex logic.

magentic is

  • Compact: Query LLMs without duplicating boilerplate code.
  • Atomic: Prompts are functions that can be individually tested and reasoned about.
  • Transparent: Create "chains" using regular Python code. Define all of your own prompts.
  • Compatible: Use @prompt functions as normal functions, including with decorators like @lru_cache (see the sketch after this list).
  • Type Annotated: Works with linters and IDEs.
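
For instance, functools.lru_cache composes with a @prompt function just as it would with any other callable, so repeated calls with the same argument skip the LLM query. This is an illustrative sketch, not an example from the project's docs:

from functools import lru_cache

from magentic import prompt


@lru_cache(maxsize=None)  # cache results so identical queries hit the LLM only once
@prompt("Give a one-word synonym for {word}.")
def synonym(word: str) -> str:
    ...


synonym("happy")  # first call queries the LLM
synonym("happy")  # repeat call is served from the cache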

Continue reading for sample usage, or go straight to the examples directory.

Installation

pip install magentic

or using poetry

poetry add magentic

Configure your OpenAI API key by setting the OPENAI_API_KEY environment variable or using openai.api_key = "sk-...". See the OpenAI Python library documentation for more information.
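
For example, using the placeholder key from the text above:

import openai

openai.api_key = "sk-..."  # or set it in your shell: export OPENAI_API_KEY=sk-...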

Usage

The @prompt decorator allows you to define a template for a Large Language Model (LLM) prompt as a Python function. When this function is called, the arguments are inserted into the template, and the resulting prompt is sent to an LLM, which generates the function output.

from magentic import prompt


@prompt('Add more "dude"ness to: {phrase}')
def dudeify(phrase: str) -> str:
    ...  # No function body as this is never executed


dudeify("Hello, how are you?")
# "Hey, dude! What's up? How's it going, my man?"

The @prompt decorator will respect the return type annotation of the decorated function. This can be any type supported by pydantic, including a pydantic model.

from magentic import prompt
from pydantic import BaseModel


class Superhero(BaseModel):
    name: str
    age: int
    power: str
    enemies: list[str]


@prompt("Create a Superhero named {name}.")
def create_superhero(name: str) -> Superhero:
    ...


create_superhero("Garden Man")
# Superhero(name='Garden Man', age=30, power='Control over plants', enemies=['Pollution Man', 'Concrete Woman'])

An LLM can also decide to call functions. In this case, the @prompt-decorated function returns a FunctionCall object, which can be called to execute the function using the arguments provided by the LLM.

from typing import Literal

from magentic import prompt, FunctionCall


def activate_oven(temperature: int, mode: Literal["broil", "bake", "roast"]) -> str:
    """Turn the oven on with the provided settings."""
    return f"Preheating to {temperature} F with mode {mode}"


@prompt(
    "Prepare the oven so I can make {food}",
    functions=[activate_oven],
)
def configure_oven(food: str) -> FunctionCall[str]:
    ...


output = configure_oven("cookies!")
# FunctionCall(<function activate_oven at 0x1105a6200>, temperature=350, mode='bake')
output()
# 'Preheating to 350 F with mode bake'

Sometimes the LLM needs to make one or more function calls to generate a final answer. The @prompt_chain decorator will resolve FunctionCall objects automatically and pass the output back to the LLM to continue until the final answer is reached.

In the following example, when describe_weather is called, the LLM first calls the get_current_weather function, then uses the result to formulate the final answer, which is returned.

from magentic import prompt_chain


def get_current_weather(location: str, unit: str = "fahrenheit") -> dict:
    """Get the current weather in a given location."""
    # Pretend to query an API
    return {
        "location": location,
        "temperature": "72",
        "unit": unit,
        "forecast": ["sunny", "windy"],
    }


@prompt_chain(
    "What's the weather like in {city}?",
    functions=[get_current_weather],
)
def describe_weather(city: str) -> str:
    ...


describe_weather("Boston")
# 'The current weather in Boston is 72°F and it is sunny and windy.'

LLM-powered functions created using @prompt and @prompt_chain can be supplied as functions to other @prompt/@prompt_chain decorators, just like regular Python functions. This enables increasingly complex LLM-powered functionality, while allowing individual components to be tested and improved in isolation.
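
As a hedged sketch of this composition (the function names here are invented for illustration), a @prompt function can appear in the functions list of a @prompt_chain just like a plain Python function:

from magentic import prompt, prompt_chain


@prompt("Suggest a dinner idea that uses {ingredient}.")
def suggest_dinner(ingredient: str) -> str:
    ...


@prompt_chain(
    "Plan a meal from what's in the fridge: {contents}",
    functions=[suggest_dinner],
)
def plan_meal(contents: str) -> str:
    ...


plan_meal("eggs, spinach, cheddar")
# e.g. 'How about a spinach and cheddar omelette? ...'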

See the examples directory for more.

Additional Features

  • The @prompt decorator can also be used with async function definitions, which enables making concurrent queries to the LLM.
  • The Annotated type annotation can be used to provide descriptions and other metadata for function parameters. See the pydantic documentation on using Field to describe function arguments.
  • The @prompt and @prompt_chain decorators also accept a model argument. You can pass an instance of OpenaiChatModel (from magentic.chat_model.openai_chat_model) to use GPT-4 or configure a different temperature. The sketches after this list illustrate these features.
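
The sketches below are illustrative only: the function names are invented, and the OpenaiChatModel keyword arguments are assumed from the description above rather than confirmed API. The first combines an Annotated parameter description with an explicit model:

from typing import Annotated

from pydantic import Field

from magentic import FunctionCall, prompt
from magentic.chat_model.openai_chat_model import OpenaiChatModel


def set_thermostat(
    temperature: Annotated[int, Field(description="Target temperature in Fahrenheit")],
) -> str:
    """Set the thermostat to the given temperature."""
    return f"Thermostat set to {temperature} F"


@prompt(
    "Make it {adjective} in here.",
    functions=[set_thermostat],
    model=OpenaiChatModel(model="gpt-4", temperature=0.5),  # keyword names assumed
)
def adjust_climate(adjective: str) -> FunctionCall[str]:
    ...

The second reuses the dudeify prompt from above as an async definition, so several queries can run concurrently:

import asyncio

from magentic import prompt


@prompt('Add more "dude"ness to: {phrase}')
async def dudeify_async(phrase: str) -> str:
    ...


# asyncio.gather runs the two LLM queries concurrently.
async def main() -> None:
    results = await asyncio.gather(
        dudeify_async("Hello, how are you?"),
        dudeify_async("See you later!"),
    )
    print(results)


asyncio.run(main())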

Type Checking

Many type checkers will raise warnings or errors for functions decorated with @prompt because the function has no body or return value. There are several ways to deal with this.

  1. Disable the check globally for the type checker. For example, in mypy, disable the error code empty-body.
    # pyproject.toml
    [tool.mypy]
    disable_error_code = ["empty-body"]
    
  2. Make the function body ... (this does not satisfy mypy) or raise an exception such as NotImplementedError.
    @prompt("Choose a color")
    def random_color() -> str:
        ...
    
  3. Add the comment # type: ignore[empty-body] on each function. In this case you can add a docstring instead of ....
    @prompt("Choose a color")
    def random_color() -> str:  # type: ignore[empty-body]
        """Returns a random color."""
    
