
llm_function is a decorator that uses an LLM to assemble reusable workflows from available tools, matching the declared input and output models.

Project description

llm_function

llm_function helps you build reusable LLM functions with normal Python signatures.

You define a function with Pydantic input and output models, describe what it should do in the docstring, and provide a set of available tools. At runtime, llm_function uses workflow_auto_assembler to assemble and execute a workflow that satisfies that typed function contract.

The result is an LLM-backed function that can be reused like any other Python function, while still being grounded in explicit tools, schemas, and config.

Tool definition and discovery live in llm_function_tools.

The currently intended usage pattern for llm_function:

  • define typed tools
  • mark them with the llm_tool decorator from llm_function_tools
  • pass them into llm_function through a tool source
  • bundle runtime settings and tool sources into a config
  • call the decorated function like a normal Python function

from pathlib import Path
from tempfile import TemporaryDirectory

from pydantic import BaseModel, Field

from llm_function_tools import llm_tool
from llm_function import InMemoryToolSource, PythonFileToolSource, LlmFunctionConfig, LlmRuntimeConfig, llm_function

Define available tools

The runtime can assemble workflows only from tools you expose through tool sources. Tools can be defined directly in the current notebook or kept in a separate .py file.

class GetWeatherInput(BaseModel):
    city: str = Field(..., description="City name.")


class GetWeatherOutput(BaseModel):
    forecast: str = Field(..., description="Weather forecast for the city.")


@llm_tool(tags=["weather"])
def get_weather(inputs: GetWeatherInput) -> GetWeatherOutput:
    """Get current weather for a city."""
    return GetWeatherOutput(forecast=f"Sunny in {inputs.city}")


tool_sources = [InMemoryToolSource([get_weather])]

You can also keep tools in a separate .py file and load them with PythonFileToolSource:

tmp_dir = TemporaryDirectory()
tool_file = Path(tmp_dir.name) / "weather_tools.py"

tool_file.write_text(
    """
from pydantic import BaseModel, Field
from llm_function_tools import llm_tool


class GetWeatherInput(BaseModel):
    city: str = Field(..., description='City name.')


class GetWeatherOutput(BaseModel):
    forecast: str = Field(..., description='Weather forecast for the city.')


@llm_tool(tags=['weather'])
def get_weather(inputs: GetWeatherInput) -> GetWeatherOutput:
    '''Get current weather for a city.'''
    return GetWeatherOutput(forecast=f'Sunny in {inputs.city}')
""".strip()
)

file_tool_sources = [PythonFileToolSource(str(tool_file))]
file_tool_sources
[PythonFileToolSource(file_path='/tmp/tmpva36jgth/weather_tools.py', include_plain_typed=False, location_type='local', package_name=None, package_version=None, module_name=None, loggerLvl=20, logger_name=None, logger_format='%(levelname)s:%(name)s:%(message)s')]
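PythonFileToolSource handles the import for you. As a rough illustration of the mechanism, a .py file can be imported by path with the standard library's importlib (a hypothetical sketch, not the package's actual implementation):

```python
# Hypothetical sketch of what a file-based tool source does under the hood:
# import a .py file by path and pick up the functions it defines.
# (Illustration only; not PythonFileToolSource's actual implementation.)
import importlib.util
from pathlib import Path
from tempfile import TemporaryDirectory

def load_module_from_path(path: Path):
    """Import a Python module directly from a file path."""
    spec = importlib.util.spec_from_file_location(path.stem, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module

with TemporaryDirectory() as tmp:
    tool_file = Path(tmp) / "weather_tools.py"
    tool_file.write_text(
        "def get_weather(city):\n"
        "    return f'Sunny in {city}'\n"
    )
    mod = load_module_from_path(tool_file)
    forecast = mod.get_weather("Wrocław")

print(forecast)  # Sunny in Wrocław
```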

Create reusable config

Bundle runtime settings and tool sources once, then reuse that config across multiple decorated functions.

runtime_config = LlmRuntimeConfig(
    llm_handler_params={
        "llm_h_type": "ollama",
        "llm_h_params": {
            "connection_string": "http://localhost:11434",
            "model_name": "gpt-oss:20b",
        },
    },
    storage_path="/tmp",
)

llm_config = LlmFunctionConfig(
    runtime=runtime_config,
    tool_sources=tool_sources,
)

llm_config
LlmFunctionConfig(runtime=LlmRuntimeConfig(llm_handler_params={'llm_h_type': 'ollama', 'llm_h_params': {'connection_string': 'http://localhost:11434', 'model_name': 'gpt-oss:20b'}}, storage_path='/tmp', force_replan=False, max_retry=None, reset_loops=None, compare_params=None, test_params=None), tool_sources=[InMemoryToolSource(tools=[<function get_weather at 0x7b8c78355e10>], location_type='local', package_name=None, package_version=None, origin_ref=None, loggerLvl=20, logger_name=None, logger_format='%(levelname)s:%(name)s:%(message)s')], tool_registry=None)

Define workflow input and output schemas

The decorated function body is unused. Its signature and docstring define the target typed function contract.
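To make the contract idea concrete, here is a small stdlib-only sketch (hypothetical, not the library's internals) of how a decorator can read such a contract from a function's annotations and docstring:

```python
# Hypothetical sketch: extract the typed contract an llm_function-style
# decorator sees from a function's annotations and docstring.
# (Illustration only; not the library's actual internals.)
import inspect
from typing import get_type_hints

def describe_contract(fn):
    """Return the goal (docstring) plus input/output annotations of `fn`."""
    hints = get_type_hints(fn)
    output_model = hints.pop("return", None)
    return {
        "goal": inspect.getdoc(fn),
        "input_models": hints,       # parameter name -> annotated type
        "output_model": output_model,
    }

class City: ...
class Forecast: ...

def example(inputs: City) -> Forecast:
    """Get weather for the provided city."""
    pass

contract = describe_contract(example)
print(contract["goal"])                   # Get weather for the provided city.
print(contract["output_model"].__name__)  # Forecast
```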

class WfInputs(BaseModel):
    city: str = Field(..., description="Name of the city.")


class WfOutputs(BaseModel):
    city: str = Field(..., description="Name of the city.")
    summary: str = Field(..., description="Forecast summary for the user.")


@llm_function(config=llm_config)
def get_city_weather(inputs: WfInputs) -> WfOutputs:
    """
    Get weather for the provided city and prepare a short user-facing forecast summary.
    """
    pass

Call the generated typed function

On each call, the decorator creates a WorkflowAutoAssembler, resolves tools from the config, calls actualize_workflow(...), and returns the typed output.
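The "returns the typed output" step can be pictured with a stdlib-only sketch that checks a backend's result against the declared return annotation (hypothetical; the real decorator assembles and runs a workflow rather than calling a backend directly):

```python
# Hypothetical sketch: run some backend for a typed function and verify the
# result against the declared return annotation before handing it back.
# (Illustration only; not llm_function's actual call path.)
import functools
from typing import get_type_hints

def enforce_output_type(fn, execute_workflow):
    """Wrap `fn`: each call runs `execute_workflow` and type-checks its result."""
    out_model = get_type_hints(fn).get("return")

    @functools.wraps(fn)
    def wrapper(inputs):
        result = execute_workflow(inputs)
        if out_model is not None and not isinstance(result, out_model):
            raise TypeError(
                f"workflow returned {type(result).__name__}, "
                f"expected {out_model.__name__}"
            )
        return result

    return wrapper

class WeatherReport:
    def __init__(self, summary: str):
        self.summary = summary

def target(city: str) -> WeatherReport:
    pass  # body unused, as with llm_function

wrapped = enforce_output_type(target, lambda city: WeatherReport(f"Sunny in {city}"))
print(wrapped("Wrocław").summary)  # Sunny in Wrocław
```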

result = get_city_weather(WfInputs(city="Wrocław"))
result
WfOutputs(city='Wrocław', summary='Sunny in Wrocław')

Download files


Source Distribution

llm_function-0.0.2.tar.gz (673.1 kB)

Uploaded Source

Built Distribution


llm_function-0.0.2-py3-none-any.whl (741.1 kB)

Uploaded Python 3

File details

Details for the file llm_function-0.0.2.tar.gz.

File metadata

  • Download URL: llm_function-0.0.2.tar.gz
  • Size: 673.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.20

File hashes

Hashes for llm_function-0.0.2.tar.gz
Algorithm Hash digest
SHA256 8dee9f9738d97358177146cba5b62b077f6b972fa786b5163773a7d8a90758ba
MD5 5af4abd34072cbf52ea09af465fa9164
BLAKE2b-256 38829d41008e5bfdcbdd22cd43241b6cbff02926808d40c3aacd4a465165678b


File details

Details for the file llm_function-0.0.2-py3-none-any.whl.

File metadata

  • Download URL: llm_function-0.0.2-py3-none-any.whl
  • Size: 741.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.20

File hashes

Hashes for llm_function-0.0.2-py3-none-any.whl
Algorithm Hash digest
SHA256 d23314492c47defbaef633e45c16446abc8ab124a506d671b57e92c3eeaf946d
MD5 3a9aebca356b3b273133d949415cdbf5
BLAKE2b-256 16628b5077df3227b909b9c7646fcd36858ccb291cf23066f469f9b4858a25a3

