
llm_function is a decorator that uses an LLM to assemble reusable workflows from available tools so that they satisfy the declared input and output models.

Project description

llm_function

llm_function helps you build reusable LLM functions with normal Python signatures.

You define a function with Pydantic input and output models, describe what it should do in the docstring, and provide a set of available tools. At runtime, llm_function uses workflow_auto_assembler to assemble and execute a workflow that satisfies that typed function contract.

The result is an LLM-backed function that can be reused like any other Python function, while still being grounded in explicit tools, schemas, and config.
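The "typed function contract" here is simply the function's signature plus its docstring. As a rough, stdlib-only illustration of that idea (this is not llm_function's actual implementation; `capture_contract` is an invented name), a decorator can capture such a contract like this:

```python
import inspect
from typing import get_type_hints

def capture_contract(fn):
    """Illustrative only: record the signature and docstring that a
    decorator in the style of llm_function treats as the contract."""
    hints = get_type_hints(fn)
    fn.contract = {
        "doc": inspect.getdoc(fn),
        "inputs": {k: v for k, v in hints.items() if k != "return"},
        "output": hints.get("return"),
    }
    return fn

@capture_contract
def get_city_weather(city: str) -> str:
    """Get weather for the provided city."""

print(get_city_weather.contract["output"])  # <class 'str'>
```

The real decorator hands an equivalent contract (built from Pydantic models rather than bare annotations) to the workflow assembler.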

Tool definition and discovery live in llm_function_tools.

The currently intended usage pattern for llm_function:

  • define typed tools
  • mark them with llm_function_tools
  • pass them into llm_function through a tool source
  • bundle runtime settings and tools into config
  • call the decorated function like a normal Python function
from pathlib import Path
from tempfile import TemporaryDirectory

from pydantic import BaseModel, Field

from llm_function_tools import llm_tool
from llm_function import InMemoryToolSource, PythonFileToolSource, LlmFunctionConfig, LlmRuntimeConfig, llm_function

Define available tools

The runtime can assemble workflows only from tools you expose through tool sources. Tools can be defined directly in the current notebook or kept in a separate .py file.

class GetWeatherInput(BaseModel):
    city: str = Field(..., description="City name.")


class GetWeatherOutput(BaseModel):
    forecast: str = Field(..., description="Weather forecast for the city.")


@llm_tool(tags=["weather"])
def get_weather(inputs: GetWeatherInput) -> GetWeatherOutput:
    """Get current weather for a city."""
    return GetWeatherOutput(forecast=f"Sunny in {inputs.city}")


tool_sources = [InMemoryToolSource([get_weather])]
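The tags passed to llm_tool suggest how tools can be grouped and selected. A hypothetical stdlib-only sketch of tag-based tool selection (the names `tagged_tool` and `tools_with_tag` are invented here; llm_function_tools' actual registry may work differently):

```python
# Hypothetical tag-based registry, for illustration only.
REGISTRY = []

def tagged_tool(tags):
    """Register a function together with its tags."""
    def wrap(fn):
        REGISTRY.append((fn, set(tags)))
        return fn
    return wrap

@tagged_tool(tags=["weather"])
def get_weather(city):
    return f"Sunny in {city}"

@tagged_tool(tags=["news"])
def get_news(topic):
    return f"Headlines about {topic}"

def tools_with_tag(tag):
    """Return all registered tools carrying the given tag."""
    return [fn for fn, tags in REGISTRY if tag in tags]

print([fn.__name__ for fn in tools_with_tag("weather")])  # ['get_weather']
```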

You can also keep tools in a separate .py file and load them with PythonFileToolSource:

tmp_dir = TemporaryDirectory()
tool_file = Path(tmp_dir.name) / "weather_tools.py"

tool_file.write_text(
    """
from pydantic import BaseModel, Field
from llm_function_tools import llm_tool


class GetWeatherInput(BaseModel):
    city: str = Field(..., description='City name.')


class GetWeatherOutput(BaseModel):
    forecast: str = Field(..., description='Weather forecast for the city.')


@llm_tool(tags=['weather'])
def get_weather(inputs: GetWeatherInput) -> GetWeatherOutput:
    '''Get current weather for a city.'''
    return GetWeatherOutput(forecast=f'Sunny in {inputs.city}')
""".strip()
)

file_tool_sources = [PythonFileToolSource(str(tool_file))]
file_tool_sources
[PythonFileToolSource(file_path='/tmp/tmpva36jgth/weather_tools.py', include_plain_typed=False, location_type='local', package_name=None, package_version=None, module_name=None, loggerLvl=20, logger_name=None, logger_format='%(levelname)s:%(name)s:%(message)s')]
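Under the hood, loading tools from a .py file amounts to importing that file as a module. A minimal stdlib sketch of that mechanism (PythonFileToolSource's actual loading logic may differ):

```python
import importlib.util
from pathlib import Path
from tempfile import TemporaryDirectory

def load_module_from_file(path):
    """Import a .py file as a module, the way a file-based tool source might."""
    path = Path(path)
    spec = importlib.util.spec_from_file_location(path.stem, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module

with TemporaryDirectory() as tmp:
    tool_file = Path(tmp) / "weather_tools.py"
    tool_file.write_text("def get_weather(city):\n    return f'Sunny in {city}'\n")
    module = load_module_from_file(tool_file)
    print(module.get_weather("Wrocław"))  # Sunny in Wrocław
```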

Create reusable config

Bundle runtime settings and tool sources once, then reuse that config across multiple decorated functions.

runtime_config = LlmRuntimeConfig(
    llm_handler_params={
        "llm_h_type": "ollama",
        "llm_h_params": {
            "connection_string": "http://localhost:11434",
            "model_name": "gpt-oss:20b",
        },
    },
    storage_path="/tmp",
)

llm_config = LlmFunctionConfig(
    runtime=runtime_config,
    tool_sources=tool_sources,
)

llm_config
LlmFunctionConfig(runtime=LlmRuntimeConfig(llm_handler_params={'llm_h_type': 'ollama', 'llm_h_params': {'connection_string': 'http://localhost:11434', 'model_name': 'gpt-oss:20b'}}, storage_path='/tmp', force_replan=False, max_retry=None, reset_loops=None, compare_params=None, test_params=None), tool_sources=[InMemoryToolSource(tools=[<function get_weather at 0x7b8c78355e10>], location_type='local', package_name=None, package_version=None, origin_ref=None, loggerLvl=20, logger_name=None, logger_format='%(levelname)s:%(name)s:%(message)s')], tool_registry=None)

Define workflow input and output schemas

The decorated function body is unused. Its signature and docstring define the target typed function contract.

class WfInputs(BaseModel):
    city: str = Field(..., description="Name of the city.")


class WfOutputs(BaseModel):
    city: str = Field(..., description="Name of the city.")
    summary: str = Field(..., description="Forecast summary for the user.")


@llm_function(config=llm_config)
def get_city_weather(input: WfInputs) -> WfOutputs:
    """
    Get weather for the provided city and prepare a short user-facing forecast summary.
    """
    pass

Call the generated typed function

On each call, the decorator creates a WorkflowAutoAssembler, resolves tools from the config, calls actualize_workflow(...), and returns the typed output.
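As a rough picture of that per-call flow (all names below are invented for illustration; the real decorator builds a WorkflowAutoAssembler and calls actualize_workflow(...)):

```python
# Invented sketch of the per-call flow, not the real implementation.
def llm_function_sketch(config):
    def wrap(fn):
        def call(inputs):
            # Resolve tools from the config's tool sources.
            tools = [t for source in config["tool_sources"] for t in source]
            # A real assembler would let the LLM pick and order tools here;
            # this toy just runs them in sequence.
            out = inputs
            for step in tools:
                out = step(out)
            return out
        return call
    return wrap

cfg = {"tool_sources": [[lambda city: f"Sunny in {city}"]]}

@llm_function_sketch(cfg)
def city_weather_sketch(city):
    """Get weather for the provided city."""

print(city_weather_sketch("Wrocław"))  # Sunny in Wrocław
```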

result = get_city_weather(WfInputs(city="Wrocław"))
result
WfOutputs(city='Wrocław', summary='Sunny in Wrocław')

