
Python LLM Function Client

The purpose of this library is to simplify using function calling with OpenAI-like API clients. Traditionally, you would have to rewrite your functions as JSON Schema and write logic to handle the tool calls in responses. With this library, you can convert Python functions into JSON Schema by simply calling to_tool(func), or you can create an instance of FunctionClient, which handles those tool calls for you and passes back a response once the tool call chain is finished.
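
The package is published on PyPI as llmfunctionclient, so installing it should just be:

pip install llmfunctionclient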

Requirements for Functions

Functions used with this library must have a type annotation for each parameter. An annotation for the return type is not required. Currently, the supported parameter types are str, int, StrEnum, and IntEnum. If the type is a StrEnum or IntEnum, the valid values will be included as part of the function tool spec.
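
For illustration, here is a hypothetical function whose unit parameter is a StrEnum (the names here are made up for the example; note that StrEnum requires Python 3.11+):

from enum import StrEnum

class Unit(StrEnum):
  CELSIUS = "celsius"
  FAHRENHEIT = "fahrenheit"

def get_temperature(location: str, unit: Unit):
  """
  Gets the temperature

  location: where to get the temperature for
  unit: which unit to report the temperature in
  """
  return f"The temperature in {location} is 75 degrees {unit}"

Because unit is a StrEnum, the generated tool spec should list "celsius" and "fahrenheit" as the valid values for that parameter.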

Optionally, you can include a docstring to add descriptions. The first line of the docstring is treated as the description of the function. Subsequent lines should be of the format <parameter_name>: <description>.

For example:

def get_weather(location: str):
  """
  Gets the weather

  location: where to get the forecast for
  """
  return f"The weather in {location} is 75 degrees"

This function will have "Gets the weather" as the function description, and the location parameter will have the description "where to get the forecast for".

FunctionClient

The FunctionClient class abstracts away the logic of passing along tool calls: it takes a list of functions that the LLM client is allowed to call, then runs any tool calls required by the LLM client's responses until only a text response remains.

from llmfunctionclient import FunctionClient
from openai import OpenAI

def get_weather(location: str):
  """
  Gets the weather

  location: where to get the forecast for
  """
  return f"The weather in {location} is 75 degrees"

client = FunctionClient(OpenAI(), "gpt-3.5-turbo", [get_weather])
client.add_message("You are a helpful weather assistant.", "system")
response = client.send_message("What's the weather in LA?", "user")
print(response) # "The current weather in Los Angeles is 75 degrees"

When this is run, the following happens under the hood (a sketch of this loop follows the list):

  1. The two messages specified here are submitted to the LLM client.
  2. The LLM client responds with a tool call for "get_weather".
  3. The get_weather function is called and the result is appended as a message.
  4. The LLM client is called again with the function result.
  5. The LLM client responds with an informed answer.
  6. This response text is passed back.
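
For readers curious what that loop looks like, here is a minimal sketch of the same flow written directly against the OpenAI client. This is an illustration of the pattern, not the library's actual implementation; it reuses the get_weather function above and assumes to_tool's output is accepted by the tools parameter:

import json

from llmfunctionclient import to_tool
from openai import OpenAI

openai_client = OpenAI()
functions = {"get_weather": get_weather}
messages = [
  {"role": "system", "content": "You are a helpful weather assistant."},
  {"role": "user", "content": "What's the weather in LA?"},
]

while True:
  completion = openai_client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=messages,
    tools=[to_tool(f) for f in functions.values()],
  )
  message = completion.choices[0].message
  if not message.tool_calls:
    break  # no tool calls left; the model answered in plain text
  messages.append(message)  # keep the assistant's tool call in the history
  for tool_call in message.tool_calls:
    args = json.loads(tool_call.function.arguments)
    result = functions[tool_call.function.name](**args)
    # Feed the function result back so the model can use it
    messages.append({
      "role": "tool",
      "tool_call_id": tool_call.id,
      "content": str(result),
    })

print(message.content)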

to_tool

If you want to keep using any other LLM client and just need the ability to convert Python functions into JSON Schema compatible with the function calling spec, you can import the to_tool function and call it on your function.

Example:

def get_weather(location: str):
  """
  Gets the weather

  location: where to get the forecast for
  """
  return f"The weather in {location} is 75 degrees"

Calling to_tool(get_weather) returns the following object:

{'type': 'function',
 'function': {'name': 'get_weather',
  'parameters': {'type': 'object',
   'properties': {'location': {'type': 'string',
     'description': 'where to get the forecast for'}},
   'required': ['location']}},
 'description': 'Gets the weather'}

This can then be used with the normal OpenAI client like this:

from llmfunctionclient import to_tool
from openai import OpenAI

client = OpenAI()
messages = [{"role": "user", "content": "What's the weather like in Boston today?"}]
completion = client.chat.completions.create(
  model="gpt-3.5-turbo",
  messages=messages,
  tools=[to_tool(get_weather)],
  tool_choice="auto"
)
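
Since to_tool leaves the tool-call handling to you, the completion may come back with tool calls rather than text. A minimal sketch of inspecting them with the standard OpenAI response objects:

message = completion.choices[0].message
if message.tool_calls:
  for tool_call in message.tool_calls:
    print(tool_call.function.name)       # e.g. "get_weather"
    print(tool_call.function.arguments)  # e.g. '{"location": "Boston"}'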
