
Function mapper for OpenAI Chat


Taifun - Typed AI Functions

A simple framework for creating typed AI functions. It inspects a function's docstring and type-annotated parameters to build the function schemas OpenAI's API expects. You can either use the Taifun class to produce a list of function schemas to pass to OpenAI's API, or use the TaifunConversationRunner class to run a whole conversation with function calls handled for you.

Installation

pip install taifun

Usage

Initialize a Taifun instance and decorate your functions with @taifun.fn()

from taifun import Taifun

taifun = Taifun()

@taifun.fn()
def weather_forecast(location: str) -> str:
    """
    Get the weather forecast for a given location

    Parameters
    ----------
    location: str
        the user's location as a City and State, e.g. San Francisco, CA

    """

    return f"The weather in {location} is rainy"

Then you can use the functions_as_dict method to get a list of function schemas to pass to OpenAI's functions field

functions = taifun.functions_as_dict()

functions is a list of JSON-schema function definitions that can be passed to OpenAI's functions field:

[
 {
  "name": "weather_forecast",
  "description": "Get the weather forecast for a given location",
  "parameters": {
   "properties": {
    "location": {
     "description": "the user's location as a City and State, e.g. San Francisco, CA",
     "title": "Location",
     "type": "string"
    }
   },
   "required": [
    "location"
   ],
   "title": "FunctionParameters",
   "type": "object"
  }
 }
]
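This is not Taifun's actual implementation, but a minimal sketch of the kind of introspection that produces a schema like the one above, assuming numpydoc-style parameter descriptions and plain `str` parameters (real handling of types, defaults, and Pydantic models would be more involved):

```python
import inspect
import re


def sketch_schema(fn):
    """Build a minimal OpenAI-style function schema from a function's
    signature and numpydoc docstring. Illustration only: assumes every
    parameter is a string with a one-line description."""
    doc = inspect.getdoc(fn) or ""
    summary = doc.split("\n")[0]
    sig = inspect.signature(fn)
    # crude numpydoc parse: a "name: type" line followed by an indented description
    descriptions = dict(re.findall(r"(\w+): \w+\n\s+(.+)", doc))
    properties = {
        name: {"type": "string", "description": descriptions.get(name, "")}
        for name in sig.parameters
    }
    return {
        "name": fn.__name__,
        "description": summary,
        "parameters": {
            "type": "object",
            "properties": properties,
            "required": list(sig.parameters),
        },
    }


def weather_forecast(location: str) -> str:
    """Get the weather forecast for a given location

    Parameters
    ----------
    location: str
        the user's location as a City and State, e.g. San Francisco, CA
    """
    return f"The weather in {location} is rainy"


schema = sketch_schema(weather_forecast)
print(schema["name"])                    # weather_forecast
print(schema["parameters"]["required"])  # ['location']
```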

Pass functions to the functions field of OpenAI's API

import openai

messages = [...]
result = openai.ChatCompletion.create(
    model="gpt-4",
    messages=messages,
    functions=functions,
    function_call="auto",
)

If the response from OpenAI's API has a function call, you can handle it with handle_function_call

function_call = result["choices"][0]["message"].get("function_call")

if function_call is not None:
    # handle the function call
    function_response = taifun.handle_function_call(function_call)
    # return the function response
    messages.append(
        {
            "role": "function",
            "name": function_call["name"],
            "content": function_response,
        }
    )
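The function_call payload the API returns is a dict with a name and a JSON-encoded arguments string; handle_function_call resolves the name against the registered functions and calls it with the decoded arguments. A rough, hypothetical sketch of that dispatch (not Taifun's code):

```python
import json

# a local registry standing in for the functions Taifun has registered
def weather_forecast(location: str) -> str:
    return f"The weather in {location} is rainy"


REGISTRY = {"weather_forecast": weather_forecast}


def dispatch(function_call: dict) -> str:
    """Resolve the model's function_call against the registry and invoke
    the matching Python function with the JSON-decoded arguments."""
    fn = REGISTRY[function_call["name"]]
    kwargs = json.loads(function_call["arguments"])
    return fn(**kwargs)


call = {"name": "weather_forecast", "arguments": '{"location": "Berlin"}'}
print(dispatch(call))  # The weather in Berlin is rainy
```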
    # send the updated messages back to the model for its final reply

Demo with functions to pass to OpenAI

import random

import openai

from taifun import Taifun

taifun = Taifun()


@taifun.fn()
def weather_forecast(location: str) -> str:
    """
    Get the weather forecast for a given location

    Parameters
    ----------
    location: str
        the location to get the weather forecast for

    """

    # random weather
    weather = random.choice(["sunny", "rainy", "cloudy", "snowy"])

    return f"The weather in {location} is {weather}"


messages = [
    {
        "role": "user",
        "content": "Is it raining in Berlin today?",
    },
]

# export functions as json schema dict for openai
functions = taifun.functions_as_dict()


result = openai.ChatCompletion.create(
    model="gpt-4",
    messages=messages,
    functions=functions,
    function_call="auto",
)
response_message = result["choices"][0]["message"]

print(f"assistant: {response_message['content']}")

function_call = response_message.get("function_call")

messages.append(response_message)
if function_call is not None:
    # handle the function call
    function_response = taifun.handle_function_call(function_call)

    # respond with the function response
    print(f"function response: {function_response}")
    messages.append(
        {
            "role": "function",
            "name": function_call["name"],
            "content": function_response,
        }
    )

    result2 = openai.ChatCompletion.create(
        model="gpt-4",
        messages=messages,
        functions=functions,
        function_call="auto",
    )
    response_message2 = result2["choices"][0]["message"]
    print(f"assistant: {response_message2['content']}")

A full example including the TaifunConversationRunner

import os

import httpx
import openai
import rich
from pydantic import BaseModel, Field
from rich.prompt import Prompt
from urllib import parse as urlparse

from taifun import Taifun, TaifunConversationRunner

taifun = Taifun()


@taifun.fn()
def get_location() -> str:
    """
    Get the user's location

    returns: the user's location like a City and State, e.g. San Francisco, CA
    """
    location = Prompt.ask("What is your location?")

    return location


@taifun.fn()
def get_lang_lat(location: str) -> dict:
    """
    Get the latitude and longitude of a location

    Parameters
    ----------
    location: str
        the user's location like a City and State, e.g. San Francisco, CA

    """

    # the documented Nominatim search endpoint takes the query as a parameter
    response = httpx.get(
        "https://nominatim.openstreetmap.org/search",
        params={
            "q": location,
            "format": "json",
        },
    )
    response.raise_for_status()
    data = response.json()
    lat = data[0]["lat"]
    lng = data[0]["lon"]

    return {"latitude": lat, "longitude": lng}


class Coordinates(BaseModel):
    latitude: float = Field(
        ..., title="Latitude", description="The latitude of a location"
    )
    longitude: float = Field(
        ..., title="Longitude", description="The longitude of a location"
    )


@taifun.fn()
def get_current_weather(coordinates: Coordinates):
    """Get the current weather in a given longitude and latitude

    Parameters
    ----------
    coordinates: Coordinates
        the latitude and longitude of a location

    Returns:
        dict: a dictionary of the current weather

    """

    response = httpx.get(
        "https://api.open-meteo.com/v1/forecast",
        params={
            "latitude": coordinates.latitude,
            "longitude": coordinates.longitude,
            "current_weather": True,
        },
    )
    response.raise_for_status()
    data = response.json()
    return data


if __name__ == "__main__":
    openai.api_key_path = os.path.expanduser("~") + "/.openai_api_key"
    runner = TaifunConversationRunner(taifun)
    result = runner.run("Will I need an umbrella today?")

    rich.print(result)
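TaifunConversationRunner loops in essentially this shape: call the model, execute any requested function, append the result as a function message, and repeat until the model replies with plain text. The runner's real internals may differ; this is a simulation of the loop with a stubbed model in place of OpenAI's API:

```python
import json


def run_conversation(model, functions, user_message):
    """Loop until the model replies with plain content instead of a
    function call. `model` is any callable taking the message list."""
    messages = [{"role": "user", "content": user_message}]
    while True:
        reply = model(messages)
        messages.append(reply)
        call = reply.get("function_call")
        if call is None:
            return reply["content"]
        # execute the requested function and feed the result back
        result = functions[call["name"]](**json.loads(call["arguments"]))
        messages.append(
            {"role": "function", "name": call["name"], "content": str(result)}
        )


# stub model: first requests the forecast, then answers using it
def stub_model(messages):
    if messages[-1]["role"] == "function":
        return {"role": "assistant",
                "content": f"Forecast: {messages[-1]['content']}"}
    return {"role": "assistant", "content": None,
            "function_call": {"name": "weather_forecast",
                              "arguments": '{"location": "Berlin"}'}}


funcs = {"weather_forecast": lambda location: f"rainy in {location}"}
print(run_conversation(stub_model, funcs, "Will I need an umbrella?"))
# Forecast: rainy in Berlin
```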
