
openai-functools

openai-functools is a Python library that streamlines function calling with OpenAI's gpt-3.5-turbo-0613 and gpt-4-0613 models. It automatically generates the JSON metadata these models require by wrapping your existing Python functions in a decorator, so you no longer need to create and manage those JSON structures by hand.

Why openai-functools

(Figure: example of openai-functools in action; see the repository README for the image.)

Installation

This package is hosted on PyPI and can be installed with pip:

pip install openai-functools

Alternatively, you can clone this repository and install with Poetry:

git clone https://github.com/Jakob-98/openai-functools.git
cd openai-functools
poetry install

Ensure the OPENAI_API_KEY environment variable is set.

Usage

This library is designed to streamline the usage of OpenAI's language models by simplifying the function metadata creation process. The following sections will walk you through a basic usage of openai-functools, including a traditional manual approach and our enhanced automatic approach using the openai_function decorator.

Manual Approach

Traditionally, you'd define a function, like get_current_weather, and then manually create a JSON structure that describes this function. The structure includes the function name, description, and parameters it takes, as well as the types of these parameters.

import json

def get_current_weather(location, unit="fahrenheit"):
    weather_info = {
        "location": location,
        "temperature": "72",
        "unit": unit,
        "forecast": ["sunny", "windy"],
    }
    return json.dumps(weather_info)

def run_conversation():
    messages = [{"role": "user", "content": "What's the weather like in London?"}]
    functions = [
        {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        }
    ]
    # Proceed with calling openai.ChatCompletion.create, invoking the function using the response, etc.
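
The API call itself is elided above. Assuming the legacy openai.ChatCompletion response shape (a choices list whose message may carry a function_call with a JSON-encoded arguments string), the dispatch step might look like the following sketch; the response dict here is a hard-coded stand-in for an actual API reply:

```python
import json

def get_current_weather(location, unit="fahrenheit"):
    return json.dumps({"location": location, "temperature": "72", "unit": unit})

# Simulated stand-in for a legacy openai.ChatCompletion.create response in
# which the model chose to call a function.
response = {
    "choices": [
        {
            "message": {
                "role": "assistant",
                "content": None,
                "function_call": {
                    "name": "get_current_weather",
                    "arguments": '{"location": "London"}',
                },
            }
        }
    ]
}

message = response["choices"][0]["message"]
if message.get("function_call"):
    available_functions = {"get_current_weather": get_current_weather}
    call = message["function_call"]
    func = available_functions[call["name"]]
    args = json.loads(call["arguments"])  # arguments arrive as a JSON string
    result = func(**args)
    # Append the result as a "function" role message and call the model again.
```

From here, the function result is sent back to the model in a follow-up request so it can produce a natural-language answer.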

Enhanced Automatic Approach

The openai-functools library simplifies the process by automatically generating the necessary JSON structure. You just need to import our package and wrap your function with the openai_function decorator. Here's how it works:

import json
from openai_functools import openai_function

@openai_function
def get_current_weather(location, unit="fahrenheit"):
    weather_info = {
        "location": location,
        "temperature": "72",
        "unit": unit,
        "forecast": ["sunny", "windy"],
    }
    return json.dumps(weather_info)

def run_conversation():
    messages = [{"role": "user", "content": "What's the weather like in London?"}]
    functions = [
        get_current_weather.openai_metadata
    ]

The openai_function decorator lets you focus on the logic of your function, while the tedious task of preparing function metadata is taken care of automatically.
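
To make the idea concrete, here is a hypothetical sketch of how such a decorator could derive metadata from a function's signature using the standard library's inspect module. The sketch_openai_function name and type mapping are illustrative assumptions, not the library's actual implementation:

```python
import inspect

# Illustrative mapping from Python annotations to JSON Schema types;
# unannotated parameters fall back to "string".
TYPE_MAP = {str: "string", int: "integer", float: "number", bool: "boolean"}

def sketch_openai_function(func):
    """Hypothetical sketch of an openai_function-style decorator."""
    sig = inspect.signature(func)
    properties, required = {}, []
    for name, param in sig.parameters.items():
        properties[name] = {"type": TYPE_MAP.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)  # parameters without defaults are required
    func.openai_metadata = {
        "name": func.__name__,
        "description": (func.__doc__ or "").strip(),
        "parameters": {
            "type": "object",
            "properties": properties,
            "required": required,
        },
    }
    return func

@sketch_openai_function
def get_current_weather(location, unit="fahrenheit"):
    """Get the current weather in a given location."""
    return {"location": location, "unit": unit}
```

Under this sketch, get_current_weather.openai_metadata carries the same kind of schema that the manual approach wrote out by hand: location is required, unit is optional.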

Using the Orchestrator

The orchestrator in openai-functools simplifies the task of managing multiple registered functions and automates the generation of OpenAI function descriptions. Below is a guide on how to use it.

from openai_functools import FunctionsOrchestrator

def get_current_weather(location, unit="fahrenheit"):
    """Get the current weather forecast in a given location"""
    # ... Implementation here

def get_weather_next_day(location, unit="fahrenheit"):
    """Get the weather forecast for the next day in a given location"""
    # ... Implementation here

orchestrator = FunctionsOrchestrator()
orchestrator.register_all([get_current_weather, get_weather_next_day])
# ...

Registering Functions

Functions can be registered using the register_all or register method as shown in the code snippet above. register_all accepts a list of functions, while register is used to register a single function.
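The registration pattern above can be pictured as a registry keyed by function name. The following is a minimal stand-in (the SketchOrchestrator class is hypothetical; the real FunctionsOrchestrator does considerably more):

```python
# Minimal sketch of register/register_all semantics: a dict keyed by __name__.
class SketchOrchestrator:
    def __init__(self):
        self._functions = {}

    def register(self, func):
        """Register a single function under its name."""
        self._functions[func.__name__] = func

    def register_all(self, funcs):
        """Register a list of functions."""
        for func in funcs:
            self.register(func)

def get_current_weather(location, unit="fahrenheit"):
    return {"location": location, "unit": unit}

orchestrator = SketchOrchestrator()
orchestrator.register_all([get_current_weather])
```

Keying by name is what later lets a response's function_call name be matched back to the Python callable.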

Creating and Using Function Descriptions

Function descriptions are automatically created from the registered functions using the create_function_descriptions method. These descriptions can then be passed to the OpenAI ChatCompletion.create method.

import openai

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user", "content": "What's the weather like in Boston?"}],
    functions=orchestrator.create_function_descriptions(),
    function_call="auto",
)

Calling Functions Based on the OpenAI Response

The call_function method is used to call a function based on the OpenAI response. It fetches the function call data from the response, finds the matching function from the registered functions, and calls it with the provided arguments.

function_results = orchestrator.call_function(response)

This process can be repeated for subsequent interactions with the OpenAI model, allowing easy use of multiple functions in a conversational context.

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user", "content": "What's the weather like in Boston tomorrow?"}],
    functions=orchestrator.create_function_descriptions(),
    function_call="auto",
)
function_results = orchestrator.call_function(response)

Using docstrings to enhance metadata

By adding docstrings to your functions, openai-functools can extract richer descriptions of each function and its parameters. This information is added to the OpenAI function metadata automatically and helps the model better understand your functions and their arguments.

Currently, only reStructuredText (reST) docstrings are supported by default, although this can be extended in the future (feel free to contribute!). Under the hood, we use the docstring-parser package to enable this.
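
As an illustration of what gets extracted, the sketch below pulls ":param name: description" pairs out of a reST docstring with a simple regex. This is a toy stand-in for the real parsing (which relies on docstring-parser), and parse_rest_params is a hypothetical helper:

```python
import re

def parse_rest_params(docstring):
    """Toy extractor for reST ':param name: description' lines."""
    return dict(re.findall(r":param (\w+): (.+)", docstring or ""))

def get_current_weather(location, unit="fahrenheit"):
    """Get the current weather in a given location.

    :param location: The city and state, e.g. San Francisco, CA
    :param unit: Temperature unit, either celsius or fahrenheit
    """

params = parse_rest_params(get_current_weather.__doc__)
```

Descriptions recovered this way are what end up in each property's "description" field of the generated metadata, giving the model more context than bare parameter names.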

Examples

Several examples can be found in the examples directory of this repository.

Contributing

We welcome contributions to openai-functools! Please see our contributing guide for more details.

Support

For support with openai-functools, please open an issue on this GitHub repository. We will do our best to assist you.

License

openai-functools is licensed under the MIT license. See the LICENSE file for details.

