tracdap

OpenAI Extension for the TRAC Model Runtime

This extension makes the OpenAI Python SDK available to use from inside a TRAC model.

  • Use the native OpenAI client classes directly in TRAC model code
  • Connection settings managed by TRAC for both local and deployed models
  • Supports both OpenAI and AzureOpenAI clients

Models that make external calls are not considered repeatable, and will be flagged as not repeatable when they run on the TRAC platform.

This extension is a pre-release and will be finalized in TRAC 0.10.

Installing

The OpenAI extension can be installed with pip:

$ pip install tracdap-ext-openai

The package has the following dependencies:

  • tracdap-runtime (version 0.10.0-beta1 or later)
  • openai (version 1.x)

Using the OpenAI client

Here is a minimal working example of a TRAC model using the OpenAI client:

import tracdap.rt.api as trac
import openai

class OpenAIModel(trac.TracModel):

    # ... define parameters, inputs and outputs

    def define_resources(self):

        return {
            "openai": trac.define_external_system("openai", openai.OpenAI),
        }

    def run_model(self, ctx: trac.TracContext):

        with ctx.get_external_system("openai", openai.OpenAI) as client:

            response = client.responses.create(
                model="gpt-4o",
                instructions="You are a coding assistant that talks like a pirate.",
                input="How do I check if a Python object is an instance of a class?",
            )

            ctx.log.info(response.output_text)

if __name__ == '__main__':
    import tracdap.rt.launch as launch
    launch.launch_model(OpenAIModel, "config/job_config.yaml", "config/sys_config.yaml")

To make this example work, you will need to add openai as a resource in the system config file:

resources:

  openai:
    resourceType: EXTERNAL_SYSTEM
    protocol: openai

The client can be customized by setting additional properties on the resource, which are passed through to the OpenAI client.

resources:

  openai:
    resourceType: EXTERNAL_SYSTEM
    protocol: openai
    properties:
      project: proj_xxxxxxxxxxxxx

The following configuration properties are supported:

  • api_key, string, required
  • organization, string, optional
  • project, string, optional
  • base_url, string, default = https://api.openai.com/v1/
  • timeout, float, default = openai.DEFAULT_TIMEOUT.read (currently 600 seconds)
  • max_retries, int, default = openai.DEFAULT_MAX_RETRIES (currently 2)
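
For reference, a resource entry setting several of these properties might look like the following sketch (all values are illustrative, and api_key is better supplied via the environment, as described below):

```yaml
resources:

  openai:
    resourceType: EXTERNAL_SYSTEM
    protocol: openai
    properties:
      organization: org-xxxxxxxxxxxxx
      project: proj_xxxxxxxxxxxxx
      base_url: https://api.openai.com/v1/
      timeout: 120.0
      max_retries: 3
```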

The api_key should not be stored in a config file in plain text; for local development it is recommended to set the OPENAI_API_KEY environment variable instead. If both the config property and the environment variable are set, the config property takes precedence.
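
For example, on Linux or macOS the key can be exported in the shell before launching a local run (the value shown is a placeholder):

```shell
# Set the OpenAI API key in the environment for local development,
# rather than writing it into the config file (placeholder value shown)
export OPENAI_API_KEY="sk-proj-xxxxxxxxxxxxx"
```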

Using the AzureOpenAI client

Here is a minimal working example of a TRAC model using the AzureOpenAI client. This assumes the required resources and deployments have already been set up in Azure.

import tracdap.rt.api as trac
import openai

class TestModel(trac.TracModel):

    # ... define parameters, inputs and outputs

    def define_resources(self):

        return {
            "openai_azure": trac.define_external_system("openai", openai.AzureOpenAI)
        }

    def run_model(self, ctx: trac.TracContext):

        with ctx.get_external_system("openai_azure", openai.AzureOpenAI) as client:

            completion = client.chat.completions.create(
                model="gpt-4.1-mini",
                messages=[
                    { "role": "system", "content": "You are a coding assistant that talks like a pirate."},
                    { "role": "user", "content": "How do I check if a Python object is an instance of a class?" },
                ]
            )

            ctx.log.info(completion.choices[0].message.content)

if __name__ == '__main__':
    import tracdap.rt.launch as launch
    launch.launch_model(TestModel, "config/job_config.yaml", "config/sys_config.yaml")

To make this example work, you will need to add openai_azure as a resource in the system config file:

resources:

  openai_azure:
    resourceType: EXTERNAL_SYSTEM
    protocol: openai
    subProtocol: azure
    properties:
      api_version: 2025-04-01-preview
      azure_endpoint: https://my-azure-endpoint.cognitiveservices.azure.com/

Setting subProtocol: azure is required to create an Azure client. The api_version and azure_endpoint properties must be specified, and the model parameter in the client call must refer to a live model deployment on that endpoint.

All the configuration properties supported by the regular client are also supported by the Azure client. Additionally, the Azure client supports these extra properties:

  • api_version, string, required
  • azure_endpoint, string, required
  • azure_deployment, string, optional
  • azure_ad_token, string, optional
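
Putting these together, a resource entry targeting a specific Azure deployment might look like this sketch (the endpoint and deployment names are illustrative):

```yaml
resources:

  openai_azure:
    resourceType: EXTERNAL_SYSTEM
    protocol: openai
    subProtocol: azure
    properties:
      api_version: 2025-04-01-preview
      azure_endpoint: https://my-azure-endpoint.cognitiveservices.azure.com/
      azure_deployment: my-gpt-deployment
```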

For the Azure client, if api_key is not specified in the config file it is read from the environment variable AZURE_OPENAI_API_KEY. Similarly, azure_ad_token can be read from the environment variable AZURE_OPENAI_AD_TOKEN. If both the config property and the environment variable are set, the config property takes precedence.
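
As with the standard client, these can be exported in the shell for local development (both values are placeholders):

```shell
# Azure credentials read from the environment when not set in the config file
# (placeholder values shown)
export AZURE_OPENAI_API_KEY="xxxxxxxxxxxxxxxx"
export AZURE_OPENAI_AD_TOKEN="eyXxxxxxxxxxxxx"
```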
