An extension for TRAC D.A.P. that lets models connect to OpenAI endpoints

OpenAI Extension for the TRAC Model Runtime

This extension makes the OpenAI Python SDK available to use from inside a TRAC model.

  • Use the native OpenAI client classes directly in TRAC model code
  • Connection settings managed by TRAC for both local and deployed models
  • Supports both OpenAI and AzureOpenAI clients

Models that make external calls are not repeatable, and will be flagged as such when they run on the TRAC platform.

This extension is a pre-release and will be finalized in TRAC 0.10.

Installing

The OpenAI extension can be installed with pip:

$ pip install tracdap-ext-openai

The package has the following dependencies:

  • tracdap-runtime (version 0.10.0-beta1 or later)
  • openai (version 1.x)

Using the OpenAI client

Here is a minimal working example of a TRAC model using the OpenAI client:

import tracdap.rt.api as trac
import openai

class OpenAIModel(trac.TracModel):

    # ... define parameters, inputs and outputs

    def define_resources(self):

        return {
            "openai": trac.define_external_system("openai", openai.OpenAI),
        }

    def run_model(self, ctx: trac.TracContext):

        with ctx.get_external_system("openai", openai.OpenAI) as client:

            response = client.responses.create(
                model="gpt-4o",
                instructions="You are a coding assistant that talks like a pirate.",
                input="How do I check if a Python object is an instance of a class?",
            )

            ctx.log.info(response.output_text)

if __name__ == '__main__':
    import tracdap.rt.launch as launch
    launch.launch_model(OpenAIModel, "config/job_config.yaml", "config/sys_config.yaml")

To make this example work, you will need to add openai as a resource in the system config file:

resources:

  openai:
    resourceType: EXTERNAL_SYSTEM
    protocol: openai

The client can be customized by setting additional properties on the resource, which are passed through to the OpenAI client.

resources:

  openai:
    resourceType: EXTERNAL_SYSTEM
    protocol: openai
    properties:
      project: proj_xxxxxxxxxxxxx

The following configuration properties are supported:

  • api_key, string, required
  • organization, string, optional
  • project, string, optional
  • base_url, string, default = https://api.openai.com/v1/
  • timeout, float, default = openai.DEFAULT_TIMEOUT.read (currently 600 seconds)
  • max_retries, int, default = openai.DEFAULT_MAX_RETRIES (currently 2)

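Putting these together, a fuller resource definition might look like the sketch below. All the property values shown here are placeholders, not working settings; api_key is deliberately omitted (see the note on environment variables that follows).

```yaml
resources:

  openai:
    resourceType: EXTERNAL_SYSTEM
    protocol: openai
    properties:
      organization: org-xxxxxxxxxxxxx      # optional, placeholder value
      project: proj_xxxxxxxxxxxxx          # optional, placeholder value
      base_url: https://api.openai.com/v1/ # the default endpoint
      timeout: 30.0                        # seconds, overrides the SDK default
      max_retries: 3                       # overrides the SDK default of 2
```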
The api_key should not be put into a config file in plain text. For local development, it is recommended to set the OPENAI_API_KEY environment variable instead. If both the config property and the environment variable are set, the config property takes precedence.
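For local development, supplying the key through the environment might look like this minimal shell sketch (the key value is a placeholder):

```shell
# Export the key for the current shell session (placeholder value)
export OPENAI_API_KEY="sk-your-key-here"

# Confirm the variable is visible without printing the secret itself
echo "OPENAI_API_KEY is set: ${OPENAI_API_KEY:+yes}"
```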

Using the AzureOpenAI client

Here is a minimal working example of a TRAC model using the AzureOpenAI client. This assumes the required resources and deployments have been set up in Azure.

import tracdap.rt.api as trac
import openai

class TestModel(trac.TracModel):

    # ... define parameters, inputs and outputs

    def define_resources(self):

        return {
            "openai_azure": trac.define_external_system("openai", openai.AzureOpenAI)
        }

    def run_model(self, ctx: trac.TracContext):

        with ctx.get_external_system("openai_azure", openai.AzureOpenAI) as client:

            completion = client.chat.completions.create(
                model="gpt-4.1-mini",
                messages=[
                    { "role": "system", "content": "You are a coding assistant that talks like a pirate."},
                    { "role": "user", "content": "How do I check if a Python object is an instance of a class?" },
                ]
            )

            ctx.log.info(completion.choices[0].message.content)

if __name__ == '__main__':
    import tracdap.rt.launch as launch
    launch.launch_model(TestModel, "config/job_config.yaml", "config/sys_config.yaml")

To make this example work, you will need to add openai_azure as a resource in the system config file:

resources:

  openai_azure:
    resourceType: EXTERNAL_SYSTEM
    protocol: openai
    subProtocol: azure
    properties:
      api_version: 2025-04-01-preview
      azure_endpoint: https://my-azure-endpoint.cognitiveservices.azure.com/

Setting subProtocol: azure is required to create an Azure client. The api_version and azure_endpoint properties must be specified, and the model parameter in the client call must refer to a live model deployment on that endpoint.

All the configuration properties supported by the regular client are also supported by the Azure client. Additionally, the Azure client supports these extra properties:

  • api_version, string, required
  • azure_endpoint, string, required
  • azure_deployment, string, optional
  • azure_ad_token, string, optional

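Combining the required and optional Azure properties, a resource definition might look like this sketch. The endpoint and deployment names are placeholders, and keys or tokens are better supplied via environment variables than written into the file.

```yaml
resources:

  openai_azure:
    resourceType: EXTERNAL_SYSTEM
    protocol: openai
    subProtocol: azure
    properties:
      api_version: 2025-04-01-preview
      azure_endpoint: https://my-azure-endpoint.cognitiveservices.azure.com/
      azure_deployment: my-gpt-deployment   # optional, placeholder name
```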
For the Azure client, if api_key is not specified in the config file it is read from the environment variable AZURE_OPENAI_API_KEY. Similarly, azure_ad_token can be read from the environment variable AZURE_OPENAI_AD_TOKEN. If both the config property and the environment variable are set, the config property takes precedence.
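Setting the Azure credentials through the environment might look like this shell sketch (both values are placeholders; normally only one of the two is needed, depending on whether key-based or Entra ID authentication is used):

```shell
# Placeholder values; the extension falls back to these variables
# when api_key / azure_ad_token are not set in the config file
export AZURE_OPENAI_API_KEY="azure-key-placeholder"
export AZURE_OPENAI_AD_TOKEN="azure-ad-token-placeholder"

echo "Azure OpenAI credentials configured"
```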

