
An extension for TRAC D.A.P. that lets models connect to OpenAI endpoints


tracdap


OpenAI Extension for the TRAC Model Runtime

This extension makes the OpenAI Python SDK available to use from inside a TRAC model.

  • Use the native OpenAI client classes directly in TRAC model code
  • Connection settings managed by TRAC for both local and deployed models
  • Supports both OpenAI and AzureOpenAI clients

Models that make external calls are not considered repeatable, and will be flagged as not repeatable when they run on the TRAC platform.

This extension is a pre-release and will be finalized in TRAC 0.10.

Installing

The OpenAI extension can be installed with pip:

$ pip install tracdap-ext-openai

The package has the following dependencies:

  • tracdap-runtime (version 0.10.0-beta1 or later)
  • openai (version 1.x)

Using the OpenAI client

Here is a minimal working example of a TRAC model using the OpenAI client:

import tracdap.rt.api as trac
import openai

class OpenAIModel(trac.TracModel):

    # ... define parameters, inputs and outputs

    def define_resources(self):

        return {
            "openai": trac.define_external_system("openai", openai.OpenAI),
        }

    def run_model(self, ctx: trac.TracContext):

        with ctx.get_external_system("openai", openai.OpenAI) as client:

            response = client.responses.create(
                model="gpt-4o",
                instructions="You are a coding assistant that talks like a pirate.",
                input="How do I check if a Python object is an instance of a class?",
            )

            ctx.log.info(response.output_text)

if __name__ == '__main__':
    import tracdap.rt.launch as launch
    launch.launch_model(OpenAIModel, "config/job_config.yaml", "config/sys_config.yaml")

To make this example work, you will need to add openai as a resource in the system config file:

resources:

  openai:
    resourceType: EXTERNAL_SYSTEM
    protocol: openai

The client can be customized by setting additional properties on the resource, which are passed through to the OpenAI client.

resources:

  openai:
    resourceType: EXTERNAL_SYSTEM
    protocol: openai
    properties:
      project: proj_xxxxxxxxxxxxx

The following configuration properties are supported:

  • api_key, string, required
  • organization, string, optional
  • project, string, optional
  • base_url, string, default = https://api.openai.com/v1/
  • timeout, float, default = openai.DEFAULT_TIMEOUT.read (currently 600 seconds)
  • max_retries, int, default = openai.DEFAULT_MAX_RETRIES (currently 2)
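For illustration, a resource entry that sets several of these properties together might look like the following sketch (the project ID, timeout and retry values are placeholders, not recommendations):

```yaml
resources:

  openai:
    resourceType: EXTERNAL_SYSTEM
    protocol: openai
    properties:
      project: proj_xxxxxxxxxxxxx
      base_url: https://api.openai.com/v1/
      timeout: 120.0      # seconds, placeholder value
      max_retries: 3      # placeholder value
```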

The api_key should not be put into a config file in plain text. For local development, it is recommended to set the OPENAI_API_KEY environment variable instead. If both the config property and the environment variable are set, the config property takes precedence.
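For example, in a local shell session (the key shown is a placeholder, not a real credential):

```shell
# Keep the real key out of config files; set it in the environment instead.
# The value below is a placeholder for illustration only.
export OPENAI_API_KEY="sk-placeholder-key"
```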

Using the AzureOpenAI client

Here is a minimal working example of a TRAC model using the AzureOpenAI client. This assumes the required resources and deployments have been set up in Azure.

import tracdap.rt.api as trac
import openai

class TestModel(trac.TracModel):

    # ... define parameters, inputs and outputs

    def define_resources(self):

        return {
            "openai_azure": trac.define_external_system("openai", openai.AzureOpenAI)
        }

    def run_model(self, ctx: trac.TracContext):

        with ctx.get_external_system("openai_azure", openai.AzureOpenAI) as client:

            completion = client.chat.completions.create(
                model="gpt-4.1-mini",
                messages=[
                    { "role": "system", "content": "You are a coding assistant that talks like a pirate."},
                    { "role": "user", "content": "How do I check if a Python object is an instance of a class?" },
                ]
            )

            ctx.log.info(completion.choices[0].message.content)

if __name__ == '__main__':
    import tracdap.rt.launch as launch
    launch.launch_model(TestModel, "config/job_config.yaml", "config/sys_config.yaml")

To make this example work, you will need to add openai_azure as a resource in the system config file:

resources:

  openai_azure:
    resourceType: EXTERNAL_SYSTEM
    protocol: openai
    subProtocol: azure
    properties:
      api_version: 2025-04-01-preview
      azure_endpoint: https://my-azure-endpoint.cognitiveservices.azure.com/

Setting subProtocol: azure is required to create an Azure client. The api_version and azure_endpoint properties must be specified, and the model parameter in the client call must refer to a live model deployment on that endpoint.

All the configuration properties supported by the regular client are also supported by the Azure client. Additionally, the Azure client supports these extra properties:

  • api_version, string, required
  • azure_endpoint, string, required
  • azure_deployment, string, optional
  • azure_ad_token, string, optional

For the Azure client, if api_key is not specified in the config file it is read from the environment variable AZURE_OPENAI_API_KEY. Similarly, azure_ad_token can be read from the environment variable AZURE_OPENAI_AD_TOKEN. If both the config property and the environment variable are set, the config property takes precedence.
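Putting this together, a fuller Azure resource entry might look like the following sketch (the endpoint and deployment names are placeholders):

```yaml
resources:

  openai_azure:
    resourceType: EXTERNAL_SYSTEM
    protocol: openai
    subProtocol: azure
    properties:
      api_version: 2025-04-01-preview
      azure_endpoint: https://my-azure-endpoint.cognitiveservices.azure.com/
      azure_deployment: my-gpt-deployment   # optional, placeholder name
      max_retries: 3                        # placeholder value
```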
