
An extension for TRAC D.A.P. that lets models connect to OpenAI endpoints


tracdap


OpenAI Extension for the TRAC Model Runtime

This extension makes the OpenAI Python SDK available to use from inside a TRAC model.

  • Use the native OpenAI client classes directly in TRAC model code
  • Connection settings managed by TRAC for both local and deployed models
  • Supports both OpenAI and AzureOpenAI clients

Models that make external calls are not considered repeatable, and will be flagged as not repeatable when they run on the TRAC platform.

This extension is a pre-release and will be finalized in TRAC 0.10.

Installing

The OpenAI extension can be installed with pip:

$ pip install tracdap-ext-openai

The package has the following dependencies:

  • tracdap-runtime (version 0.10.0-beta1 or later)
  • openai (version 1.x)

Using the OpenAI client

Here is a minimal working example of a TRAC model using the OpenAI client:

import tracdap.rt.api as trac
import openai

class OpenAIModel(trac.TracModel):

    # ... define parameters, inputs and outputs

    def define_resources(self):

        return {
            "openai": trac.define_external_system("openai", openai.OpenAI),
        }

    def run_model(self, ctx: trac.TracContext):

        with ctx.get_external_system("openai", openai.OpenAI) as client:

            response = client.responses.create(
                model="gpt-4o",
                instructions="You are a coding assistant that talks like a pirate.",
                input="How do I check if a Python object is an instance of a class?",
            )

            ctx.log.info(response.output_text)

if __name__ == '__main__':
    import tracdap.rt.launch as launch
    launch.launch_model(OpenAIModel, "config/job_config.yaml", "config/sys_config.yaml")

To make this example work, you will need to add openai as a resource in the system config file:

resources:

  openai:
    resourceType: EXTERNAL_SYSTEM
    protocol: openai

The client can be customized by setting additional properties on the resource, which are passed through to the OpenAI client:

resources:

  openai:
    resourceType: EXTERNAL_SYSTEM
    protocol: openai
    properties:
      project: proj_xxxxxxxxxxxxx

The following configuration properties are supported:

  • api_key, string, required
  • organization, string, optional
  • project, string, optional
  • base_url, string, default = https://api.openai.com/v1/
  • timeout, float, default = openai.DEFAULT_TIMEOUT.read (currently 600 seconds)
  • max_retries, int, default = openai.DEFAULT_MAX_RETRIES (currently 2)
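
For example, a resource definition that overrides several of the optional properties might look like this (all values are placeholders):

```yaml
resources:

  openai:
    resourceType: EXTERNAL_SYSTEM
    protocol: openai
    properties:
      organization: org_xxxxxxxxxxxxx
      project: proj_xxxxxxxxxxxxx
      timeout: 120.0
      max_retries: 3
```

Note that api_key is deliberately omitted here; see the note below on supplying it via the environment.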

The api_key should not be put into a config file in plain text; for local development it is recommended to set the OPENAI_API_KEY environment variable instead. If both the config property and the environment variable are set, the config property takes precedence.
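
As a minimal sketch, the key can be injected into the process environment before the model is launched (the key value below is a placeholder):

```python
import os

# Supply the API key via the environment rather than the config file.
# The OpenAI client picks up OPENAI_API_KEY when no api_key property
# is set in the TRAC resource configuration.
os.environ.setdefault("OPENAI_API_KEY", "sk-placeholder")

# Confirm the variable is visible to the process
assert "OPENAI_API_KEY" in os.environ
```

In practice the variable would normally be exported in the shell or set by the deployment environment, rather than in model code.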

Using the AzureOpenAI client

Here is a minimal working example of a TRAC model using the AzureOpenAI client. This assumes the required resources and deployments have been set up in Azure.

import tracdap.rt.api as trac
import openai

class TestModel(trac.TracModel):

    # ... define parameters, inputs and outputs

    def define_resources(self):

        return {
            "openai_azure": trac.define_external_system("openai", openai.AzureOpenAI)
        }

    def run_model(self, ctx: trac.TracContext):

        with ctx.get_external_system("openai_azure", openai.AzureOpenAI) as client:

            completion = client.chat.completions.create(
                model="gpt-4.1-mini",
                messages=[
                    { "role": "system", "content": "You are a coding assistant that talks like a pirate."},
                    { "role": "user", "content": "How do I check if a Python object is an instance of a class?" },
                ]
            )

            ctx.log.info(completion.choices[0].message.content)

if __name__ == '__main__':
    import tracdap.rt.launch as launch
    launch.launch_model(TestModel, "config/job_config.yaml", "config/sys_config.yaml")

To make this example work, you will need to add openai_azure as a resource in the system config file:

resources:

  openai_azure:
    resourceType: EXTERNAL_SYSTEM
    protocol: openai
    subProtocol: azure
    properties:
      api_version: 2025-04-01-preview
      azure_endpoint: https://my-azure-endpoint.cognitiveservices.azure.com/

Setting subProtocol: azure is required to create an Azure client. The api_version and azure_endpoint properties must be specified, and the model parameter in the client call must refer to a live model deployment on that endpoint.

All the configuration properties supported by the regular client are also supported by the Azure client. Additionally, the Azure client supports these extra properties:

  • api_version, string, required
  • azure_endpoint, string, required
  • azure_deployment, string, optional
  • azure_ad_token, string, optional
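
Putting these together, an Azure resource definition using the optional azure_deployment property might look like this (the endpoint and deployment names are placeholders):

```yaml
resources:

  openai_azure:
    resourceType: EXTERNAL_SYSTEM
    protocol: openai
    subProtocol: azure
    properties:
      api_version: 2025-04-01-preview
      azure_endpoint: https://my-azure-endpoint.cognitiveservices.azure.com/
      azure_deployment: my-model-deployment
```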

For the Azure client, if api_key is not specified in the config file it is read from the environment variable AZURE_OPENAI_API_KEY. Similarly, azure_ad_token can be read from the environment variable AZURE_OPENAI_AD_TOKEN. If both the config property and the environment variable are set, the config property takes precedence.
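
As with the regular client, these can be supplied from the environment for local development; a minimal sketch using placeholder values:

```python
import os

# Provide Azure credentials via the environment instead of the config file.
# AZURE_OPENAI_API_KEY is used when api_key is absent from the config,
# and AZURE_OPENAI_AD_TOKEN when azure_ad_token is absent.
os.environ.setdefault("AZURE_OPENAI_API_KEY", "azure-key-placeholder")
os.environ.setdefault("AZURE_OPENAI_AD_TOKEN", "azure-token-placeholder")

# Confirm both variables are visible to the process
for var in ("AZURE_OPENAI_API_KEY", "AZURE_OPENAI_AD_TOKEN"):
    assert var in os.environ
```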

