metorial-openai

OpenAI provider integration for Metorial.

Installation

pip install metorial openai

Quick Start

import asyncio
from metorial import Metorial, MetorialOpenAI
from openai import AsyncOpenAI

metorial = Metorial(api_key="your-metorial-api-key")
openai = AsyncOpenAI(api_key="your-openai-api-key")

async def main():
    async def session_handler(session):
        messages = [{"role": "user", "content": "What's the latest news?"}]

        # Agent loop: let the model call tools until it produces a final answer
        for _ in range(10):
            response = await openai.chat.completions.create(
                model="gpt-4o",
                messages=messages,
                tools=session["tools"]
            )

            choice = response.choices[0]
            tool_calls = choice.message.tool_calls

            # No tool calls means the model answered directly; we're done
            if not tool_calls:
                print(choice.message.content)
                break

            # Execute the requested tools, then record the assistant turn
            # followed by the tool results so the next request has full context
            tool_responses = await session["callTools"](tool_calls)
            messages.append({"role": "assistant", "tool_calls": tool_calls})
            messages.extend(tool_responses)

        await session["closeSession"]()

    await metorial.with_provider_session(
        MetorialOpenAI.chat_completions,
        {"serverDeployments": [{"serverDeploymentId": "your-server-deployment-id"}]},
        session_handler
    )

asyncio.run(main())

Streaming

import asyncio
from metorial import Metorial, MetorialOpenAI
from openai import AsyncOpenAI

metorial = Metorial(api_key="your-metorial-api-key")
openai = AsyncOpenAI(api_key="your-openai-api-key")

async def main():
    async def session_handler(session):
        messages = [{"role": "user", "content": "What's the latest news?"}]

        stream = await openai.chat.completions.create(
            model="gpt-4o",
            messages=messages,
            tools=session["tools"],
            stream=True
        )

        async for chunk in stream:
            # Print text deltas as they arrive; tool-call deltas
            # (chunk.choices[0].delta.tool_calls) are not handled in this example
            if chunk.choices[0].delta.content:
                print(chunk.choices[0].delta.content, end="", flush=True)

        await session["closeSession"]()

    await metorial.with_provider_session(
        MetorialOpenAI.chat_completions,
        {
            "serverDeployments": [{"serverDeploymentId": "your-server-deployment-id"}],
            "streaming": True,  # Required for streaming with tool calls
        },
        session_handler
    )

asyncio.run(main())
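The streaming example above only prints text deltas. When the model decides to call a tool, the streamed chunks instead carry incremental fragments in delta.tool_calls, which must be accumulated by index before the calls can be executed. A minimal, dependency-free sketch of that accumulation, using plain dicts in the shape the OpenAI API streams them (the fragment data below is fabricated for illustration):

```python
def accumulate_tool_calls(fragments):
    """Merge streamed tool-call fragments into complete calls, keyed by index."""
    calls = {}
    for fragment in fragments:
        call = calls.setdefault(fragment["index"], {"id": "", "name": "", "arguments": ""})
        if fragment.get("id"):
            call["id"] = fragment["id"]
        fn = fragment.get("function", {})
        if fn.get("name"):
            call["name"] += fn["name"]
        if fn.get("arguments"):
            # Argument JSON arrives in pieces and must be concatenated
            call["arguments"] += fn["arguments"]
    return [calls[i] for i in sorted(calls)]

# Fabricated fragments as they might arrive across successive chunks:
fragments = [
    {"index": 0, "id": "call_1", "function": {"name": "get_news"}},
    {"index": 0, "function": {"arguments": '{"topic": '}},
    {"index": 0, "function": {"arguments": '"ai"}'}},
]
print(accumulate_tool_calls(fragments))
# → [{'id': 'call_1', 'name': 'get_news', 'arguments': '{"topic": "ai"}'}]
```

Once the stream finishes, the completed calls can be handed to session["callTools"] just as in the non-streaming loop.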

Supported Models

All OpenAI models that support function calling:

  • gpt-4o: Latest GPT-4o
  • gpt-4.1: GPT-4.1
  • o1: OpenAI o1
  • o3: OpenAI o3
  • gpt-4-turbo: GPT-4 Turbo
  • gpt-3.5-turbo: GPT-3.5 Turbo

Session Object

async def session_handler(session):
    tools = session["tools"]           # Tool definitions in OpenAI format
    call_tools = session["callTools"]  # Execute tools and get responses
    close_session = session["closeSession"]  # Close the session when done

Error Handling

from metorial import MetorialAPIError

try:
    await metorial.with_provider_session(...)
except MetorialAPIError as e:
    print(f"API Error: {e.message} (Status: {e.status})")
except Exception as e:
    print(f"Unexpected error: {e}")

License

MIT License - see LICENSE file for details.
