
OpenAI-compatible provider base for Metorial


metorial-openai-compatible

OpenAI-compatible provider integration for Metorial. Works with any API that follows the OpenAI chat completions format.

Installation

pip install metorial openai

Quick Start

import asyncio
from metorial import Metorial, MetorialOpenAICompatible
from openai import AsyncOpenAI

metorial = Metorial(api_key="your-metorial-api-key")

# Works with any OpenAI-compatible API
client = AsyncOpenAI(
    api_key="your-api-key",
    base_url="https://your-api-endpoint.com/v1"
)

async def main():
    async def session_handler(session):
        messages = [{"role": "user", "content": "What's the latest news?"}]

        for _ in range(10):
            response = await client.chat.completions.create(
                model="your-model",
                messages=messages,
                tools=session["tools"]
            )

            choice = response.choices[0]
            tool_calls = choice.message.tool_calls

            if not tool_calls:
                print(choice.message.content)
                break

            tool_responses = await session["callTools"](tool_calls)
            messages.append({"role": "assistant", "tool_calls": tool_calls})
            messages.extend(tool_responses)

        await session["closeSession"]()

    await metorial.with_provider_session(
        MetorialOpenAICompatible.chat_completions,
        {"serverDeployments": [{"serverDeploymentId": "your-server-deployment-id"}]},
        session_handler
    )

asyncio.run(main())
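Some OpenAI-compatible servers reject the SDK's tool-call objects when they are echoed back inside the assistant message, so it can be safer to convert them to plain dicts first. A minimal sketch (the helper name is ours; it assumes the SDK objects expose pydantic's `model_dump()`, as the official `openai` package's do):

```python
def tool_calls_to_dicts(tool_calls):
    """Normalize SDK tool-call objects (or already-plain dicts) to JSON-safe dicts."""
    return [tc if isinstance(tc, dict) else tc.model_dump() for tc in tool_calls]
```

With this, the append in the loop above becomes `messages.append({"role": "assistant", "tool_calls": tool_calls_to_dicts(tool_calls)})`.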

Streaming

import asyncio
from metorial import Metorial, MetorialOpenAICompatible
from openai import AsyncOpenAI

metorial = Metorial(api_key="your-metorial-api-key")
client = AsyncOpenAI(
    api_key="your-api-key",
    base_url="https://your-api-endpoint.com/v1"
)

async def main():
    async def session_handler(session):
        messages = [{"role": "user", "content": "What's the latest news?"}]

        stream = await client.chat.completions.create(
            model="your-model",
            messages=messages,
            tools=session["tools"],
            stream=True
        )

        async for chunk in stream:
            if chunk.choices[0].delta.content:
                print(chunk.choices[0].delta.content, end="", flush=True)

        await session["closeSession"]()

    await metorial.with_provider_session(
        MetorialOpenAICompatible.chat_completions,
        {
            "serverDeployments": [{"serverDeploymentId": "your-server-deployment-id"}],
            "streaming": True,  # Required for streaming with tool calls
        },
        session_handler
    )

asyncio.run(main())
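The example above only prints text deltas. When the model streams a tool call instead, the fragments arrive piecemeal on `chunk.choices[0].delta.tool_calls` (first chunk carries the id and function name, later chunks append argument text) and must be stitched together before being handed to `session["callTools"]`. A sketch of the accumulation step (the delta shape follows the OpenAI streaming format; the helper name is ours):

```python
def merge_tool_call_deltas(pending, delta_tool_calls):
    """Accumulate streamed tool-call fragments into `pending`, keyed by index.

    Each delta may carry any subset of: an index (always), an id, a function
    name, and a slice of the JSON arguments string to append.
    """
    for d in delta_tool_calls:
        slot = pending.setdefault(d.index, {"id": "", "name": "", "arguments": ""})
        if getattr(d, "id", None):
            slot["id"] = d.id
        fn = getattr(d, "function", None)
        if fn is not None:
            if getattr(fn, "name", None):
                slot["name"] = fn.name
            if getattr(fn, "arguments", None):
                slot["arguments"] += fn.arguments
    return pending
```

Call it inside the `async for chunk in stream:` loop whenever `chunk.choices[0].delta.tool_calls` is set, then execute the completed calls once the stream ends.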

Compatible Providers

Any provider with an OpenAI-compatible API:

  • DeepSeek (https://api.deepseek.com)
  • Together AI (https://api.together.xyz/v1)
  • xAI (https://api.x.ai/v1)
  • Groq (https://api.groq.com/openai/v1)
  • And many more...
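Switching providers only changes the `base_url` passed to the client. A small helper built from the list above (the mapping keys and helper name are ours, not part of the library):

```python
# Base URLs copied from the provider list above; the helper only assembles
# keyword arguments, so its output can feed AsyncOpenAI or any other
# OpenAI-compatible client constructor.
PROVIDER_BASE_URLS = {
    "deepseek": "https://api.deepseek.com",
    "together": "https://api.together.xyz/v1",
    "xai": "https://api.x.ai/v1",
    "groq": "https://api.groq.com/openai/v1",
}

def client_kwargs(provider: str, api_key: str) -> dict:
    """Build constructor kwargs for an OpenAI-compatible client."""
    if provider not in PROVIDER_BASE_URLS:
        raise ValueError(f"unknown provider: {provider!r}")
    return {"api_key": api_key, "base_url": PROVIDER_BASE_URLS[provider]}
```

Usage: `client = AsyncOpenAI(**client_kwargs("groq", "your-api-key"))`.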

Session Object

async def session_handler(session):
    tools = session["tools"]           # Tool definitions in OpenAI format
    call_tools = session["callTools"]  # Execute tools and get responses
    close_session = session["closeSession"]  # Close the session when done
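Because the session is a plain dict with just these three keys, a handler can be exercised without the network by passing a stub. A toy sketch (the stub's tool definition and handler are ours, purely for illustration):

```python
import asyncio

async def count_tools(session):
    """Toy handler: report how many tools the deployment exposes, then close."""
    n = len(session["tools"])
    await session["closeSession"]()
    return n

async def _noop():
    return None

# Stub standing in for the session dict Metorial passes to the handler.
stub_session = {
    "tools": [{"type": "function", "function": {"name": "search"}}],
    "callTools": lambda calls: _noop(),
    "closeSession": lambda: _noop(),
}
```

Running `asyncio.run(count_tools(stub_session))` returns `1` for this stub.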

Error Handling

from metorial import MetorialAPIError

try:
    await metorial.with_provider_session(...)
except MetorialAPIError as e:
    print(f"API Error: {e.message} (Status: {e.status})")
except Exception as e:
    print(f"Unexpected error: {e}")
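Transient failures such as rate limits or timeouts can be retried around the whole session call. A generic backoff sketch (the retry policy, delays, and helper name are illustrative, not part of metorial):

```python
import asyncio

async def with_retries(make_call, retryable, attempts=3, base_delay=0.5):
    """Retry an async call with exponential backoff on retryable errors.

    make_call: zero-arg callable returning an awaitable (e.g. a lambda
    wrapping the real call). retryable: exception type(s) worth retrying.
    """
    for attempt in range(attempts):
        try:
            return await make_call()
        except retryable:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the last error
            await asyncio.sleep(base_delay * 2 ** attempt)
```

For example: `await with_retries(lambda: metorial.with_provider_session(...), (MetorialAPIError,))`.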

License

MIT License - see LICENSE file for details.


