OpenAI provider for Metorial

metorial-openai

OpenAI provider integration for Metorial.

Installation

pip install metorial-openai
# or
uv add metorial-openai
# or
poetry add metorial-openai

Features

  • 🤖 OpenAI Integration: Full support for GPT-4, GPT-3.5, and other OpenAI models
  • 📡 Session Management: Automatic tool lifecycle handling
  • 🔄 Format Conversion: Converts Metorial tools to OpenAI function format
  • ⚡ Async Support: Full async/await support

Supported Models

All OpenAI models that support function calling:

  • gpt-4o: Latest GPT-4 Omni model
  • gpt-4o-mini: Smaller, faster GPT-4 Omni model
  • gpt-4-turbo: GPT-4 Turbo
  • gpt-4: Standard GPT-4
  • gpt-3.5-turbo: GPT-3.5 Turbo
  • And any other model with function calling enabled

Usage

Quick Start (Recommended)

import asyncio
from openai import OpenAI
from metorial import Metorial

async def main():
  # Initialize clients
  metorial = Metorial(api_key="...your-metorial-api-key...") # async by default
  openai_client = OpenAI(api_key="...your-openai-api-key...")
  
  # One-liner chat with automatic session management
  response = await metorial.run(
    "What are the latest commits in the metorial/websocket-explorer repository?",
    "...your-mcp-server-deployment-id...", # can also be list
    openai_client,
    model="gpt-4o",
    max_iterations=25
  )
  
  print("Response:", response)

asyncio.run(main())

Streaming Chat

import asyncio
from openai import OpenAI
from metorial import Metorial
from metorial.types import StreamEventType

async def streaming_example():
  # Initialize clients
  metorial = Metorial(api_key="...your-metorial-api-key...")
  openai_client = OpenAI(api_key="...your-openai-api-key...")
  
  # Streaming chat with real-time responses
  async def stream_action(session):
    messages = [
      {"role": "user", "content": "Explain quantum computing"}
    ]
    
    async for event in metorial.stream(
      openai_client, session, messages, 
      model="gpt-4o",
      max_iterations=25
    ):
      if event.type == StreamEventType.CONTENT:
        print(f"🤖 {event.content}", end="", flush=True)
      elif event.type == StreamEventType.TOOL_CALL:
        print(f"\n🔧 Executing {len(event.tool_calls)} tool(s)...")
      elif event.type == StreamEventType.COMPLETE:
        print(f"\n✅ Complete!")
  
  await metorial.with_session("...your-server-deployment-id...", stream_action)

asyncio.run(streaming_example())

Advanced Usage with Session Management

import asyncio
from openai import OpenAI
from metorial import Metorial
from metorial_openai import MetorialOpenAISession

async def main():
  # Initialize clients
  metorial = Metorial(api_key="...your-metorial-api-key...")
  openai_client = OpenAI(api_key="...your-openai-api-key...")
  
  # Create session with your server deployments
  async with metorial.session(["...your-server-deployment-id..."]) as session:
    # Create OpenAI-specific wrapper
    openai_session = MetorialOpenAISession(session.tool_manager)
    
    messages = [
      {"role": "user", "content": "What are the latest commits?"}
    ]
    
    response = openai_client.chat.completions.create(
      model="gpt-4o",
      messages=messages,
      tools=openai_session.tools
    )
    
    # Handle tool calls
    tool_calls = response.choices[0].message.tool_calls
    if tool_calls:
      tool_responses = await openai_session.call_tools(tool_calls)
      
      # Add assistant message and tool responses
      messages.append(response.choices[0].message)
      messages.extend(tool_responses)
      
      # Continue conversation...

asyncio.run(main())
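The `# Continue conversation...` step above typically sends the updated message list back to the model so it can use the tool results. A minimal sketch of that follow-up call, reusing the client and session objects from the example (the helper name `next_completion` is hypothetical, not part of metorial_openai):

```python
# Hypothetical helper: after the assistant's tool_calls message and the
# tool responses have been appended to `messages`, ask for the next turn.
def next_completion(openai_client, openai_session, messages, model="gpt-4o"):
  # The model now sees the original question, its own tool calls, and the
  # tool results, so it can answer or issue further tool calls.
  return openai_client.chat.completions.create(
    model=model,
    messages=messages,
    tools=openai_session.tools,
  )
```

Repeating this until the response contains no `tool_calls` (or an iteration cap is hit) is the loop that `metorial.run` automates in the Quick Start example.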

Using Convenience Functions

from metorial_openai import build_openai_tools, call_openai_tools

async def example_with_functions(tool_manager, tool_calls):
  # tool_manager comes from an active Metorial session
  # (e.g. session.tool_manager); tool_calls from an OpenAI response.

  # Get tools in OpenAI format
  tools = build_openai_tools(tool_manager)

  # Call tools from OpenAI response
  tool_messages = await call_openai_tools(tool_manager, tool_calls)

API Reference

MetorialOpenAISession

Main session class for OpenAI integration.

session = MetorialOpenAISession(tool_manager)

Properties:

  • tools: List of tools in OpenAI format

Methods:

  • async call_tools(tool_calls): Execute tool calls and return tool messages

build_openai_tools(tool_manager)

Build OpenAI-compatible tool definitions.

Returns: List of tool definitions in OpenAI format

async call_openai_tools(tool_manager, tool_calls)

Execute tool calls from an OpenAI response.

Returns: List of tool messages

Tool Format

Tools are converted to OpenAI's function calling format:

{
  "type": "function",
  "function": {
    "name": "tool_name",
    "description": "Tool description",
    "parameters": {
      "type": "object",
      "properties": {...},
      "required": [...]
    }
  }
}

Error Handling

try:
  tool_messages = await openai_session.call_tools(tool_calls)
except Exception as e:
  print(f"Tool execution failed: {e}")

Tool errors are returned as tool messages with error content.
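For illustration, an error surfaced this way might look like the following. The helper and its exact wording are a sketch of the OpenAI tool-message shape, not metorial_openai internals:

```python
# Sketch only: the OpenAI-style tool message a failed execution can be
# folded into. The helper name error_tool_message is hypothetical.
def error_tool_message(tool_call_id: str, error: Exception) -> dict:
  return {
    "role": "tool",                # tool results go back with role "tool"
    "tool_call_id": tool_call_id,  # must match the model's tool call id
    "content": f"Error: {error}",  # error text in place of a result payload
  }

msg = error_tool_message("call_abc123", TimeoutError("upstream timed out"))
```

Because the error travels back as an ordinary tool message, the model can see it and decide how to recover (retry, rephrase, or report the failure).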

License

MIT License - see LICENSE file for details.

Download files

Download the file for your platform.

Source Distribution

metorial_openai-1.0.0rc5.tar.gz (6.1 kB)

Uploaded Source

Built Distribution

metorial_openai-1.0.0rc5-py3-none-any.whl (5.0 kB)

Uploaded Python 3

File details

Details for the file metorial_openai-1.0.0rc5.tar.gz.

File metadata

  • File name: metorial_openai-1.0.0rc5.tar.gz
  • Size: 6.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for metorial_openai-1.0.0rc5.tar.gz:

  • SHA256: 690982a4653de6de5e3358778dbf33ba28e975d45549428af39fc9a3b2505db6
  • MD5: 95441a0259523c645088284fd3e18ad3
  • BLAKE2b-256: 927cdf16833c8ac35d6150b9893e3b02637151ab0275a6eb768ac42c3c79555e


Provenance

The following attestation bundles were made for metorial_openai-1.0.0rc5.tar.gz:

Publisher: release.yml on metorial/metorial-python

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file metorial_openai-1.0.0rc5-py3-none-any.whl.

File hashes

Hashes for metorial_openai-1.0.0rc5-py3-none-any.whl:

  • SHA256: 555a483d1b1d6916884e0d60fcd2ce312bbd3cc6ac24d967a8ab9df253ee7ffa
  • MD5: d8f60ab73bfeb394207edcec3307d9d2
  • BLAKE2b-256: fbaca9db7ac42844ff9c87437f55c4d233a992c4162a49999372cc58d11145d5


Provenance

The following attestation bundles were made for metorial_openai-1.0.0rc5-py3-none-any.whl:

Publisher: release.yml on metorial/metorial-python

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
