
OpenAI provider for Metorial

Project description

metorial-openai

OpenAI provider integration for Metorial, enabling Metorial tools to be used with OpenAI's language models through function calling.

Installation

pip install metorial-openai
# or
uv add metorial-openai
# or
poetry add metorial-openai

Features

  • 🤖 OpenAI Integration: Full support for GPT-4o, GPT-4, GPT-3.5 Turbo, and other OpenAI models
  • 🛠️ Function Calling: Native OpenAI function calling support
  • 📡 Session Management: Automatic tool lifecycle handling
  • 🔄 Format Conversion: Converts Metorial tools to OpenAI function format
  • ✅ Strict Mode: Optional strict parameter validation
  • ⚡ Async Support: Full async/await support

Usage

Basic Usage

import asyncio
from openai import OpenAI
from metorial import Metorial
from metorial_openai import MetorialOpenAISession

async def main():
    # Initialize clients
    metorial = Metorial(api_key="your-metorial-api-key")
    openai_client = OpenAI(api_key="your-openai-api-key")
    
    # Create session with your server deployments
    async with metorial.session(["your-server-deployment-id"]) as session:
        # Create OpenAI-specific wrapper
        openai_session = MetorialOpenAISession(session.tool_manager)
        
        messages = [
            {"role": "user", "content": "What are the latest commits?"}
        ]
        
        response = openai_client.chat.completions.create(
            model="gpt-4",
            messages=messages,
            tools=openai_session.tools
        )
        
        # Handle tool calls
        tool_calls = response.choices[0].message.tool_calls
        if tool_calls:
            tool_responses = await openai_session.call_tools(tool_calls)
            
            # Append the assistant message (which carries the tool calls),
            # then the tool results, so the model sees both on the next turn
            messages.append(response.choices[0].message)
            messages.extend(tool_responses)
            
            # Continue conversation...

asyncio.run(main())

Using Convenience Functions

from metorial_openai import build_openai_tools, call_openai_tools

async def example_with_functions(tool_manager, tool_calls):
    # tool_manager comes from an active Metorial session
    # (session.tool_manager); tool_calls come from an OpenAI response
    
    # Get tools in OpenAI format
    tools = build_openai_tools(tool_manager)
    
    # Call tools from the OpenAI response
    tool_messages = await call_openai_tools(tool_manager, tool_calls)

API Reference

MetorialOpenAISession

Main session class for OpenAI integration.

session = MetorialOpenAISession(tool_manager)

Properties:

  • tools: List of tools in OpenAI function calling format

Methods:

  • async call_tools(tool_calls): Execute tool calls and return tool messages

build_openai_tools(tool_mgr)

Build OpenAI-compatible tool definitions.

Returns: List of tool definitions in OpenAI format

call_openai_tools(tool_mgr, tool_calls)

Execute tool calls from OpenAI response.

Returns: List of tool messages
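The returned tool messages follow OpenAI's standard tool-message shape: role "tool", a tool_call_id echoing the call it answers, and string content. As a sketch, a single result could be assembled like this (the exact fields Metorial emits are an assumption based on that format; make_tool_message is illustrative, not part of the package):

```python
import json

def make_tool_message(tool_call_id: str, result: object) -> dict:
    # OpenAI expects one message per tool call, carrying the id of the
    # call it answers and the result serialized as a string.
    return {
        "role": "tool",
        "tool_call_id": tool_call_id,
        "content": json.dumps(result),
    }

message = make_tool_message("call_abc123", {"commits": ["fix: typo"]})
```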

Tool Format

Tools are converted to OpenAI's function calling format:

{
    "type": "function",
    "function": {
        "name": "tool_name",
        "description": "Tool description",
        "parameters": {
            "type": "object",
            "properties": {...},
            "required": [...]
        }
    }
}
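For instance, a hypothetical search_commits tool would come out as the following (the name and parameters here are purely illustrative; real definitions are produced by build_openai_tools):

```python
# Illustrative only: a hypothetical "search_commits" tool rendered in
# OpenAI's function-calling format.
tool = {
    "type": "function",
    "function": {
        "name": "search_commits",
        "description": "Search recent commits in a repository",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string", "description": "Search text"},
                "limit": {"type": "integer", "description": "Max results"},
            },
            "required": ["query"],
        },
    },
}
```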

Supported Models

All OpenAI models that support function calling:

  • gpt-4o: Latest GPT-4 Omni model
  • gpt-4o-mini: Smaller, faster GPT-4 Omni model
  • gpt-4-turbo: GPT-4 Turbo
  • gpt-4: Standard GPT-4
  • gpt-3.5-turbo: GPT-3.5 Turbo
  • Any other model with function calling enabled

Error Handling

try:
    tool_messages = await openai_session.call_tools(tool_calls)
except Exception as e:
    print(f"Tool execution failed: {e}")

Tool errors are returned as tool messages with error content.
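Because errors come back as ordinary tool messages, the returned list can be scanned for failures before continuing the conversation. A minimal sketch, assuming each message carries its payload as a string in content and that failures mention "error" in that payload (both assumptions, since the exact error wording is not specified here):

```python
def failed_tool_messages(tool_messages: list[dict]) -> list[dict]:
    # Collect tool messages whose content looks like an error report,
    # so the caller can decide whether to retry or surface the failure.
    return [
        m for m in tool_messages
        if m.get("role") == "tool"
        and "error" in str(m.get("content", "")).lower()
    ]

messages = [
    {"role": "tool", "tool_call_id": "call_1", "content": '{"ok": true}'},
    {"role": "tool", "tool_call_id": "call_2", "content": "Error: timeout"},
]
failures = failed_tool_messages(messages)
```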

Dependencies

  • openai>=1.0.0
  • metorial-mcp-session>=1.0.0
  • typing-extensions>=4.0.0

License

MIT License - see LICENSE file for details.

Project details


Download files


Source Distribution

metorial_openai-1.0.0rc1.tar.gz (5.8 kB)

Built Distribution


metorial_openai-1.0.0rc1-py3-none-any.whl (4.7 kB)

File details

Details for the file metorial_openai-1.0.0rc1.tar.gz.

File metadata

  • File: metorial_openai-1.0.0rc1.tar.gz
  • Size: 5.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for metorial_openai-1.0.0rc1.tar.gz
  • SHA256: dab8e6362114f38c312ee245c0019c3ad483ee67aec4ffec3d489d72ddcb6aa3
  • MD5: 8400ad66d2ee9c7a2ca8abbb2c171931
  • BLAKE2b-256: f32dd91d9ebd0cf4b1a53e765522b6999876897256c8ee1f5bb45ccc8c366c2e


Provenance

The following attestation bundles were made for metorial_openai-1.0.0rc1.tar.gz:

Publisher: release.yml on metorial/metorial-python

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file metorial_openai-1.0.0rc1-py3-none-any.whl.

File hashes

Hashes for metorial_openai-1.0.0rc1-py3-none-any.whl
  • SHA256: 72d0d02352360fef9970b3b94570f9c9225669c7097d1daf3c7f111a7dd1c70b
  • MD5: e44ef341b65ff2797deeffbf1d5d382d
  • BLAKE2b-256: 7a179c86a22ed0f3b51ea1e7a247977fc2512e2146ae31d6b7eff10f03f821fb


Provenance

The following attestation bundles were made for metorial_openai-1.0.0rc1-py3-none-any.whl:

Publisher: release.yml on metorial/metorial-python

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
