# metorial-openai

OpenAI provider integration for Metorial.
## Installation

```bash
pip install metorial-openai
# or
uv add metorial-openai
# or
poetry add metorial-openai
```
## Features
- 🤖 OpenAI Integration: Full support for GPT-4, GPT-3.5, and other OpenAI models
- 📡 Session Management: Automatic tool lifecycle handling
- 🔄 Format Conversion: Converts Metorial tools to OpenAI function format
- ⚡ Async Support: Full async/await support
## Supported Models

All OpenAI models that support function calling:

- `gpt-4o`: Latest GPT-4 Omni model
- `gpt-4o-mini`: Smaller, faster GPT-4 Omni model
- `gpt-4-turbo`: GPT-4 Turbo
- `gpt-4`: Standard GPT-4
- `gpt-3.5-turbo`: GPT-3.5 Turbo
- Other models with function calling enabled
## Usage

### Quick Start (Recommended)
```python
import asyncio

from openai import OpenAI
from metorial import Metorial


async def main():
    # Initialize clients
    metorial = Metorial(api_key="...your-metorial-api-key...")  # async by default
    openai_client = OpenAI(api_key="...your-openai-api-key...")

    # One-liner chat with automatic session management
    response = await metorial.run(
        "What are the latest commits in the metorial/websocket-explorer repository?",
        "...your-mcp-server-deployment-id...",  # can also be a list
        openai_client,
        model="gpt-4o",
        max_iterations=25,
    )

    print("Response:", response)


asyncio.run(main())
```
### Streaming Chat
```python
import asyncio

from openai import OpenAI
from metorial import Metorial
from metorial.types import StreamEventType


async def streaming_example():
    # Initialize clients
    metorial = Metorial(api_key="...your-metorial-api-key...")
    openai_client = OpenAI(api_key="...your-openai-api-key...")

    # Streaming chat with real-time responses
    async def stream_action(session):
        messages = [
            {"role": "user", "content": "Explain quantum computing"}
        ]

        async for event in metorial.stream(
            openai_client, session, messages,
            model="gpt-4o",
            max_iterations=25,
        ):
            if event.type == StreamEventType.CONTENT:
                print(f"🤖 {event.content}", end="", flush=True)
            elif event.type == StreamEventType.TOOL_CALL:
                print(f"\n🔧 Executing {len(event.tool_calls)} tool(s)...")
            elif event.type == StreamEventType.COMPLETE:
                print("\n✅ Complete!")

    await metorial.with_session("...your-server-deployment-id...", stream_action)


asyncio.run(streaming_example())
```
### Advanced Usage with Session Management
```python
import asyncio

from openai import OpenAI
from metorial import Metorial
from metorial_openai import MetorialOpenAISession


async def main():
    # Initialize clients
    metorial = Metorial(api_key="...your-metorial-api-key...")
    openai_client = OpenAI(api_key="...your-openai-api-key...")

    # Create session with your server deployments
    async with metorial.session(["...your-server-deployment-id..."]) as session:
        # Create OpenAI-specific wrapper
        openai_session = MetorialOpenAISession(session.tool_manager)

        messages = [
            {"role": "user", "content": "What are the latest commits?"}
        ]

        response = openai_client.chat.completions.create(
            model="gpt-4o",
            messages=messages,
            tools=openai_session.tools,
        )

        # Handle tool calls
        tool_calls = response.choices[0].message.tool_calls
        if tool_calls:
            tool_responses = await openai_session.call_tools(tool_calls)

            # Add assistant message and tool responses
            messages.append(response.choices[0].message)
            messages.extend(tool_responses)
            # Continue conversation...


asyncio.run(main())
```
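The `# Continue conversation...` step above is typically an agent loop: keep requesting completions until the model stops asking for tools. Below is a minimal, runnable sketch of that loop, where `FakeSession` and `fake_completion` are hypothetical stand-ins for the real `MetorialOpenAISession` and `openai_client.chat.completions.create` (neither stub is part of the actual APIs):

```python
import asyncio


class FakeSession:
    """Stand-in for MetorialOpenAISession (illustrative only)."""

    async def call_tools(self, tool_calls):
        # A real session would execute the MCP tools; here we fabricate results.
        return [
            {"role": "tool", "tool_call_id": call["id"], "content": "commit abc123"}
            for call in tool_calls
        ]


def fake_completion(messages):
    """Stand-in for openai_client.chat.completions.create (illustrative only)."""
    # Once a tool result is in the history, produce a final answer;
    # otherwise request a tool call first.
    if any(m["role"] == "tool" for m in messages):
        return {"content": "Here are the latest commits.", "tool_calls": None}
    return {"content": None, "tool_calls": [{"id": "call_1", "name": "list_commits"}]}


async def chat_loop(session, messages, max_iterations=5):
    for _ in range(max_iterations):
        reply = fake_completion(messages)
        messages.append({"role": "assistant", "content": reply["content"]})
        if not reply["tool_calls"]:  # no tool requests -> final answer
            return reply["content"]
        # Execute requested tools and feed the results back to the model
        messages.extend(await session.call_tools(reply["tool_calls"]))


answer = asyncio.run(
    chat_loop(FakeSession(), [{"role": "user", "content": "Latest commits?"}])
)
print(answer)
```

The `max_iterations` bound mirrors the parameter of the same name in `metorial.run` and guards against a model that keeps requesting tools indefinitely.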
### Using Convenience Functions
```python
from metorial_openai import build_openai_tools, call_openai_tools


async def example_with_functions():
    # Get tools in OpenAI format
    tools = build_openai_tools(tool_manager)

    # Call tools from an OpenAI response
    tool_messages = await call_openai_tools(tool_manager, tool_calls)
```
## API Reference

### `MetorialOpenAISession`

Main session class for OpenAI integration.
```python
session = MetorialOpenAISession(tool_manager)
```

**Properties:**

- `tools`: List of tools in OpenAI format

**Methods:**

- `async call_tools(tool_calls)`: Execute tool calls and return tool messages
### `build_openai_tools(tool_mgr)`

Build OpenAI-compatible tool definitions.

**Returns:** List of tool definitions in OpenAI format
### `call_openai_tools(tool_mgr, tool_calls)`

Execute tool calls from an OpenAI response.

**Returns:** List of tool messages
## Tool Format

Tools are converted to OpenAI's function-calling format:
```python
{
    "type": "function",
    "function": {
        "name": "tool_name",
        "description": "Tool description",
        "parameters": {
            "type": "object",
            "properties": {...},
            "required": [...]
        }
    }
}
```
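As a rough illustration of this mapping, here is a sketch of how a generic tool description could be converted to that shape. The input field names (`name`, `description`, `parameters`) are assumptions for the example, not the actual Metorial tool-manager structure; in practice the conversion is handled for you by `build_openai_tools`:

```python
def to_openai_tool(tool):
    """Convert a generic tool description to OpenAI's function-calling shape.

    `tool` is a plain dict carrying a name, a description, and a JSON Schema
    for its arguments (field names here are illustrative).
    """
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            "parameters": tool.get(
                "parameters", {"type": "object", "properties": {}}
            ),
        },
    }


example = to_openai_tool(
    {
        "name": "list_commits",
        "description": "List recent commits in a repository",
        "parameters": {
            "type": "object",
            "properties": {"repo": {"type": "string"}},
            "required": ["repo"],
        },
    }
)
print(example["function"]["name"])  # list_commits
```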
## Error Handling
```python
try:
    tool_messages = await openai_session.call_tools(tool_calls)
except Exception as e:
    print(f"Tool execution failed: {e}")
```
Tool errors are returned as tool messages with error content.
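As a sketch of what such an error tool message might look like, the snippet below wraps a failure as a `role: "tool"` message with JSON error content. The exact payload shape (an `error` key) is an assumption for illustration, not the library's documented format:

```python
import json


def error_tool_message(tool_call_id, exc):
    """Wrap a tool failure as an OpenAI-style tool message (shape assumed)."""
    return {
        "role": "tool",
        "tool_call_id": tool_call_id,
        "content": json.dumps({"error": str(exc)}),
    }


msg = error_tool_message("call_1", ValueError("repo not found"))
print(msg["content"])
```

Returning errors as tool messages (rather than raising) lets the model see the failure and decide how to recover in the next turn.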
## License
MIT License - see LICENSE file for details.
## File details

### Source distribution: `metorial_openai-1.0.0rc6.tar.gz`

- Size: 6.1 kB
- Tags: Source
- Uploaded using Trusted Publishing: Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `1105d1cf596555e46d6ab9fdc47bc7a4cd373f939277e1dcbf818fdac57c374c` |
| MD5 | `c191a591fab5b29bc2ebb092e5178882` |
| BLAKE2b-256 | `7444005b46215e00a0542cc1cb0232fda1afab1c6533bbca4907c3c75351fee8` |
#### Provenance

The following attestation bundles were made for `metorial_openai-1.0.0rc6.tar.gz`:

- Publisher: `release.yml` on `metorial/metorial-python`
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: `metorial_openai-1.0.0rc6.tar.gz`
- Subject digest: `1105d1cf596555e46d6ab9fdc47bc7a4cd373f939277e1dcbf818fdac57c374c`
- Sigstore transparency entry: 543761150
- Permalink: `metorial/metorial-python@1664713bfab78ae686cc7699d61f3b52f9559b11`
- Branch / Tag: `refs/heads/main`
- Owner: https://github.com/metorial
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: `release.yml@1664713bfab78ae686cc7699d61f3b52f9559b11`
- Trigger Event: workflow_dispatch
### Built distribution: `metorial_openai-1.0.0rc6-py3-none-any.whl`

- Size: 5.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing: Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `7c47d48d94c970d538e3ad9687478c6a2182bba0fcb612d78af9525c42dc307c` |
| MD5 | `2f51d7c379f2ceeccfa8c7762318618b` |
| BLAKE2b-256 | `91c511add83166fbd1df0cc352268f2ccffd0bfef2eb6782b9dec90396beb516` |
#### Provenance

The following attestation bundles were made for `metorial_openai-1.0.0rc6-py3-none-any.whl`:

- Publisher: `release.yml` on `metorial/metorial-python`
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: `metorial_openai-1.0.0rc6-py3-none-any.whl`
- Subject digest: `7c47d48d94c970d538e3ad9687478c6a2182bba0fcb612d78af9525c42dc307c`
- Sigstore transparency entry: 543761155
- Permalink: `metorial/metorial-python@1664713bfab78ae686cc7699d61f3b52f9559b11`
- Branch / Tag: `refs/heads/main`
- Owner: https://github.com/metorial
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: `release.yml@1664713bfab78ae686cc7699d61f3b52f9559b11`
- Trigger Event: workflow_dispatch