# metorial-openai

OpenAI provider integration for Metorial.
## Installation

```bash
pip install metorial openai
```
## Quick Start

```python
import asyncio

from metorial import Metorial, MetorialOpenAI
from openai import AsyncOpenAI

metorial = Metorial(api_key="your-metorial-api-key")
openai = AsyncOpenAI(api_key="your-openai-api-key")

async def main():
    async def session_handler(session):
        messages = [{"role": "user", "content": "What's the latest news?"}]

        for _ in range(10):
            response = await openai.chat.completions.create(
                model="gpt-4o",
                messages=messages,
                tools=session["tools"]
            )

            choice = response.choices[0]
            tool_calls = choice.message.tool_calls

            if not tool_calls:
                print(choice.message.content)
                break

            tool_responses = await session["callTools"](tool_calls)
            messages.append({"role": "assistant", "tool_calls": tool_calls})
            messages.extend(tool_responses)

        await session["closeSession"]()

    await metorial.with_provider_session(
        MetorialOpenAI.chat_completions,
        {"serverDeployments": [{"serverDeploymentId": "your-server-deployment-id"}]},
        session_handler
    )

asyncio.run(main())
```
## Streaming

```python
import asyncio

from metorial import Metorial, MetorialOpenAI
from openai import AsyncOpenAI

metorial = Metorial(api_key="your-metorial-api-key")
openai = AsyncOpenAI(api_key="your-openai-api-key")

async def main():
    async def session_handler(session):
        messages = [{"role": "user", "content": "What's the latest news?"}]

        stream = await openai.chat.completions.create(
            model="gpt-4o",
            messages=messages,
            tools=session["tools"],
            stream=True
        )

        async for chunk in stream:
            if chunk.choices[0].delta.content:
                print(chunk.choices[0].delta.content, end="", flush=True)

        await session["closeSession"]()

    await metorial.with_provider_session(
        MetorialOpenAI.chat_completions,
        {
            "serverDeployments": [{"serverDeploymentId": "your-server-deployment-id"}],
            "streaming": True,  # Required for streaming with tool calls
        },
        session_handler
    )

asyncio.run(main())
```
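When streaming with tools enabled, the model's tool calls arrive as incremental deltas spread across chunks (a call's `id` and function name typically arrive first, followed by argument fragments). The helper below is a minimal sketch of one way to reassemble them; it is not part of metorial-openai, and the `SimpleNamespace` chunks merely mimic the shape of OpenAI's streaming deltas for illustration.

```python
from types import SimpleNamespace

def accumulate_tool_calls(chunks):
    """Merge streamed tool-call deltas into complete tool calls, keyed by index."""
    calls = {}
    for chunk in chunks:
        for delta in chunk.choices[0].delta.tool_calls or []:
            call = calls.setdefault(delta.index, {"id": None, "name": "", "arguments": ""})
            if delta.id:
                call["id"] = delta.id
            if delta.function.name:
                call["name"] += delta.function.name
            if delta.function.arguments:
                call["arguments"] += delta.function.arguments
    return [calls[i] for i in sorted(calls)]

# Stand-in chunks mimicking the shape of OpenAI streaming tool-call deltas.
def _chunk(index, id=None, name=None, arguments=None):
    delta = SimpleNamespace(tool_calls=[SimpleNamespace(
        index=index, id=id,
        function=SimpleNamespace(name=name, arguments=arguments),
    )])
    return SimpleNamespace(choices=[SimpleNamespace(delta=delta)])

chunks = [
    _chunk(0, id="call_1", name="get_news"),
    _chunk(0, arguments='{"topic": '),
    _chunk(0, arguments='"tech"}'),
]
print(accumulate_tool_calls(chunks))
# → [{'id': 'call_1', 'name': 'get_news', 'arguments': '{"topic": "tech"}'}]
```

The accumulated calls can then be parsed (e.g. via `json.loads` on `arguments`) and passed to the tool-execution step.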
## Supported Models

All OpenAI models that support function calling:

- `gpt-4o`: Latest GPT-4o
- `gpt-4.1`: GPT-4.1
- `o1`: OpenAI o1
- `o3`: OpenAI o3
- `gpt-4-turbo`: GPT-4 Turbo
- `gpt-3.5-turbo`: GPT-3.5 Turbo
## Session Object

```python
async def session_handler(session):
    tools = session["tools"]                 # Tool definitions in OpenAI format
    call_tools = session["callTools"]        # Execute tools and get responses
    close_session = session["closeSession"]  # Close the session when done
```
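Entries in `session["tools"]` follow OpenAI's function-calling tool schema, so they can be passed straight to `tools=` on `chat.completions.create`. The sketch below shows the general shape of one such definition; the tool name and parameters are made up for illustration, not real Metorial tools.

```python
# Hypothetical example of what one entry in session["tools"] may look like,
# following OpenAI's function-calling tool schema.
example_tool = {
    "type": "function",
    "function": {
        "name": "search_news",  # illustrative name, not a real Metorial tool
        "description": "Search recent news articles by topic.",
        "parameters": {         # JSON Schema describing the arguments
            "type": "object",
            "properties": {
                "topic": {"type": "string", "description": "Topic to search for"},
            },
            "required": ["topic"],
        },
    },
}

print(example_tool["function"]["name"])  # → search_news
```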
## Error Handling

```python
from metorial import MetorialAPIError

try:
    await metorial.with_provider_session(...)
except MetorialAPIError as e:
    print(f"API Error: {e.message} (Status: {e.status})")
except Exception as e:
    print(f"Unexpected error: {e}")
```
## License

MIT License - see LICENSE file for details.