Python SDK for Metorial - The open source integration platform for agentic AI
Metorial Python SDK
The official Python SDK for Metorial.
Available Providers
| Provider | Format | Description |
|---|---|---|
| OpenAI | OpenAI function calling | GPT-4, GPT-3.5, etc. |
| Anthropic | Claude tool format | Claude 3.5, Claude 3, etc. |
| Google | Gemini function declarations | Gemini Pro, Gemini Flash |
| Mistral | Mistral function calling | Mistral Large, Codestral |
| DeepSeek | OpenAI-compatible | DeepSeek Chat, DeepSeek Coder |
| TogetherAI | OpenAI-compatible | Llama, Mixtral, etc. |
| XAI | OpenAI-compatible | Grok models |
| AI SDK | Framework tools | Vercel AI SDK, etc. |
Installation
```bash
# Install core metorial package (includes all provider adapters)
pip install metorial

# Install with specific providers (includes provider client libraries)
pip install metorial[openai,anthropic,google,mistral,deepseek,togetherai,xai]
```
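The extras only pull in the matching provider client libraries; the adapters themselves ship with the core package. A common pattern for packages structured this way is to guard the optional import and point users back at the right extra. The helper below is purely illustrative (it is not Metorial's actual import logic):

```python
# Illustrative only: how optional provider dependencies are typically guarded.
# This is not Metorial's actual code.
import importlib.util

def require_provider(package: str, extra: str) -> None:
    """Raise a helpful error if an optional provider library is missing."""
    if importlib.util.find_spec(package) is None:
        raise ImportError(
            f"The '{package}' package is required for this provider. "
            f"Install it with: pip install metorial[{extra}]"
        )

# Passes, because 'json' is part of the standard library:
require_provider("json", "openai")
```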
Quick Start
Simple Usage
```python
import asyncio

from metorial import Metorial
from openai import AsyncOpenAI

async def main():
    metorial = Metorial(api_key="your-metorial-api-key")
    openai = AsyncOpenAI(api_key="your-openai-api-key")

    response = await metorial.run(
        message="Search Hackernews for the latest AI discussions.",
        server_deployments=["hacker-news-server-deployment"],
        client=openai,
        model="gpt-4o",
        max_steps=25,  # optional
    )

    print("Response:", response.text)

asyncio.run(main())
```
💡 Tip for Jupyter/Colab users: if you're running in a Jupyter notebook or Google Colab, you can skip the `async def main():` wrapper and `asyncio.run()` and just use `await` directly at the top level.
OAuth + Multiple Deployments
For integrations requiring OAuth authentication (like Google Calendar) and multiple server deployments:
```python
import asyncio
import os

from metorial import Metorial
from anthropic import AsyncAnthropic

async def main():
    metorial = Metorial(api_key=os.getenv("METORIAL_API_KEY"))
    anthropic = AsyncAnthropic(api_key=os.getenv("ANTHROPIC_API_KEY"))

    # Create an OAuth session for authenticated services
    google_cal_deployment_id = os.getenv("GOOGLE_CALENDAR_DEPLOYMENT_ID")

    print("🔗 Creating OAuth session...")
    oauth_session = metorial.oauth.sessions.create(
        server_deployment_id=google_cal_deployment_id
    )

    print("OAuth URLs for user authentication:")
    print(f"  Google Calendar: {oauth_session.url}")

    print("\n⏳ Waiting for OAuth completion...")
    await metorial.oauth.wait_for_completion([oauth_session])
    print("✅ OAuth session completed!")

    # Use multiple server deployments with mixed auth
    hackernews_deployment_id = os.getenv("HACKERNEWS_DEPLOYMENT_ID")

    result = await metorial.run(
        message="""Search Hackernews for the latest AI discussions using the available tools.
        Then create a calendar event using Google Calendar tools with my@email.address for tomorrow at 2pm to discuss AI trends.""",
        server_deployments=[
            {"serverDeploymentId": google_cal_deployment_id, "oauthSessionId": oauth_session.id},
            {"serverDeploymentId": hackernews_deployment_id},
        ],
        client=anthropic,
        model="claude-sonnet-4-20250514",
        max_tokens=4096,
        max_steps=25,
    )

    print(result.text)

asyncio.run(main())
```
That's it! metorial.run() automatically:
- Creates a session with your MCP server
- Formats tools for your AI provider
- Handles the execution loop
- Manages tool execution
- Returns the final response
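The steps above amount to a standard tool-execution loop. The sketch below illustrates the idea with stand-in model and tool functions — it is not Metorial's internals, just the shape of the loop that `metorial.run()` handles for you:

```python
# Illustrative sketch of a tool-execution loop like the one metorial.run()
# performs. The model and tools here are stand-ins, not Metorial APIs.

def fake_model(messages, tools):
    """Stand-in model: requests one tool call, then answers."""
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_calls": [{"name": "search", "args": {"q": "AI"}}]}
    return {"content": "Here are the latest AI discussions."}

def run_loop(message, tools, max_steps=25):
    messages = [{"role": "user", "content": message}]
    for _ in range(max_steps):
        reply = fake_model(messages, tools)
        calls = reply.get("tool_calls")
        if not calls:                      # no tool calls: final answer
            return reply["content"]
        for call in calls:                 # execute each requested tool
            result = tools[call["name"]](**call["args"])
            messages.append({"role": "tool", "content": result})
    raise RuntimeError("max_steps exceeded")

tools = {"search": lambda q: f"results for {q}"}
print(run_loop("Search Hackernews for AI discussions.", tools))
```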
Advanced Usage with Provider Sessions
For more control over the conversation flow, you can use `with_provider_session`:
```python
import asyncio

from metorial import Metorial, MetorialOpenAI
from openai import AsyncOpenAI

async def main():
    metorial = Metorial(api_key="your-metorial-api-key")
    openai = AsyncOpenAI(api_key="your-openai-api-key")

    messages = [
        {"role": "user", "content": "What are the top hackernews posts?"}
    ]

    async def session_action(session):
        for i in range(10):
            response = await openai.chat.completions.create(
                messages=messages,
                model="gpt-4o",
                tools=session["tools"],
            )

            choice = response.choices[0]
            tool_calls = choice.message.tool_calls

            if not tool_calls:
                print(choice.message.content)
                return

            # Execute tools through Metorial
            tool_responses = await session["callTools"](tool_calls)

            # Add the assistant's tool calls and the tool results to the conversation
            messages.append({
                "role": "assistant",
                "tool_calls": tool_calls,
            })
            messages.extend(tool_responses)

    await metorial.with_provider_session(
        MetorialOpenAI.chat_completions,
        [{"serverDeploymentId": "your-deployment-id"}],
        session_action,
    )

asyncio.run(main())
```
This approach gives you full control over the conversation loop while still benefiting from Metorial's tool management.
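When managing the loop yourself, message ordering matters: the assistant message carrying `tool_calls` must appear before the matching role-`"tool"` result messages, and each result must reference its `tool_call_id`. A minimal sketch of that shape (hand-built dicts in the OpenAI chat format; the IDs and tool names are made up):

```python
# Minimal sketch of the OpenAI chat message shapes used in a manual tool loop.
# The IDs and tool names here are illustrative.
assistant_msg = {
    "role": "assistant",
    "tool_calls": [
        {
            "id": "call_1",
            "type": "function",
            "function": {"name": "get_posts", "arguments": "{}"},
        }
    ],
}

# Each tool result must reference the tool_call_id it answers.
tool_msg = {
    "role": "tool",
    "tool_call_id": "call_1",
    "content": '{"posts": []}',
}

messages = [
    {"role": "user", "content": "What are the top hackernews posts?"},
    assistant_msg,  # assistant requests the tool...
    tool_msg,       # ...and the result follows immediately after
]
```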
Provider Examples
Metorial works with all major AI providers. Here are examples using metorial.run():
OpenAI (GPT-4, GPT-3.5)
```python
from metorial import Metorial
from openai import AsyncOpenAI

metorial = Metorial(api_key="your-metorial-api-key")
openai = AsyncOpenAI(api_key="your-openai-api-key")

response = await metorial.run(
    message="What are the latest commits?",
    server_deployments=["your-deployment-id"],
    client=openai,
    model="gpt-4o",
)
```
Anthropic (Claude)
```python
from metorial import Metorial
from anthropic import AsyncAnthropic

metorial = Metorial(api_key="your-metorial-api-key")
anthropic = AsyncAnthropic(api_key="your-anthropic-api-key")

response = await metorial.run(
    message="What are the latest commits?",
    server_deployments=["your-deployment-id"],
    client=anthropic,
    model="claude-3-5-sonnet-20241022",
)
```
Google (Gemini)
```python
from metorial import Metorial
import google.generativeai as genai

metorial = Metorial(api_key="your-metorial-api-key")
genai.configure(api_key="your-google-api-key")
google = genai.GenerativeModel("gemini-pro")

response = await metorial.run(
    message="What are the latest commits?",
    server_deployments=["your-deployment-id"],
    client=google,
    model="gemini-pro",
)
```
Mistral AI
```python
from metorial import Metorial
from mistralai import AsyncMistral

metorial = Metorial(api_key="your-metorial-api-key")
mistral = AsyncMistral(api_key="your-mistral-api-key")

response = await metorial.run(
    message="What are the latest commits?",
    server_deployments=["your-deployment-id"],
    client=mistral,
    model="mistral-large-latest",
)
```
DeepSeek
```python
from metorial import Metorial
from openai import AsyncOpenAI

metorial = Metorial(api_key="your-metorial-api-key")
deepseek = AsyncOpenAI(
    api_key="your-deepseek-api-key",
    base_url="https://api.deepseek.com",
)

response = await metorial.run(
    message="What are the latest commits?",
    server_deployments=["your-deployment-id"],
    client=deepseek,
    model="deepseek-chat",
)
```
Together AI
```python
from metorial import Metorial
from openai import AsyncOpenAI

metorial = Metorial(api_key="your-metorial-api-key")
together = AsyncOpenAI(
    api_key="your-together-api-key",
    base_url="https://api.together.xyz/v1",
)

response = await metorial.run(
    message="What are the latest commits?",
    server_deployments=["your-deployment-id"],
    client=together,
    model="meta-llama/Llama-2-70b-chat-hf",
)
```
XAI (Grok)
```python
from metorial import Metorial
from openai import AsyncOpenAI

metorial = Metorial(api_key="your-metorial-api-key")
xai = AsyncOpenAI(
    api_key="your-xai-api-key",
    base_url="https://api.x.ai/v1",
)

response = await metorial.run(
    message="What are the latest commits?",
    server_deployments=["your-deployment-id"],
    client=xai,
    model="grok-beta",
)
```
Error Handling
```python
from metorial import MetorialAPIError

try:
    response = await metorial.run(
        message="What are the latest commits?",
        server_deployments=["your-deployment-id"],
        client=openai,
        model="gpt-4o",
    )
except MetorialAPIError as e:
    print(f"API Error: {e.message} (Status: {e.status_code})")
except Exception as e:
    print(f"Unexpected error: {e}")
```
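Since `MetorialAPIError` exposes a `status_code`, transient failures (e.g. 429 rate limits or 5xx server errors) can be retried with exponential backoff. One way to sketch that pattern (the `run_with_retry` helper below is our own illustration, not part of the SDK):

```python
import asyncio

# Hypothetical retry helper, not part of the Metorial SDK.
async def run_with_retry(coro_factory, retries=3, base_delay=1.0):
    """Retry an awaitable on transient API errors with exponential backoff."""
    for attempt in range(retries):
        try:
            return await coro_factory()
        except Exception as e:
            status = getattr(e, "status_code", None)
            transient = status == 429 or (status is not None and status >= 500)
            if not transient or attempt == retries - 1:
                raise  # non-retryable, or out of attempts
            await asyncio.sleep(base_delay * 2 ** attempt)
```

Usage would look like `await run_with_retry(lambda: metorial.run(...))`.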
Examples
Check out the examples/ directory for more comprehensive examples.
License
MIT License - see LICENSE file for details.
Support