metorial-togetherai
Together AI provider integration for Metorial.
Installation
pip install metorial-togetherai
# or
uv add metorial-togetherai
# or
poetry add metorial-togetherai
Features
- 🤖 Together AI Integration: Full support for Llama, Mixtral, and other Together AI models
- 🛠️ Function Calling: OpenAI-compatible function calling support
- 📡 Session Management: Automatic tool lifecycle handling
- ⚡ Async Support: Full async/await support
Supported Models
Popular models available through Together AI:
- meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo: Llama 3.1 70B
- meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo: Llama 3.1 8B
- mistralai/Mixtral-8x7B-Instruct-v0.1: Mixtral 8x7B
- NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO: Nous Hermes 2
- And many more...
Usage
Quick Start (Recommended)
import asyncio
from openai import AsyncOpenAI
from metorial import Metorial

async def main():
    # Initialize clients
    metorial = Metorial(api_key="...your-metorial-api-key...")  # async by default
    together_client = AsyncOpenAI(
        api_key="...your-together-api-key...",
        base_url="https://api.together.xyz/v1"
    )

    # One-liner chat with automatic session management
    response = await metorial.run(
        "What are the latest commits in the metorial/websocket-explorer repository?",
        "...your-mcp-server-deployment-id...",  # can also be a list
        together_client,
        model="meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo",
        max_iterations=25
    )
    print("Response:", response)

asyncio.run(main())
Streaming Chat
import asyncio
from openai import AsyncOpenAI
from metorial import Metorial
from metorial.types import StreamEventType

async def example():
    # Initialize clients
    metorial = Metorial(api_key="...your-metorial-api-key...")
    together_client = AsyncOpenAI(
        api_key="...your-together-api-key...",
        base_url="https://api.together.xyz/v1"
    )

    # Streaming chat with real-time responses
    async def stream_action(session):
        messages = [
            {"role": "user", "content": "Explain quantum computing"}
        ]
        async for event in metorial.stream(
            together_client, session, messages,
            model="meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo",
            max_iterations=25
        ):
            if event.type == StreamEventType.CONTENT:
                print(f"🤖 {event.content}", end="", flush=True)
            elif event.type == StreamEventType.TOOL_CALL:
                print(f"\n🔧 Executing {len(event.tool_calls)} tool(s)...")
            elif event.type == StreamEventType.COMPLETE:
                print("\n✅ Complete!")

    await metorial.with_session("...your-server-deployment-id...", stream_action)

asyncio.run(example())
Advanced Usage with Session Management
import asyncio
from openai import OpenAI
from metorial import Metorial
from metorial_togetherai import MetorialTogetherAISession

async def main():
    # Initialize clients
    metorial = Metorial(api_key="...your-metorial-api-key...")
    # Together AI uses an OpenAI-compatible client
    together_client = OpenAI(
        api_key="...your-together-api-key...",
        base_url="https://api.together.xyz/v1"
    )

    # Create a session with your server deployments
    async with metorial.session(["...your-server-deployment-id..."]) as session:
        # Create the Together AI-specific wrapper
        together_session = MetorialTogetherAISession(session.tool_manager)

        messages = [
            {"role": "user", "content": "What are the latest commits?"}
        ]
        response = together_client.chat.completions.create(
            model="meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo",
            messages=messages,
            tools=together_session.tools
        )

        # Handle tool calls
        tool_calls = response.choices[0].message.tool_calls
        if tool_calls:
            tool_responses = await together_session.call_tools(tool_calls)

            # Add to the conversation
            messages.append({
                "role": "assistant",
                "tool_calls": tool_calls
            })
            messages.extend(tool_responses)
            # Continue the conversation...

asyncio.run(main())
Using Convenience Functions
from metorial_togetherai import build_togetherai_tools, call_togetherai_tools

async def example():
    # Get tools in Together AI format
    tools = build_togetherai_tools(tool_manager)

    # Call tools from a Together AI response
    tool_messages = await call_togetherai_tools(tool_manager, tool_calls)
API Reference
MetorialTogetherAISession
Main session class for Together AI integration.
session = MetorialTogetherAISession(tool_manager)
Properties:
tools: List of tools in OpenAI-compatible format
Methods:
async call_tools(tool_calls): Execute tool calls and return tool messages
build_togetherai_tools(tool_mgr)
Build Together AI-compatible tool definitions.
Returns: List of tool definitions in OpenAI format
call_togetherai_tools(tool_mgr, tool_calls)
Execute tool calls from Together AI response.
Returns: List of tool messages
Tool Format
Tools are converted to OpenAI-compatible format (without strict mode):
{
  "type": "function",
  "function": {
    "name": "tool_name",
    "description": "Tool description",
    "parameters": {
      "type": "object",
      "properties": {...},
      "required": [...]
    }
  }
}
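As a concrete illustration, here is a hand-written tool definition in that shape for a hypothetical `get_commits` tool. The tool name, description, and parameters below are invented for the example and are not part of the package:

```python
import json

# Hypothetical tool definition following the OpenAI-compatible shape above.
# Note there is no "strict" flag: tool definitions here omit strict mode.
get_commits_tool = {
    "type": "function",
    "function": {
        "name": "get_commits",
        "description": "List recent commits for a repository",
        "parameters": {
            "type": "object",
            "properties": {
                "repository": {
                    "type": "string",
                    "description": "Repository in owner/name form",
                },
                "limit": {
                    "type": "integer",
                    "description": "Maximum number of commits to return",
                },
            },
            "required": ["repository"],
        },
    },
}

# The value passed to `tools=` in chat.completions.create is simply a
# list of such dictionaries.
tools = [get_commits_tool]
print(json.dumps(tools[0]["function"]["name"]))  # → "get_commits"
```

In practice you would not build these dictionaries by hand; `build_togetherai_tools(tool_manager)` generates them from your Metorial server deployments.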
Together AI API Configuration
Together AI uses the OpenAI-compatible API format. Configure your client like this:
from openai import OpenAI

client = OpenAI(
    api_key="...your-together-api-key...",
    base_url="https://api.together.xyz/v1"
)
Error Handling
try:
    tool_messages = await together_session.call_tools(tool_calls)
except Exception as e:
    print(f"Tool execution failed: {e}")
Tool errors are returned as tool messages with error content.
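The exact error payload is produced inside the library, but the general pattern is to surface a failed tool call back to the model as an ordinary tool message so the conversation can continue. A minimal sketch of that pattern (not the library's internal code; the helper name is invented):

```python
# Sketch only: report a failed tool call to the model as a tool message,
# keyed by the tool_call_id the model generated.
def error_tool_message(tool_call_id: str, error: Exception) -> dict:
    return {
        "role": "tool",
        "tool_call_id": tool_call_id,
        "content": f"Error: {error}",
    }

msg = error_tool_message("call_abc123", RuntimeError("upstream timeout"))
print(msg["content"])  # → Error: upstream timeout
```

Because the error arrives as regular message content, the model can read it and decide whether to retry, call a different tool, or explain the failure to the user.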
License
MIT License - see LICENSE file for details.
File details
Details for the file metorial_togetherai-1.0.0rc6.tar.gz.
File metadata
- Download URL: metorial_togetherai-1.0.0rc6.tar.gz
- Size: 5.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 80260f3ef385b3f3fc1c25e3a2d3888cbe7aaff1e4e91fc3bf7babbc81e2a8e6 |
| MD5 | b8d935d3e9146f577e72866286e3cea1 |
| BLAKE2b-256 | fa473fc98d8aca89ffaf4719d7b8598f339a27656a8f8750fe8b5404caabf398 |
Provenance
The following attestation bundles were made for metorial_togetherai-1.0.0rc6.tar.gz:
Publisher: release.yml on metorial/metorial-python
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: metorial_togetherai-1.0.0rc6.tar.gz
- Subject digest: 80260f3ef385b3f3fc1c25e3a2d3888cbe7aaff1e4e91fc3bf7babbc81e2a8e6
- Sigstore transparency entry: 543761173
- Permalink: metorial/metorial-python@1664713bfab78ae686cc7699d61f3b52f9559b11
- Branch / Tag: refs/heads/main
- Owner: https://github.com/metorial
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@1664713bfab78ae686cc7699d61f3b52f9559b11
- Trigger Event: workflow_dispatch
File details
Details for the file metorial_togetherai-1.0.0rc6-py3-none-any.whl.
File metadata
- Download URL: metorial_togetherai-1.0.0rc6-py3-none-any.whl
- Size: 4.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 3885636f8b5a91fbfc1acb6b38b5d29aa583e887bff97982eb9426a7160cc948 |
| MD5 | 3dea566fcc77bbd2b54f3bf35598c31f |
| BLAKE2b-256 | db85a475898bd049aab49a390ae133232b2866272fa4567eb7be37b1c4d54e0c |
Provenance
The following attestation bundles were made for metorial_togetherai-1.0.0rc6-py3-none-any.whl:
Publisher: release.yml on metorial/metorial-python
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: metorial_togetherai-1.0.0rc6-py3-none-any.whl
- Subject digest: 3885636f8b5a91fbfc1acb6b38b5d29aa583e887bff97982eb9426a7160cc948
- Sigstore transparency entry: 543761174
- Permalink: metorial/metorial-python@1664713bfab78ae686cc7699d61f3b52f9559b11
- Branch / Tag: refs/heads/main
- Owner: https://github.com/metorial
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@1664713bfab78ae686cc7699d61f3b52f9559b11
- Trigger Event: workflow_dispatch