Anyway Software Development Kit (SDK) for Python
Anyway's Python SDK lets you monitor and debug your LLM execution. Tracing is non-intrusive, built on top of OpenTelemetry, and you can export the traces to your existing observability stack.
Installation
```shell
pip install anyway-sdk
```
Quick Start
```python
import openai

from anyway.sdk import Traceloop
from anyway.sdk.decorators import workflow, task

Traceloop.init(app_name="joke_generation_service")

@workflow(name="joke_creation")
def create_joke():
    completion = openai.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "Tell me a joke about opentelemetry"}],
    )
    return completion.choices[0].message.content
```
Configuration
The SDK is built on top of OpenTelemetry and supports exporting traces to any OTEL-compatible collector.
The protocol is determined by the URL format:
- Without an http:// or https:// prefix → gRPC (e.g., localhost:4317)
- With an http:// or https:// prefix → HTTP (e.g., http://localhost:4318)
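The rule above can be expressed as a one-line prefix check. The helper below is purely illustrative and is not part of the SDK:

```python
def infer_protocol(endpoint: str) -> str:
    # Mirrors the rule above: an explicit http(s):// prefix selects HTTP,
    # anything else is treated as a gRPC endpoint.
    if endpoint.startswith(("http://", "https://")):
        return "HTTP"
    return "gRPC"

print(infer_protocol("localhost:4317"))         # gRPC
print(infer_protocol("http://localhost:4318"))  # HTTP
```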
Connecting to Anyway Collector
Configure the SDK endpoint and authentication using one of the following methods.
Option 1: Environment Variables
```shell
export TRACELOOP_BASE_URL=localhost:4317
export TRACELOOP_HEADERS="Authorization=Bearer%20<your-api-key>"
```
Note: The space between Bearer and the key must be URL-encoded as %20.
Example:
```shell
export TRACELOOP_BASE_URL=localhost:4317
export TRACELOOP_HEADERS="Authorization=Bearer%20sk_test_mncd5s5tQQJLuLNhRoXcYuNuptoOPuAY"
```
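If you build the header value in Python rather than by hand, the standard library handles the percent-encoding for you. A small sketch (the key shown is a placeholder):

```python
from urllib.parse import quote

api_key = "my-key"  # placeholder, not a real key
header_value = quote(f"Bearer {api_key}")  # the space becomes %20
print(header_value)  # Bearer%20my-key
```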
Then initialize the SDK:
```python
from anyway.sdk import Traceloop

Traceloop.init(app_name="my_app")
```
Option 2: Pass Directly to Init
```python
from anyway.sdk import Traceloop

Traceloop.init(
    app_name="my_app",
    api_endpoint="localhost:4317",
    headers={"Authorization": "Bearer <your-api-key>"},
)
```
OpenTelemetry Collector
The SDK can export traces to any OpenTelemetry Collector.
Using Environment Variables
```shell
export TRACELOOP_BASE_URL=<your-collector-endpoint>
```
Using a Custom Exporter
```python
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from anyway.sdk import Traceloop

exporter = OTLPSpanExporter(endpoint="localhost:4317")

Traceloop.init(
    app_name="my_app",
    exporter=exporter,
)
```
Decorators
The SDK provides @workflow and @task decorators to organize and trace your LLM operations.
Import
```python
from anyway.sdk.decorators import workflow, task
```
Parameters
Both decorators accept the same parameters:
| Parameter | Type | Description |
|---|---|---|
| name | Optional[str] | Custom name for the span. If not provided, defaults to the function name. |
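As a rough illustration of the name-defaulting rule in the table above, a decorator can fall back to the wrapped function's name. This is a sketch, not the SDK's actual implementation (the real decorators also create OpenTelemetry spans):

```python
import functools

def task(name=None):
    # Sketch of the defaulting behaviour: use the explicit name if given,
    # otherwise fall back to the function's own name.
    def decorator(fn):
        span_name = name or fn.__name__
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            return fn(*args, **kwargs)
        wrapper.span_name = span_name  # exposed here for illustration only
        return wrapper
    return decorator

@task()
def summarize(text):
    return text.upper()

print(summarize.span_name)  # summarize
```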
@workflow
Use @workflow to define high-level operations that orchestrate multiple tasks.
```python
@workflow(name="document_processor")
def process_document(text: str):
    summary = summarize_text(text)
    keywords = extract_keywords(text)
    return {"summary": summary, "keywords": keywords}
```
@task
Use @task to define individual units of work within a workflow.
```python
@task(name="text_summarizer")
def summarize_text(text: str):
    completion = openai.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": f"Summarize: {text}"}],
    )
    return completion.choices[0].message.content

@task(name="keyword_extractor")
def extract_keywords(text: str):
    completion = openai.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": f"Extract keywords from: {text}"}],
    )
    return completion.choices[0].message.content
```
Nested Workflows and Tasks
Workflows can call tasks, and tasks can call other tasks to create a trace hierarchy:
```python
import openai

from anyway.sdk import Traceloop
from anyway.sdk.decorators import workflow, task

Traceloop.init(app_name="content_pipeline")

@task(name="generate_content")
def generate_content(topic: str):
    completion = openai.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": f"Write about: {topic}"}],
    )
    return completion.choices[0].message.content

@task(name="review_content")
def review_content(content: str):
    completion = openai.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": f"Review this content: {content}"}],
    )
    return completion.choices[0].message.content

@workflow(name="content_pipeline")
def create_content(topic: str):
    content = generate_content(topic)
    reviewed = review_content(content)
    return reviewed
```
Async Support
Both decorators work seamlessly with async functions:
```python
from openai import AsyncOpenAI

async_openai_client = AsyncOpenAI()

@task(name="async_summarizer")
async def summarize_text(text: str):
    completion = await async_openai_client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": f"Summarize: {text}"}],
    )
    return completion.choices[0].message.content

@workflow(name="async_pipeline")
async def process_async(text: str):
    return await summarize_text(text)
```
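Under the hood, supporting both styles typically means detecting coroutine functions and wrapping them differently. A minimal sketch of that pattern (not the SDK's real code):

```python
import asyncio
import functools
import inspect

def traced(name=None):
    # Illustrative sketch: wrap sync and async functions differently,
    # so awaiting still works through the decorator.
    def decorator(fn):
        span_name = name or fn.__name__
        if inspect.iscoroutinefunction(fn):
            @functools.wraps(fn)
            async def async_wrapper(*args, **kwargs):
                # a real tracer would open a span named span_name here
                return await fn(*args, **kwargs)
            return async_wrapper
        @functools.wraps(fn)
        def sync_wrapper(*args, **kwargs):
            return fn(*args, **kwargs)
        return sync_wrapper
    return decorator

@traced(name="echo")
async def echo(x):
    return x

print(asyncio.run(echo("hi")))  # hi
```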
File details
Details for the file anyway_sdk-0.0.7.tar.gz.
File metadata
- Download URL: anyway_sdk-0.0.7.tar.gz
- Upload date:
- Size: 57.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.3.3 CPython/3.14.3 Darwin/25.3.0
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 8513dadd3e4a2adcbc72f42b1cb8c787922426dd9d8eb434f805519b60bf3b09 |
| MD5 | 402150b1a097cf1764ac4489da7f99ef |
| BLAKE2b-256 | 317a5dc1c8ddde707748ff421f1fd5352e9cbfa9f7e4cd1070ef3473c6c8c951 |
File details
Details for the file anyway_sdk-0.0.7-py3-none-any.whl.
File metadata
- Download URL: anyway_sdk-0.0.7-py3-none-any.whl
- Upload date:
- Size: 77.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.3.3 CPython/3.14.3 Darwin/25.3.0
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | ee9b35e099b94f5cb0f2cfad55362083e7cf8d55379ea3bb7613f91e8adb45ce |
| MD5 | 43db512f1dfd6451d56365a19eed8ad9 |
| BLAKE2b-256 | 0dce6a9016b431e8626c91442f4ee6abb78ba0181c38baaa5e08bc3b6c9c9407 |