LangWatch Observability Integration

This package integrates LangWatch tracing with the Bridgic framework, providing worker-granularity tracing: each worker execution is captured as its own trace span.

Installation

# Install the LangWatch tracing package
pip install bridgic-traces-langwatch

Prerequisites

LangWatch offers a hosted version of the platform, or you can run the platform locally. In either case you will need an API key to send traces.

Configuration

Using Environment Variables

Set the following environment variables:

export LANGWATCH_API_KEY="your-api-key-here"
export LANGWATCH_ENDPOINT="https://app.langwatch.ai"  # Optional, defaults to https://app.langwatch.ai
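If you rely on environment variables, it can be useful to fail fast at startup when the API key is missing. A minimal stdlib-only sketch (the helper name `load_langwatch_config` is illustrative, not part of the package; the default endpoint matches the one above):

```python
import os

def load_langwatch_config() -> dict:
    """Read LangWatch settings from the environment, failing fast if the key is absent."""
    api_key = os.environ.get("LANGWATCH_API_KEY")
    if not api_key:
        raise RuntimeError("LANGWATCH_API_KEY is not set")
    # The endpoint is optional and defaults to the hosted platform.
    endpoint = os.environ.get("LANGWATCH_ENDPOINT", "https://app.langwatch.ai")
    return {"api_key": api_key, "endpoint": endpoint}
```

Running such a check before constructing any automa surfaces configuration errors immediately rather than on the first traced worker call.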

Usage

The LangWatchTraceCallback can be configured in two ways:

Method 1: Per-Automa Scope with RunningOptions

Apply the callback only to a single automa by configuring it through RunningOptions. In this mode, every worker instantiated by that automa receives its own callback instance, while other automa remain unaffected.

from bridgic.core.automa import GraphAutoma, RunningOptions, worker
from bridgic.core.automa.worker import WorkerCallbackBuilder
from bridgic.traces.langwatch import LangWatchTraceCallback
import asyncio

class MyAutoma(GraphAutoma):
    @worker(is_start=True)
    async def step1(self):
        return "hello"

    @worker(dependencies=["step1"], is_output=True)
    async def step2(self, step1: str):
        return f"{step1} world"

async def main():
    builder = WorkerCallbackBuilder(
        LangWatchTraceCallback,
        init_kwargs={"base_attributes": {"app": "demo"}}
    )
    running_options = RunningOptions(callback_builders=[builder])
    automa = MyAutoma(running_options=running_options)
    result = await automa.arun()
    print(result)

asyncio.run(main())

Method 2: Global Scope with GlobalSetting

You can register the callback at the global level through GlobalSetting to make it effective for every automa in the runtime. Each worker, regardless of which automa creates it, is instrumented with the same callback configuration.

from bridgic.core.automa import GraphAutoma, worker
from bridgic.core.automa.worker import WorkerCallbackBuilder
from bridgic.core.config import GlobalSetting
from bridgic.traces.langwatch import LangWatchTraceCallback
import asyncio

# Configure global callback
GlobalSetting.set(callback_builders=[WorkerCallbackBuilder(
    LangWatchTraceCallback,
    init_kwargs={"base_attributes": {"app": "demo"}}
)])

class DataAnalysisAutoma(GraphAutoma):
    @worker(is_start=True)
    async def collect_data(self, topic: str) -> dict:
        """Collect data for the given topic."""
        # Simulate data collection
        return {
            "topic": topic,
            "data_points": ["point1", "point2", "point3"],
            "timestamp": "2024-01-01"
        }

    @worker(dependencies=["collect_data"])
    async def analyze_trends(self, data: dict) -> dict:
        """Analyze trends in the collected data."""
        # Simulate trend analysis
        return {
            "trends": ["trend1", "trend2"],
            "confidence": 0.85,
            "source_data": data
        }

    @worker(dependencies=["analyze_trends"], is_output=True)
    async def generate_report(self, analysis: dict) -> str:
        """Generate a final report."""
        return f"Report: Found {len(analysis['trends'])} trends with {analysis['confidence']} confidence."

async def main():
    automa = DataAnalysisAutoma()
    result = await automa.arun(topic="market analysis")
    print(result)

if __name__ == "__main__":
    asyncio.run(main())

From the perspective of tracking worker execution, the following approach is equivalent to the one above:

from bridgic.traces.langwatch import start_langwatch_trace

start_langwatch_trace(base_attributes={"app": "demo"})

However, start_langwatch_trace is a higher-level function that encapsulates the first approach. Because the framework may add tracing for more phases in the future, start_langwatch_trace will provide a unified interface to all tracing capabilities, making it the recommended approach for most use cases. When parameters are omitted, the helper reads the values from the LANGWATCH_API_KEY and LANGWATCH_ENDPOINT environment variables.
