LangWatch observability integration for Bridgic.

LangWatch Observability Integration

This package integrates LangWatch tracing with the Bridgic framework, providing trace instrumentation at worker granularity.

Installation

# Install the LangWatch tracing package
pip install bridgic-traces-langwatch

Prerequisites

LangWatch offers a hosted version of the platform, or you can run the platform locally. In either case, you will need an API key to configure the integration.

Configuration

Using Environment Variables

Set the following environment variables:

export LANGWATCH_API_KEY="your-api-key-here"
export LANGWATCH_ENDPOINT="https://app.langwatch.ai"  # Optional, defaults to https://app.langwatch.ai

Usage

The LangWatchTraceCallback can be configured in two ways:

Method 1: Per-Automa Scope with RunningOptions

Apply the callback only to a single automa by configuring it through RunningOptions. In this mode, every worker instantiated by that automa receives its own callback instance, while other automa remain unaffected.

from bridgic.core.automa import GraphAutoma, RunningOptions, worker
from bridgic.core.automa.worker import WorkerCallbackBuilder
from bridgic.traces.langwatch import LangWatchTraceCallback
import asyncio

class MyAutoma(GraphAutoma):
    @worker(is_start=True)
    async def step1(self):
        return "hello"

    @worker(dependencies=["step1"], is_output=True)
    async def step2(self, step1: str):
        return f"{step1} world"

async def main():
    # Build a fresh callback instance for each worker, attaching
    # base attributes to every trace it emits.
    builder = WorkerCallbackBuilder(
        LangWatchTraceCallback,
        init_kwargs={"base_attributes": {"app": "demo"}}
    )
    # Scope the callback to this automa only via RunningOptions.
    running_options = RunningOptions(callback_builders=[builder])
    automa = MyAutoma(running_options=running_options)
    result = await automa.arun()
    print(result)

asyncio.run(main())

Method 2: Global Scope with GlobalSetting

You can register the callback at the global level through GlobalSetting to make it effective for every automa in the runtime. Each worker, regardless of which automa creates it, is instrumented with the same callback configuration.

from bridgic.core.automa import GraphAutoma, worker
from bridgic.core.automa.worker import WorkerCallbackBuilder
from bridgic.core.config import GlobalSetting
from bridgic.traces.langwatch import LangWatchTraceCallback
import asyncio

# Configure global callback
GlobalSetting.set(callback_builders=[WorkerCallbackBuilder(
    LangWatchTraceCallback,
    init_kwargs={"base_attributes": {"app": "demo"}}
)])

class DataAnalysisAutoma(GraphAutoma):
    @worker(is_start=True)
    async def collect_data(self, topic: str) -> dict:
        """Collect data for the given topic."""
        # Simulate data collection
        return {
            "topic": topic,
            "data_points": ["point1", "point2", "point3"],
            "timestamp": "2024-01-01"
        }

    @worker(dependencies=["collect_data"])
    async def analyze_trends(self, data: dict) -> dict:
        """Analyze trends in the collected data."""
        # Simulate trend analysis
        return {
            "trends": ["trend1", "trend2"],
            "confidence": 0.85,
            "source_data": data
        }

    @worker(dependencies=["analyze_trends"], is_output=True)
    async def generate_report(self, analysis: dict) -> str:
        """Generate a final report."""
        return f"Report: Found {len(analysis['trends'])} trends with {analysis['confidence']} confidence."

async def main():
    automa = DataAnalysisAutoma()
    result = await automa.arun(topic="market analysis")
    print(result)

if __name__ == "__main__":
    asyncio.run(main())

From the perspective of tracing worker execution, the following approach is equivalent to the global configuration above:

from bridgic.traces.langwatch import start_langwatch_trace

start_langwatch_trace(base_attributes={"app": "demo"})

However, start_langwatch_trace is a higher-level helper that wraps the GlobalSetting-based approach shown above. As the framework adds tracing for more execution phases in the future, start_langwatch_trace will remain a unified entry point for all tracing capabilities, making it the recommended approach for most use cases. When its parameters are omitted, the helper reads the API key and endpoint from the LANGWATCH_API_KEY and LANGWATCH_ENDPOINT environment variables.
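The environment-variable fallback described above follows a common configuration pattern: explicit arguments win, environment variables fill the gaps, and a sensible default covers the endpoint. The sketch below illustrates that pattern in isolation; the function name and parameter names are illustrative assumptions, not the package's actual implementation.

```python
import os

def resolve_langwatch_config(api_key=None, endpoint=None):
    """Return (api_key, endpoint), falling back to environment variables.

    Hypothetical sketch of the fallback logic; only the variable names
    LANGWATCH_API_KEY and LANGWATCH_ENDPOINT come from the documentation.
    """
    api_key = api_key or os.environ.get("LANGWATCH_API_KEY")
    # The endpoint additionally falls back to the hosted platform URL.
    endpoint = endpoint or os.environ.get("LANGWATCH_ENDPOINT", "https://app.langwatch.ai")
    if api_key is None:
        raise ValueError("No API key: pass api_key or set LANGWATCH_API_KEY")
    return api_key, endpoint

os.environ["LANGWATCH_API_KEY"] = "demo-key"
print(resolve_langwatch_config())  # → ('demo-key', 'https://app.langwatch.ai')
```

Explicit arguments always take precedence, so `resolve_langwatch_config(api_key="other")` would ignore the environment variable.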
