
langfuse-haystack


langfuse-haystack integrates tracing capabilities into Haystack (2.x) pipelines using Langfuse. This package enhances the visibility of pipeline runs by capturing comprehensive details of the execution traces, including API calls, context data, prompts, and more. Whether you're monitoring model performance, pinpointing areas for improvement, or creating datasets for fine-tuning and testing from your pipeline executions, langfuse-haystack is the right tool for you.

Features

  • Easy integration with Haystack pipelines
  • Capture the full context of the execution
  • Track model usage and cost
  • Collect user feedback
  • Identify low-quality outputs
  • Build fine-tuning and testing datasets

Installation

To install langfuse-haystack, run the following command:

pip install langfuse-haystack

Usage

To enable tracing in your Haystack pipeline, add the LangfuseConnector to your pipeline. You also need to set the LANGFUSE_SECRET_KEY and LANGFUSE_PUBLIC_KEY environment variables in order to connect to your Langfuse account. You can get these keys by signing up for an account on the Langfuse website.

⚠️ Important: To ensure proper tracing, always set environment variables before importing any Haystack components. This is crucial because Haystack initializes its internal tracing components during import.

Here's the correct way to set up your script:

import os

# Set environment variables first.
# LANGFUSE_SECRET_KEY and LANGFUSE_PUBLIC_KEY must also be set
# (here or in your shell) so the connector can authenticate with Langfuse.
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"
os.environ["TOKENIZERS_PARALLELISM"] = "false"
os.environ["HAYSTACK_CONTENT_TRACING_ENABLED"] = "true"

# Then import Haystack components
from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack import Pipeline

from haystack_integrations.components.connectors.langfuse import LangfuseConnector

# Rest of your code...

Alternatively, and better still, set these environment variables in your shell before running the script, as shown below.
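
A minimal shell sketch (the key values are placeholders; use the keys from your own Langfuse project, and adjust LANGFUSE_HOST if you self-host):

export LANGFUSE_SECRET_KEY="<your-secret-key>"
export LANGFUSE_PUBLIC_KEY="<your-public-key>"
export LANGFUSE_HOST="https://cloud.langfuse.com"
export HAYSTACK_CONTENT_TRACING_ENABLED="true"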

Here's a full example:

import os

os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"
os.environ["TOKENIZERS_PARALLELISM"] = "false"
os.environ["HAYSTACK_CONTENT_TRACING_ENABLED"] = "true"

from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack import Pipeline

from haystack_integrations.components.connectors.langfuse import LangfuseConnector

if __name__ == "__main__":
    pipe = Pipeline()
    pipe.add_component("tracer", LangfuseConnector("Chat example"))
    pipe.add_component("prompt_builder", ChatPromptBuilder())
    pipe.add_component("llm", OpenAIChatGenerator(model="gpt-3.5-turbo"))

    pipe.connect("prompt_builder.prompt", "llm.messages")

    messages = [
        ChatMessage.from_system("Always respond in German even if some input data is in other languages."),
        ChatMessage.from_user("Tell me about {{location}}"),
    ]

    response = pipe.run(
        data={"prompt_builder": {"template_variables": {"location": "Berlin"}, "template": messages}}
    )
    print(response["llm"]["replies"][0])
    print(response["tracer"]["trace_url"])

In this example, we add the LangfuseConnector to the pipeline under the name "tracer". Each run of the pipeline produces one trace, viewable on the Langfuse website at the URL returned by the tracer component. The trace captures the entire execution context, including the prompts, completions, and metadata.

Trace Visualization

Langfuse provides a user-friendly interface to visualize and analyze the traces generated by your Haystack pipeline. Log in to your Langfuse account and navigate to the trace URL to view the trace details.

Contributing

hatch is the recommended way to work on this project. To install it, run:

pip install hatch

With hatch installed, run all the tests:

hatch run test

Run the linters ruff and mypy:

hatch run lint:all

License

langfuse-haystack is distributed under the terms of the Apache-2.0 license.

