langfuse-haystack

langfuse-haystack integrates tracing capabilities into Haystack (2.x) pipelines using Langfuse. This package enhances the visibility of pipeline runs by capturing comprehensive details of the execution traces, including API calls, context data, prompts, and more. Whether you're monitoring model performance, pinpointing areas for improvement, or creating datasets for fine-tuning and testing from your pipeline executions, langfuse-haystack is the right tool for you.

Features

  • Easy integration with Haystack pipelines
  • Capture the full context of the execution
  • Track model usage and cost
  • Collect user feedback
  • Identify low-quality outputs
  • Build fine-tuning and testing datasets

Installation

To install langfuse-haystack, run the following command:

pip install langfuse-haystack

Usage

To enable tracing in your Haystack pipeline, add the LangfuseConnector to your pipeline. You also need to set the LANGFUSE_SECRET_KEY and LANGFUSE_PUBLIC_KEY environment variables to connect to your Langfuse account. You can obtain these keys by signing up for an account on the Langfuse website.
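For example, the keys can be set programmatically before importing any Haystack modules (placeholder values shown here; substitute the real keys from your Langfuse project settings):

```python
import os

# Placeholder values -- replace with the keys from your Langfuse project settings
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
```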

⚠️ Important: To ensure proper tracing, always set environment variables before importing any Haystack components. This is crucial because Haystack initializes its internal tracing components during import.

Here's the correct way to set up your script:

import os

# Set environment variables first
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"
os.environ["TOKENIZERS_PARALLELISM"] = "false"
os.environ["HAYSTACK_CONTENT_TRACING_ENABLED"] = "true"

# Then import Haystack components
from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack import Pipeline

from haystack_integrations.components.connectors.langfuse import LangfuseConnector

# Rest of your code...

Better yet, set these environment variables in your shell before running the script.
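For example (placeholder key values shown; replace them with your own):

```shell
export LANGFUSE_SECRET_KEY="sk-lf-..."   # placeholder -- use your real secret key
export LANGFUSE_PUBLIC_KEY="pk-lf-..."   # placeholder -- use your real public key
export LANGFUSE_HOST="https://cloud.langfuse.com"
export HAYSTACK_CONTENT_TRACING_ENABLED="true"
```

Then run your script with `python your_script.py` in the same shell session.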

Here's a full example:

import os

os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"
os.environ["TOKENIZERS_PARALLELISM"] = "false"
os.environ["HAYSTACK_CONTENT_TRACING_ENABLED"] = "true"

from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack import Pipeline

from haystack_integrations.components.connectors.langfuse import LangfuseConnector

if __name__ == "__main__":
    pipe = Pipeline()
    pipe.add_component("tracer", LangfuseConnector("Chat example"))
    pipe.add_component("prompt_builder", ChatPromptBuilder())
    pipe.add_component("llm", OpenAIChatGenerator(model="gpt-3.5-turbo"))

    pipe.connect("prompt_builder.prompt", "llm.messages")

    messages = [
        ChatMessage.from_system("Always respond in German even if some input data is in other languages."),
        ChatMessage.from_user("Tell me about {{location}}"),
    ]

    response = pipe.run(
        data={"prompt_builder": {"template_variables": {"location": "Berlin"}, "template": messages}}
    )
    print(response["llm"]["replies"][0])
    print(response["tracer"]["trace_url"])

In this example, we add the LangfuseConnector to the pipeline with the name "tracer". Each run of the pipeline produces one trace viewable on the Langfuse website with a specific URL. The trace captures the entire execution context, including the prompts, completions, and metadata.
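For reference, the dictionary returned by `pipe.run(...)` above has roughly the following shape. This is an illustrative sketch with made-up values, not actual output; in practice the replies are ChatMessage objects, and keys beyond `replies` and `trace_url` may vary by version:

```python
# Illustrative shape of the pipeline output -- values are made up for this sketch
response = {
    "llm": {
        "replies": ["Berlin ist die Hauptstadt Deutschlands ..."],
    },
    "tracer": {
        "trace_url": "https://cloud.langfuse.com/trace/<trace-id>",
    },
}

print(response["llm"]["replies"][0])
print(response["tracer"]["trace_url"])
```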

Trace Visualization

Langfuse provides a user-friendly interface to visualize and analyze the traces generated by your Haystack pipeline. Log in to your Langfuse account and navigate to the trace URL to view the trace details.

Contributing

hatch is the best way to interact with this project. To install it, run:

pip install hatch

With hatch installed, run all the tests:

hatch run test

Run the linters ruff and mypy:

hatch run lint:all

License

langfuse-haystack is distributed under the terms of the Apache-2.0 license.
