OpenInference DSPy Instrumentation

Python auto-instrumentation library for DSPy.

These traces are fully OpenTelemetry-compatible and can be sent to an OpenTelemetry collector for viewing, such as arize-phoenix.

Installation

pip install openinference-instrumentation-dspy

Quickstart

This quickstart shows you how to instrument your DSPy application. It is adapted from the DSPy quickstart.

Install required packages.

pip install openinference-instrumentation-dspy dspy-ai arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp

Start Phoenix in the background as a collector. By default, it listens on http://localhost:6006. You can visit the app via a browser at the same address. (Phoenix does not send data over the internet. It only operates locally on your machine.)

python -m phoenix.server.main serve

Set up DSPyInstrumentor to trace your DSPy application and send the traces to Phoenix at the endpoint defined below.

from openinference.instrumentation.dspy import DSPyInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
trace_api.set_tracer_provider(tracer_provider)
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))

DSPyInstrumentor().instrument()

Import dspy and configure your language model.

import dspy
from dspy.datasets.gsm8k import GSM8K, gsm8k_metric

turbo = dspy.OpenAI(model='gpt-3.5-turbo-instruct', max_tokens=250)
dspy.settings.configure(lm=turbo)
gsm8k = GSM8K()
gsm8k_trainset, gsm8k_devset = gsm8k.train[:10], gsm8k.dev[:10]

Define a custom program that uses the ChainOfThought module to perform step-by-step reasoning when generating answers.

class CoT(dspy.Module):
    def __init__(self):
        super().__init__()
        self.prog = dspy.ChainOfThought("question -> answer")
    
    def forward(self, question):
        return self.prog(question=question)
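
The string "question -> answer" passed to ChainOfThought is a DSPy signature: the names to the left of the arrow become input fields and the names to the right become output fields. As a rough illustration of that idea only (this is not DSPy's actual parser), the mapping can be sketched in plain Python:

```python
# Simplified sketch: split a signature string like "question -> answer"
# into input and output field names. Illustrative only, not DSPy internals.
def parse_signature(sig: str):
    left, right = (part.strip() for part in sig.split("->"))
    inputs = [field.strip() for field in left.split(",")]
    outputs = [field.strip() for field in right.split(",")]
    return inputs, outputs

print(parse_signature("question -> answer"))            # (['question'], ['answer'])
print(parse_signature("question -> reasoning, answer")) # (['question'], ['reasoning', 'answer'])
```

In the real library, ChainOfThought additionally injects a reasoning step before the output field, which is what produces the step-by-step traces you will see in Phoenix.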

Optimize your program using the BootstrapFewShot teleprompter.

from dspy.teleprompt import BootstrapFewShot

config = dict(max_bootstrapped_demos=4, max_labeled_demos=4)
teleprompter = BootstrapFewShot(metric=gsm8k_metric, **config)
optimized_cot = teleprompter.compile(CoT(), trainset=gsm8k_trainset, valset=gsm8k_devset)
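
The metric passed to the teleprompter is simply a function that scores a prediction against a gold example; demos that score well are kept as few-shot examples. As a simplified illustration only (the real gsm8k_metric parses and compares numeric answers, and DSPy metrics receive example and prediction objects), the core idea looks like this:

```python
# Hypothetical, simplified metric: score 1 if the normalized answer strings
# match, else 0. Not the actual gsm8k_metric from dspy.
def exact_match_metric(gold_answer: str, predicted_answer: str) -> bool:
    return gold_answer.strip().lower() == predicted_answer.strip().lower()

print(exact_match_metric("42", " 42 "))  # True
print(exact_match_metric("42", "41"))    # False
```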

Evaluate performance on the dev dataset.

from dspy.evaluate import Evaluate

evaluate = Evaluate(devset=gsm8k_devset, metric=gsm8k_metric, num_threads=4, display_progress=True, display_table=0)
evaluate(optimized_cot)

Visit the Phoenix app at http://localhost:6006 to see your traces.

More Info

More details about tracing with OpenInference and Phoenix can be found in the Phoenix docs.

For AI/ML observability solutions in production, including a cloud-based trace collector, visit Arize.
