
OpenInference Groq Instrumentation

Python auto-instrumentation library for the Groq package

This package implements OpenInference tracing for both Groq and AsyncGroq clients.

These traces are fully OpenTelemetry compatible and can be sent to an OpenTelemetry collector for viewing, such as Arize Phoenix.

Installation

pip install openinference-instrumentation-groq

Quickstart

Install the required packages from your terminal.

pip install openinference-instrumentation-groq groq arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp

You can start Phoenix with the following terminal command:

python -m phoenix.server.main serve

By default, Phoenix listens on http://localhost:6006. You can visit the app via a browser at the same address. (Phoenix does not send data over the internet. It only operates locally on your machine.)
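Before wiring up the instrumentor, it can be handy to confirm that Phoenix is actually reachable. A minimal sketch using only the standard library (the `phoenix_is_up` helper and the default `localhost:6006` address are assumptions, not part of the Phoenix API):

```python
# Minimal reachability check for a locally running Phoenix server.
# phoenix_is_up is a hypothetical helper; Phoenix serves its UI on
# http://localhost:6006 by default.
from urllib.request import urlopen
from urllib.error import URLError


def phoenix_is_up(url: str = "http://localhost:6006", timeout: float = 5.0) -> bool:
    """Return True if something responds with HTTP 200 at the given URL."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (URLError, OSError):
        # Connection refused / timed out: Phoenix is not up (or not here).
        return False
```

If this returns False, start Phoenix with the serve command above before sending traces.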

Try the following code in a Python file.

  1. Set up the GroqInstrumentor to trace your application and send the traces to Phoenix.
  2. Then, set your Groq API key as an environment variable.
  3. Lastly, create a Groq client, make a request, then go see your results in Phoenix at http://localhost:6006!
import os
from groq import Groq
from openinference.instrumentation.groq import GroqInstrumentor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

# Configure GroqInstrumentor with Phoenix endpoint
endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))

GroqInstrumentor().instrument(tracer_provider=tracer_provider)

os.environ["GROQ_API_KEY"] = "YOUR_KEY_HERE"

client = Groq()

chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Explain the importance of low latency LLMs",
        }
    ],
    model="llama3-8b-8192",
)

if __name__ == "__main__":
    print(chat_completion.choices[0].message.content)

Now, in the Phoenix UI in your browser, you should see the traces from your Groq application. Click on a trace, and the "Attributes" tab will provide in-depth information about the execution!

Download files

Download the file for your platform.

Source Distribution

openinference_instrumentation_groq-0.1.12.tar.gz (11.8 kB)

SHA256: 9294eb92b785c6e19d18e151f09c6a048b15387b79a49f9fe32f6b61b90be6f9
MD5: 87bfc1f442ffa048220057621208872b
BLAKE2b-256: fe0a50fd4223782c762638e855a0dc882b88c064bff4d31c5718398519e28526

Built Distribution

openinference_instrumentation_groq-0.1.12-py3-none-any.whl

SHA256: 7141ea4d3b4137475c0ba6166c9c4442970c19efc7a8a24befd951ff23fde51d
MD5: 7ff50d640b7b467cd31c138078938161
BLAKE2b-256: d451596b27372c71d936bb43bfedd89dcbe278f8ccb8e5f1e4dc22cafe067009
