
Project description

OpenInference Groq Instrumentation

Python autoinstrumentation library for the Groq package

This package implements OpenInference tracing for both Groq and AsyncGroq clients.

These traces are fully OpenTelemetry compatible and can be sent to an OpenTelemetry collector for viewing, such as Arize Phoenix.

Installation

pip install openinference-instrumentation-groq

Quickstart

In your terminal, install the required packages.

pip install openinference-instrumentation-groq groq arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp

You can start Phoenix with the following terminal command:

python -m phoenix.server.main serve

By default, Phoenix listens on http://localhost:6006. You can visit the app via a browser at the same address. (Phoenix does not send data over the internet. It only operates locally on your machine.)
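Depending on your arize-phoenix version, you may also be able to launch Phoenix directly from Python instead of the terminal. A minimal sketch, assuming the package exposes phoenix.launch_app():

import phoenix as px

# Launch a local Phoenix instance in the background of this Python process
# (assumes px.launch_app() is available in your arize-phoenix version).
session = px.launch_app()
print(session.url)  # typically http://localhost:6006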

Try the following code in a Python file.

  1. Set up GroqInstrumentor to trace your application and send the traces to Phoenix.
  2. Then, set your Groq API key as an environment variable.
  3. Lastly, create a Groq client, make a request, and view your results in Phoenix at http://localhost:6006!
import os
from groq import Groq
from openinference.instrumentation.groq import GroqInstrumentor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

# Configure an OpenTelemetry tracer provider that exports spans to the Phoenix endpoint
endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))

GroqInstrumentor().instrument(tracer_provider=tracer_provider)

os.environ["GROQ_API_KEY"] = "YOUR_KEY_HERE"

client = Groq()

chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Explain the importance of low latency LLMs",
        }
    ],
    model="llama3-8b-8192",
)

if __name__ == "__main__":
    print(chat_completion.choices[0].message.content)

Now, in the Phoenix UI in your browser, you should see the traces from your Groq application. Click on a trace; the "Attributes" tab provides in-depth information about the execution!
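The instrumentor also covers AsyncGroq, so the same setup traces asynchronous calls. A minimal sketch, assuming the tracer provider and GroqInstrumentor have already been configured as in the quickstart above:

import asyncio
import os

from groq import AsyncGroq

os.environ["GROQ_API_KEY"] = "YOUR_KEY_HERE"

async def main():
    # AsyncGroq calls are traced by the same GroqInstrumentor setup shown above.
    client = AsyncGroq()
    chat_completion = await client.chat.completions.create(
        messages=[
            {
                "role": "user",
                "content": "Explain the importance of low latency LLMs",
            }
        ],
        model="llama3-8b-8192",
    )
    print(chat_completion.choices[0].message.content)

asyncio.run(main())

If you need to turn tracing off again, GroqInstrumentor follows the standard OpenTelemetry instrumentor interface, so calling GroqInstrumentor().uninstrument() should remove the patching.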

More Info

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

Built Distribution

File details

Details for the file openinference_instrumentation_groq-0.1.4.tar.gz.

File metadata

File hashes

Hashes for openinference_instrumentation_groq-0.1.4.tar.gz
Algorithm Hash digest
SHA256 9974c818f93397218389f135a7bb8deba5cde619161db9848feda4b57f318a46
MD5 bff10fa63a0f4c6aac396a039ab35ee6
BLAKE2b-256 3c91e50b910223999fb76d8d9e4f8866f3328d098b543a2eb452e31d45c6dae4

See more details on using hashes here.

File details

Details for the file openinference_instrumentation_groq-0.1.4-py3-none-any.whl.

File metadata

File hashes

Hashes for openinference_instrumentation_groq-0.1.4-py3-none-any.whl
Algorithm Hash digest
SHA256 0d72308cb218fc74f8e6677580f33316e047247e596b07b1882e434d3b9cf0ce
MD5 b320d58395bc5cbedfa94d2efa9a9f31
BLAKE2b-256 ec81d0427e29e90d96cb916ea4cc57305a9436f582a6bca9873ed2c01ebd398d

See more details on using hashes here.
