
OpenInference Groq Instrumentation

Python auto-instrumentation library for the Groq package

This package implements OpenInference tracing for both Groq and AsyncGroq clients.

These traces are fully OpenTelemetry-compatible and can be sent to an OpenTelemetry collector for viewing, such as Arize Phoenix.

Installation

pip install openinference-instrumentation-groq

Quickstart

Through your terminal, install required packages.

pip install openinference-instrumentation-groq groq arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp

You can start Phoenix with the following terminal command:

python -m phoenix.server.main serve

By default, Phoenix listens on http://localhost:6006. You can visit the app via a browser at the same address. (Phoenix does not send data over the internet. It only operates locally on your machine.)

Try the following code in a Python file.

  1. Set up GroqInstrumentor to trace your application and send the traces to Phoenix.
  2. Set your Groq API key as an environment variable.
  3. Create a Groq client, make a request, then view your results in Phoenix at http://localhost:6006.
import os
from groq import Groq
from openinference.instrumentation.groq import GroqInstrumentor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

# Configure an OpenTelemetry tracer provider that exports spans to Phoenix
endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))

GroqInstrumentor().instrument(tracer_provider=tracer_provider)

os.environ["GROQ_API_KEY"] = "YOUR_KEY_HERE"

client = Groq()

chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Explain the importance of low latency LLMs",
        }
    ],
    model="llama3-8b-8192",
)

print(chat_completion.choices[0].message.content)

Now, in the Phoenix UI in your browser, you should see the traces from your Groq application. Click on a trace; the "Attributes" tab provides in-depth information about the execution.



