OpenTelemetry instrumentation for Groq
Groq OpenTelemetry Integration
Overview
This integration provides OpenTelemetry support for the Groq SDK. It enables tracing and monitoring of applications built with Groq.
Installation
- Install traceAI Groq
pip install traceAI-groq
Set Environment Variables
Set up your environment variables to authenticate with FutureAGI.
import os
os.environ["FI_API_KEY"] = FI_API_KEY
os.environ["FI_SECRET_KEY"] = FI_SECRET_KEY
os.environ["GROQ_API_KEY"] = GROQ_API_KEY
Quickstart
Register Tracer Provider
Set up the trace provider to establish the observability pipeline for your application.

from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType

trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="groq_app"
)
Configure Groq Instrumentation
Instrument the Groq client to enable telemetry collection. This step ensures that all interactions with the Groq SDK are tracked and monitored.
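A minimal sketch of this step, assuming traceai_groq exposes a GroqInstrumentor (the common pattern for traceAI instrumentation packages) that attaches to the tracer provider registered above:

from traceai_groq import GroqInstrumentor

# Hook the Groq SDK into the tracer provider registered above.
# GroqInstrumentor is assumed here based on the usual traceAI naming pattern.
GroqInstrumentor().instrument(tracer_provider=trace_provider)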
import os

from groq import Groq
from groq.types.chat import ChatCompletionToolMessageParam


def test():
    client = Groq(
        api_key=os.environ.get("GROQ_API_KEY"),
    )

    # Tool definition the model can call to look up the weather.
    weather_function = {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "finds the weather for a given city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {
                        "type": "string",
                        "description": "The city to find the weather for, e.g. 'London'",
                    }
                },
                "required": ["city"],
            },
        },
    }

    sys_prompt = "Respond to the user's query using the correct tool."
    user_msg = "What's the weather like in San Francisco?"
    messages = [
        {"role": "system", "content": sys_prompt},
        {"role": "user", "content": user_msg},
    ]

    # First call: the model is forced to answer with a tool call.
    response = client.chat.completions.create(
        model="mixtral-8x7b-32768",
        messages=messages,
        temperature=0.0,
        tools=[weather_function],
        tool_choice="required",
    )
    message = response.choices[0].message
    assert (tool_calls := message.tool_calls)
    tool_call_id = tool_calls[0].id

    # Append the assistant's tool call and a tool result, then request the final answer.
    messages.append(message)
    messages.append(
        ChatCompletionToolMessageParam(
            content="sunny", role="tool", tool_call_id=tool_call_id
        ),
    )
    final_response = client.chat.completions.create(
        model="mixtral-8x7b-32768",
        messages=messages,
    )
    return final_response


if __name__ == "__main__":
    response = test()
    print("Response\n")
    print(response)
Download files
Source Distribution
traceai_groq-0.1.7.tar.gz (9.7 kB)

Built Distribution
traceai_groq-0.1.7-py3-none-any.whl (12.3 kB)
File details
Details for the file traceai_groq-0.1.7.tar.gz.
File metadata
- Download URL: traceai_groq-0.1.7.tar.gz
- Upload date:
- Size: 9.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.0.0 CPython/3.13.0 Darwin/24.1.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 6b150724312e543f0135a4f3032623a5842536d246f8877fd11fe85ed8f233de
MD5 | 0c113705a76a8e1eefe06ad6242b4b17
BLAKE2b-256 | 73b6e9c2e0e81a9fa18132634265f9e9b946f09eb4aee3c054d0f2d8a3655b58
File details
Details for the file traceai_groq-0.1.7-py3-none-any.whl.
File metadata
- Download URL: traceai_groq-0.1.7-py3-none-any.whl
- Upload date:
- Size: 12.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.0.0 CPython/3.13.0 Darwin/24.1.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 3ff52d206d370328aa5b6b29f060001bbfdfdb5631a08af45c8a8c23208165a1
MD5 | eee3885f843f741328556bf1e494bedc
BLAKE2b-256 | eaa7620232babb84026f76ced3c5499c46534b9f3c062697ba766e83feec10e7