OpenInference Guardrails Instrumentation
Python auto-instrumentation library for LLM applications implemented with Guardrails. The traces emitted are fully OpenTelemetry-compatible and can be sent to an OpenTelemetry collector for monitoring, such as arize-phoenix.
Installation
pip install openinference-instrumentation-guardrails
Quickstart
This quickstart shows you how to instrument your guardrailed LLM application.
Install required packages.
pip install openinference-instrumentation-guardrails guardrails-ai arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp
Start Phoenix in the background as a collector. By default, it listens on http://localhost:6006. You can visit the app via a browser at the same address. (Phoenix does not send data over the internet. It only operates locally on your machine.)
python -m phoenix.server.main serve
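If you prefer to start Phoenix from Python rather than the command line, the arize-phoenix package also exposes launch_app; this is a sketch of that alternative, not a required step:
import phoenix as px

# Starts the Phoenix app from within the current Python process; by default it
# serves on http://localhost:6006, the same address as the command above.
session = px.launch_app()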
Install the TwoWords validator that's used in the Guard.
guardrails hub install hub://guardrails/two_words
Set up GuardrailsInstrumentor to trace your guardrails application and send the traces to Phoenix at the endpoint defined below.
import os

from openinference.instrumentation.guardrails import GuardrailsInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

os.environ["OPENAI_API_KEY"] = "YOUR_KEY_HERE"

# Configure an OpenTelemetry tracer provider that exports spans to Phoenix.
endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))
trace_api.set_tracer_provider(tracer_provider)

# Enable auto-instrumentation for Guardrails calls.
GuardrailsInstrumentor().instrument()
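Should you need to turn tracing off again (for example, between test runs), the instrumentor can be disabled; this sketch assumes the standard uninstrument() method that OpenTelemetry instrumentors inherit from BaseInstrumentor:
# Detach the Guardrails instrumentation so no further spans are emitted.
GuardrailsInstrumentor().uninstrument()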
Set up a simple example of an LLM call using a Guard.
from guardrails import Guard
from guardrails.hub import TwoWords
import openai

guard = Guard().use(
    TwoWords(),
)

response = guard(
    llm_api=openai.chat.completions.create,
    prompt="What is another name for America?",
    model="gpt-3.5-turbo",
    max_tokens=1024,
)
print(response)
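The call returns a Guardrails ValidationOutcome rather than a raw string; as a sketch (attribute names taken from guardrails-ai, not from this instrumentation library), you can check whether the TwoWords validator passed and what the validated output was:
# Inspect the outcome of the guarded call.
print(response.validation_passed)  # True if the TwoWords validator passed
print(response.validated_output)   # the output after validation
The trace for the guard call should then appear in Phoenix at http://localhost:6006.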
More Info
Download files
Source Distribution
File details
Details for the file openinference_instrumentation_guardrails-0.1.2.tar.gz.
File metadata
- Download URL: openinference_instrumentation_guardrails-0.1.2.tar.gz
- Upload date:
- Size: 9.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 38761d7cfd45c541c01be97933f3a484fa33949a69a0f191b63c3a588b184e4d
MD5 | d53fad34e9cfc1496243ed3f2e454b9f
BLAKE2b-256 | 268816a8ac7bd95637483eb5b5a31055ee2e698cd0b47c16a2f43e6fb55af345
Built Distribution
File details
Details for the file openinference_instrumentation_guardrails-0.1.2-py3-none-any.whl.
File metadata
- Download URL: openinference_instrumentation_guardrails-0.1.2-py3-none-any.whl
- Upload date:
- Size: 10.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 3d062bd34c38d8c5e707b0cbc9fb8b7f55b82fd67766d704b5ba87129972db93
MD5 | 0c24b142383148f8f6fb97df87c9d945
BLAKE2b-256 | 81d41419381fea76f4e00bafa27a636c714ccd75165b00ad1a72ef2b50fb36f9