OpenInference Agno Instrumentation

Python auto-instrumentation library for Agno Agents

This instrumentation is fully OpenTelemetry-compatible; the traces it produces can be sent to any OpenTelemetry collector for monitoring, such as arize-phoenix or Langfuse.

Installation

pip install openinference-instrumentation-agno

Quickstart

This quickstart shows you how to instrument your Agno Agent application.

Assuming you've already installed openinference-instrumentation-agno, next install agno, Phoenix, and the OpenTelemetry packages used to export traces to it:

pip install agno arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-distro

Start the Phoenix app in the background as a collector:

phoenix serve

By default, it listens on http://localhost:6006; open that address in a browser to view the app.

The Phoenix app does not send data over the internet. It only operates locally on your machine.
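Before running the agent, you may want to confirm the collector is actually up; otherwise spans are silently dropped (or, with SimpleSpanProcessor, each span blocks on a failed export). The helper below is a standard-library sketch, not part of the openinference package; the URL assumes Phoenix's default address.

```python
# Quick reachability check for the local Phoenix collector (stdlib only).
from urllib.request import urlopen
from urllib.error import URLError


def collector_is_up(url: str = "http://localhost:6006", timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at `url` within `timeout` seconds."""
    try:
        with urlopen(url, timeout=timeout):
            return True
    except (URLError, OSError):
        return False


if __name__ == "__main__":
    print("Phoenix reachable:", collector_is_up())
```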

Create a simple Agno agent:

from agno.agent import Agent
from agno.models.openai import OpenAIChat
from agno.tools.duckduckgo import DuckDuckGoTools

from openinference.instrumentation.agno import AgnoInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Export spans to the local Phoenix collector over OTLP/HTTP.
endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))
# Optionally, also print the spans to the console.
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))

trace_api.set_tracer_provider(tracer_provider=tracer_provider)

# Start instrumenting agno
AgnoInstrumentor().instrument()


agent = Agent(
    model=OpenAIChat(id="gpt-4o-mini"),
    tools=[DuckDuckGoTools()],
    markdown=True,
    debug_mode=True,
)

agent.print_response("What is currently trending on Twitter?")

Save the code above as example.py and run it:

python example.py

Finally, browse for your trace in Phoenix at http://localhost:6006!

