
AgentMetrics Python SDK


Real-time cost, latency, and error tracking for AI agents. One decorator. Zero overhead on failure.

from agentmetrics import sentinel

sentinel.configure(api_key="am_xxxxxxxxxxxxxxxx")

@sentinel.track(agent_id="customer_support")
def my_agent(task: str) -> str:
    return call_llm(task)

Every call is now tracked — duration, status, errors — visible in your AgentMetrics dashboard.
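
Conceptually, a tracking decorator of this kind measures wall-clock duration, records the outcome, and lets the result (or exception) pass through untouched. The sketch below is a hypothetical illustration of that pattern in pure Python, not the agentmetrics implementation; a real SDK would queue the record for delivery rather than stash it on the function.

```python
import functools
import time

def track(agent_id):
    """Illustrative sketch of a tracking decorator (not the real SDK)."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            status, error = "ok", None
            try:
                return fn(*args, **kwargs)
            except Exception as exc:
                status, error = "error", type(exc).__name__
                raise  # the caller still sees the failure
            finally:
                duration_ms = (time.perf_counter() - start) * 1000
                # A real SDK would enqueue this for background delivery;
                # here we just attach it to the function for inspection.
                wrapper.last_record = {
                    "agent_id": agent_id,
                    "status": status,
                    "error": error,
                    "duration_ms": duration_ms,
                }
        return wrapper
    return decorator
```

Note the `raise` inside `except`: the metric is recorded, but the exception propagates exactly as it would without the decorator.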


Prerequisites

You need a running AgentMetrics instance, either the hosted cloud service or a self-hosted deployment.

After setup, get your SDK key from the dashboard: Settings → SDK Keys.


Install

pip install agentmetrics

Quick start

import os
from agentmetrics import sentinel

# Configure once at startup
sentinel.configure(api_key=os.environ["AGENTMETRICS_KEY"])

@sentinel.track(agent_id="customer_support")
def handle_ticket(ticket: str) -> str:
    response = openai_client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": ticket}],
    )
    return response.choices[0].message.content

Framework examples

OpenAI

import os
from openai import OpenAI
from agentmetrics import sentinel

sentinel.configure(api_key=os.environ["AGENTMETRICS_KEY"])
client = OpenAI()

@sentinel.track(agent_id="openai_agent")
def ask(prompt: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

Anthropic

import os
from anthropic import Anthropic
from agentmetrics import sentinel

sentinel.configure(api_key=os.environ["AGENTMETRICS_KEY"])
client = Anthropic()

@sentinel.track(agent_id="claude_agent")
def ask(prompt: str) -> str:
    message = client.messages.create(
        model="claude-sonnet-4-6",
        max_tokens=1024,
        messages=[{"role": "user", "content": prompt}],
    )
    return message.content[0].text

LangChain

import os
from langchain_openai import ChatOpenAI
from agentmetrics import sentinel

sentinel.configure(api_key=os.environ["AGENTMETRICS_KEY"])

@sentinel.track(agent_id="langchain_agent")
def run_chain(question: str) -> str:
    llm = ChatOpenAI(model="gpt-4o-mini")
    return llm.invoke(question).content

LangGraph

import os
from agentmetrics import sentinel

sentinel.configure(api_key=os.environ["AGENTMETRICS_KEY"])

# `compiled_graph` is your StateGraph, compiled elsewhere via graph.compile()
@sentinel.track(agent_id="langgraph_workflow")
def run_graph(state: dict) -> dict:
    return compiled_graph.invoke(state)

CrewAI

import os
from agentmetrics import sentinel

sentinel.configure(api_key=os.environ["AGENTMETRICS_KEY"])

# `crew` is your configured Crew instance, defined elsewhere
@sentinel.track(agent_id="research_crew")
def run_crew(topic: str) -> str:
    return crew.kickoff(inputs={"topic": topic})

Async agents

@sentinel.track(agent_id="async_agent")
async def my_async_agent(task: str) -> str:
    result = await some_llm_call(task)
    return result

Sync and async work identically. No extra configuration needed.


Configuration

sentinel.configure(
    api_key="am_xxxxxxxxxxxxxxxx",        # from Settings → SDK Keys
    base_url="http://localhost:8000/v1",   # omit for cloud, set for self-hosted
)
Parameter  Default                          Description
api_key    (required)                       Your SDK key from the dashboard
base_url   https://api.agentmetrics.dev/v1  AgentMetrics API endpoint

Tip: Load from environment:

sentinel.configure(api_key=os.environ["AGENTMETRICS_KEY"])

Graceful degradation

If the AgentMetrics server is unreachable or the key is invalid, your agent keeps running normally. The SDK never raises exceptions, never blocks execution, and never adds latency to the critical path.

Events are sent fire-and-forget in a background thread with up to 3 retries.
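
A fire-and-forget pipeline like this typically pairs an in-memory queue with a daemon worker thread that retries each event a few times and then gives up silently. The sketch below illustrates that shape; `send` is a stand-in for the HTTP call a real SDK would make, and none of these names come from the agentmetrics API:

```python
import queue
import threading

def start_sender(send, max_retries=3):
    """Sketch of fire-and-forget delivery with bounded retries.

    Failures are swallowed so the caller's code path is never affected.
    """
    q = queue.Queue()

    def worker():
        while True:
            event = q.get()
            if event is None:  # shutdown sentinel
                q.task_done()
                return
            for _ in range(max_retries):
                try:
                    send(event)
                    break
                except Exception:
                    continue  # retry; give up silently after max_retries
            q.task_done()

    threading.Thread(target=worker, daemon=True).start()
    return q
```

Because the thread is a daemon, a crashing or exiting process never hangs on delivery; the trade-off is that unsent events can be lost, which is why a flush step (below) matters for short-lived processes.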


Flushing before exit

In short-lived scripts or serverless functions, call flush() before the process exits to ensure all queued events are sent:

sentinel.flush()  # waits up to 10 seconds
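
A bounded flush of this sort usually amounts to waiting for the event queue to drain, with a hard deadline so a dead server cannot stall shutdown forever. A hypothetical sketch (the real SDK's internals may differ):

```python
import queue
import time

def flush(q, timeout=10.0):
    """Sketch: wait until the queue drains or the deadline passes.

    Returns True if everything was handed off in time, False if we
    gave up and remaining events may be dropped.
    """
    deadline = time.monotonic() + timeout
    while not q.empty():
        if time.monotonic() >= deadline:
            return False
        time.sleep(0.01)  # yield to the background sender thread
    return True
```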

Self-hosted

sentinel.configure(
    api_key=os.environ["AGENTMETRICS_KEY"],
    base_url="http://your-server:8000/v1",
)

License

Apache 2.0

