OpenTelemetry instrumentation for LiteLLM

Project description

LiteLLM OpenTelemetry Integration

Overview

This integration adds OpenTelemetry support to the LiteLLM framework, enabling tracing and monitoring of applications built with LiteLLM.

Installation

  1. Install traceAI LiteLLM
pip install traceAI-litellm

Set Environment Variables

Set up your environment variables to authenticate with FutureAGI:

import os

os.environ["FI_API_KEY"] = "your-fi-api-key"
os.environ["FI_SECRET_KEY"] = "your-fi-secret-key"
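If you prefer to keep credentials out of your code, the same variables can be exported from the shell before starting your application (placeholder values shown):

```shell
# Replace the placeholders with your actual FutureAGI credentials.
export FI_API_KEY="your-fi-api-key"
export FI_SECRET_KEY="your-fi-secret-key"
```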

Quickstart

Register Tracer Provider

Set up the trace provider to establish the observability pipeline. The trace provider manages how spans are created and exported from your application.

from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType

trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="litellm_app"
)

Configure LiteLLM Instrumentation

Instrument LiteLLM to enable telemetry collection. This step ensures that all calls made through the LiteLLM SDK are traced and monitored.

from traceai_litellm import LiteLLMInstrumentor

LiteLLMInstrumentor().instrument(tracer_provider=trace_provider)
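Under the hood, OpenTelemetry instrumentors like this one typically work by wrapping the SDK's entry points so that each call is recorded as a span. The following stdlib-only sketch illustrates that wrapping pattern; the function names and span structure are illustrative, not the actual traceai_litellm internals:

```python
import functools

# Toy stand-in for an SDK call such as litellm.completion (illustration only).
def completion(model, messages):
    return {"model": model, "reply": "ok"}

recorded_spans = []  # stands in for a span exporter

def instrument(fn):
    """Wrap a function so every call records a span-like dict."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        span = {"name": fn.__name__, "attributes": dict(kwargs)}
        try:
            return fn(*args, **kwargs)
        finally:
            recorded_spans.append(span)  # record even if the call raises
    return wrapper

# "Instrumenting" replaces the SDK function with the wrapped version.
completion = instrument(completion)
result = completion(model="gpt-3.5-turbo",
                    messages=[{"role": "user", "content": "hi"}])
```

Because the wrapper replaces the SDK function in place, application code keeps calling `completion(...)` unchanged while every invocation is captured, which is why no further code changes are needed after `instrument()`.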

Create LiteLLM Components

Use LiteLLM as usual; once instrumented, the calls below are traced automatically.

import asyncio
import litellm

async def run_examples():
    # Simple single message completion call
    litellm.completion(
        model="gpt-3.5-turbo",
        messages=[{"content": "What's the capital of China?", "role": "user"}],
    )

    # Multiple message conversation completion call with added param
    litellm.completion(
        model="gpt-3.5-turbo",
        messages=[
            {"content": "Hello, I want to bake a cake", "role": "user"},
            {
                "content": "Hello, I can pull up some recipes for cakes.",
                "role": "assistant",
            },
            {"content": "No actually I want to make a pie", "role": "user"},
        ],
        temperature=0.7,
    )

    # Multiple message conversation acompletion call with added params
    await litellm.acompletion(
        model="gpt-3.5-turbo",
        messages=[
            {"content": "Hello, I want to bake a cake", "role": "user"},
            {
                "content": "Hello, I can pull up some recipes for cakes.",
                "role": "assistant",
            },
            {"content": "No actually I want to make a pie", "role": "user"},
        ],
        temperature=0.7,
        max_tokens=20,
    )

    # Completion with retries
    litellm.completion_with_retries(
        model="gpt-3.5-turbo",
        messages=[{"content": "What's the highest grossing film ever", "role": "user"}],
    )

    # Embedding call
    litellm.embedding(
        model="text-embedding-ada-002", input=["good morning from litellm"]
    )

    # Asynchronous embedding call
    await litellm.aembedding(
        model="text-embedding-ada-002", input=["good morning from litellm"]
    )

    # Image generation call
    litellm.image_generation(model="dall-e-2", prompt="cute baby otter")

    # Asynchronous image generation call
    await litellm.aimage_generation(model="dall-e-2", prompt="cute baby otter")

asyncio.run(run_examples())

