OpenTelemetry instrumentation for LiteLLM

LiteLLM OpenTelemetry Integration

Overview

This integration provides support for using OpenTelemetry with the LiteLLM framework. It enables tracing and monitoring of applications built with LiteLLM.

Installation

  1. Install traceAI LiteLLM
pip install traceAI-litellm

Set Environment Variables

Set up your environment variables to authenticate with FutureAGI.

import os

os.environ["FI_API_KEY"] = "your-fi-api-key"
os.environ["FI_SECRET_KEY"] = "your-fi-secret-key"
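
The values above are placeholders for your actual FutureAGI credentials. A small guard such as the following (a sketch, not part of the SDK) can fail fast before any tracing is set up if either variable is missing:

```python
import os

# Hypothetical placeholder values; substitute your real FutureAGI credentials.
os.environ.setdefault("FI_API_KEY", "your-fi-api-key")
os.environ.setdefault("FI_SECRET_KEY", "your-fi-secret-key")

# Fail fast if either credential is absent or empty.
missing = [k for k in ("FI_API_KEY", "FI_SECRET_KEY") if not os.environ.get(k)]
if missing:
    raise RuntimeError(f"Missing FutureAGI credentials: {', '.join(missing)}")
```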

Quickstart

Register Tracer Provider

Set up the trace provider to establish the observability pipeline. The trace provider manages how spans are collected and exported to FutureAGI.

from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType

trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="litellm_app"
)

Configure LiteLLM Instrumentation

Instrument the LiteLLM client to enable telemetry collection. This step ensures that all interactions with the LiteLLM SDK are tracked and monitored.

from traceai_litellm import LiteLLMInstrumentor

LiteLLMInstrumentor().instrument(tracer_provider=trace_provider)
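
Conceptually, instrumenting means wrapping the LiteLLM entry points so that every call emits a span. The toy sketch below illustrates the wrapping pattern in plain Python with no OpenTelemetry dependency; `completion` here is a stand-in function, not the real `litellm.completion`:

```python
import functools

def instrument(fn, spans):
    """Toy sketch: wrap a callable so each call is recorded as a 'span'."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        # A real instrumentor would open an OpenTelemetry span here instead.
        spans.append({"name": fn.__name__, "kwargs": sorted(kwargs)})
        return fn(*args, **kwargs)
    return wrapper

def completion(model, messages):
    """Stand-in for litellm.completion; returns a summary instead of calling an LLM."""
    return {"model": model, "n_messages": len(messages)}

spans = []
completion = instrument(completion, spans)
result = completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hi"}],
)
```

After wrapping, callers use `completion` exactly as before; the telemetry is a side effect, which is why a single `instrument(...)` call at startup is enough.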

Create LiteLLM Components

Set up your LiteLLM client with built-in observability.

import asyncio
import litellm

async def run_examples():
    # Simple single message completion call
    litellm.completion(
        model="gpt-3.5-turbo",
        messages=[{"content": "What's the capital of China?", "role": "user"}],
    )

    # Multiple message conversation completion call with added param
    litellm.completion(
        model="gpt-3.5-turbo",
        messages=[
            {"content": "Hello, I want to bake a cake", "role": "user"},
            {
                "content": "Hello, I can pull up some recipes for cakes.",
                "role": "assistant",
            },
            {"content": "No actually I want to make a pie", "role": "user"},
        ],
        temperature=0.7,
    )

    # Multiple message conversation acompletion call with added params
    await litellm.acompletion(
        model="gpt-3.5-turbo",
        messages=[
            {"content": "Hello, I want to bake a cake", "role": "user"},
            {
                "content": "Hello, I can pull up some recipes for cakes.",
                "role": "assistant",
            },
            {"content": "No actually I want to make a pie", "role": "user"},
        ],
        temperature=0.7,
        max_tokens=20,
    )

    # Completion with retries
    litellm.completion_with_retries(
        model="gpt-3.5-turbo",
        messages=[{"content": "What's the highest grossing film ever?", "role": "user"}],
    )

    # Embedding call
    litellm.embedding(
        model="text-embedding-ada-002", input=["good morning from litellm"]
    )

    # Asynchronous embedding call
    await litellm.aembedding(
        model="text-embedding-ada-002", input=["good morning from litellm"]
    )

    # Image generation call
    litellm.image_generation(model="dall-e-2", prompt="cute baby otter")

    # Asynchronous image generation call
    await litellm.aimage_generation(model="dall-e-2", prompt="cute baby otter")

asyncio.run(run_examples())
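
The example above discards the return values, but the completion calls return OpenAI-style response objects. Assuming the standard LiteLLM response shape, the generated text can be read as shown below; a minimal stand-in object is used here instead of a live API call:

```python
from types import SimpleNamespace

# Stand-in for the object litellm.completion returns (OpenAI-style shape).
response = SimpleNamespace(
    choices=[SimpleNamespace(message=SimpleNamespace(content="Beijing"))]
)

# With a real call this would be:
#   response = litellm.completion(model="gpt-3.5-turbo", messages=[...])
text = response.choices[0].message.content
```

Whether or not you inspect the response, the instrumentor records the call, so spans appear in FutureAGI either way.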
