# R4U Python SDK

An observability SDK for LLM applications that automatically traces and monitors your AI interactions.
## Installation

```bash
pip install r4u
```

For OpenAI integration (quotes keep the extras marker safe in shells like zsh):

```bash
pip install "r4u[openai]"
```

For LangChain integration:

```bash
pip install "r4u[langchain]"
```
## Quick Start

### OpenAI Integration

```python
from openai import OpenAI
from r4u.tracing.openai import wrap_openai

# Initialize your OpenAI client
client = OpenAI(api_key="your-api-key")

# Wrap it with R4U observability
traced_client = wrap_openai(client, api_url="http://localhost:8000")

# Use it normally - traces are created automatically
response = traced_client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello, world!"}],
)
```
### LangChain Integration

```python
from langchain_openai import ChatOpenAI
from r4u.tracing.langchain import wrap_langchain

# Create the R4U callback handler
r4u_handler = wrap_langchain(api_url="http://localhost:8000")

# Attach it to your LangChain model
llm = ChatOpenAI(model="gpt-3.5-turbo", callbacks=[r4u_handler])

# Use it normally - traces are created automatically
response = llm.invoke("Hello, world!")
```
The LangChain integration uses callback handlers to capture all LLM calls, including:

- Message history: every message in a conversation is captured automatically
- Tool/function calls: tool definitions and invocations are tracked
- Agents: multi-step agent executions are fully traced
- Chains: works seamlessly with LangChain chains and runnables

For more details, see `docs/LANGCHAIN_INTEGRATION.md`. Runnable examples live in `examples/basic_langchain.py` and `examples/advanced_langchain.py`.
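To make the callback mechanism concrete, here is a framework-free sketch of what such a handler does: it pairs start/end lifecycle hooks into one trace record per call. The class and method names are modeled loosely on LangChain's `BaseCallbackHandler` hooks and are illustrative only, not R4U's actual implementation:

```python
import time

class RecordingHandler:
    """Collect one trace record per LLM call via start/end lifecycle hooks."""

    def __init__(self):
        self.traces = []
        self._pending = {}

    def on_llm_start(self, prompts, run_id):
        # Called when an LLM call begins: remember its inputs and start time.
        self._pending[run_id] = {"prompts": prompts, "started_at": time.time()}

    def on_llm_end(self, response, run_id):
        # Called when the call completes: finalize and store the record.
        record = self._pending.pop(run_id)
        record["response"] = response
        record["completed_at"] = time.time()
        self.traces.append(record)

handler = RecordingHandler()
handler.on_llm_start(["Hello, world!"], run_id="run-1")
handler.on_llm_end("Hi there!", run_id="run-1")
# handler.traces now holds one complete record for run-1
```

Keying pending records by `run_id` is what lets a single handler track many concurrent or nested LLM calls, which is how one callback object can trace whole agent runs.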
## Manual Tracing

```python
from r4u.client import get_r4u_client

client = get_r4u_client()

# Create a trace manually. Note: create_trace is a coroutine, so this
# must run inside an async function (or via asyncio.run).
trace = await client.create_trace(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "Hello"},
        {"role": "assistant", "content": "Hi there!"},
    ],
    result="Hi there!",
    started_at="2024-01-01T00:00:00Z",
    completed_at="2024-01-01T00:00:01Z",
)
```
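The `started_at`/`completed_at` values are ISO-8601 UTC timestamps. One way to produce them with the standard library alone (this helper is not part of the R4U API, just a convenience sketch):

```python
from datetime import datetime, timezone

def utc_now_iso() -> str:
    """Return the current UTC time as an ISO-8601 string with a 'Z' suffix."""
    return datetime.now(timezone.utc).isoformat().replace("+00:00", "Z")

started_at = utc_now_iso()
# ... perform the LLM call here ...
completed_at = utc_now_iso()
```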
For a complete walk-through that includes tool definitions, tool-call messages, and the final tool-assisted response, see `examples/tool_calls_example.py`. The example issues real OpenAI Chat Completions requests (it requires `OPENAI_API_KEY`) and runs the multi-turn loop that fulfills the tool invocation before requesting the final assistant answer.
## Features

- Automatic LLM Tracing: wrap your existing LLM clients to create traces automatically
- Call Path Tracking: captures where each LLM call originates in your code
- OpenAI & LangChain Integrations: automatic tracing for OpenAI SDK calls and LangChain runnables via callbacks
- Async Support: full async/await support
- Error Tracking: automatic error capture and reporting
- Minimal Overhead: lightweight wrapper with minimal performance impact
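The error-tracking and minimal-overhead claims both come down to the same wrapper pattern: intercept the call, record duration and any exception, then return or re-raise unchanged. A simplified, standalone sketch of that pattern (illustrative only; `wrap_openai` is the real entry point):

```python
import functools
import time

def traced(fn):
    """Wrap a callable, recording its duration and any raised exception."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        record = {"name": fn.__name__, "error": None}
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        except Exception as exc:
            record["error"] = repr(exc)  # automatic error capture
            raise  # the caller still sees the original exception
        finally:
            record["duration_s"] = time.perf_counter() - start
            # A real SDK would ship this record to the trace backend here.
    return wrapper

@traced
def flaky(x):
    if x < 0:
        raise ValueError("negative input")
    return x * 2

flaky(21)  # returns 42; a failing call would be recorded, then re-raised
```

Because the wrapper only adds a timer and a dictionary per call, the overhead is dominated by the LLM request itself.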
## Call Path Tracking

The SDK automatically records where each LLM call originates in your codebase:

```python
# In src/app/chatbot.py
def process_query(user_input):
    response = traced_client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": user_input}],
    )
    return response

# The trace will include: path="src/app/chatbot.py::process_query->create"
```

For nested function calls, the full call chain is captured:

```python
# The trace will show: "src/main.py::main->handle_request->process_query->create"
def main():
    handle_request()

def handle_request():
    process_query("Hello")

def process_query(text):
    traced_client.chat.completions.create(...)
```
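A `file::outer->inner` path like the ones above can be derived by walking Python's call stack with the stdlib `inspect` module. This is a sketch of the general idea, not R4U's actual implementation:

```python
import inspect

def call_path(skip: int = 1, max_depth: int = 8) -> str:
    """Build a 'file::outer->inner' path from the current call stack."""
    frames = inspect.stack()[skip:skip + max_depth]
    # frames[0] is the innermost caller; join function names outermost-first,
    # dropping module-level frames.
    names = [f.function for f in reversed(frames) if f.function != "<module>"]
    return f"{frames[0].filename}::{'->'.join(names)}"

def handle_request():
    return process_query()

def process_query():
    return call_path()

path = handle_request()  # ends with "::...handle_request->process_query"
```

In a real wrapper, `call_path` would be invoked inside the traced `create` call, so the innermost segment of the recorded path is the LLM call site itself.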
## Development

See `DEVELOPMENT.md` for development setup and guidelines.