# Visibe SDK for Python
Observability for AI agents. Track costs, performance, and errors across your entire AI stack — whether you're using CrewAI, LangChain, LangGraph, AutoGen, or direct OpenAI calls.
## 📦 Getting Started
### 1. Create an account

Sign up at app.visibe.ai and create a project.

### 2. Get an API key

In your project, go to Settings → API Keys and generate a new key. It will look like `sk_live_...`.
### 3. Install the SDK

```bash
pip install visibe
```

### 4. Set your API key

```bash
export VISIBE_API_KEY=sk_live_your_api_key_here
```

Or pass it directly in code:

```python
visibe.init(api_key="sk_live_your_api_key_here")
```

Or load it from a `.env` file using python-dotenv:

```bash
pip install python-dotenv
```

```python
from dotenv import load_dotenv

load_dotenv()  # loads VISIBE_API_KEY from .env

import visibe

visibe.init()
```
### 5. Instrument your app

```python
import visibe

visibe.init()
```

That's it. Every OpenAI, LangChain, LangGraph, CrewAI, AutoGen, and Bedrock client created after this call is automatically traced — no other code changes needed.
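Because instrumentation hooks in when a client is constructed, call `visibe.init()` before creating any clients. A minimal ordering sketch:

```python
import visibe

visibe.init()              # 1) instrument first

from openai import OpenAI

client = OpenAI()          # 2) constructed after init(), so it is traced
```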
## 🧩 Integrations
| Framework | Auto (`visibe.init()`) | Manual |
|---|---|---|
| OpenAI | ✅ | ✅ |
| LangChain | ✅ | ✅ |
| LangGraph | ✅ | ✅ |
| CrewAI | ✅ | ✅ |
| AutoGen | ✅ | ✅ |
| AWS Bedrock | ✅ | ✅ |
Also works with OpenAI-compatible providers: Azure OpenAI, Groq, Together.ai, DeepSeek, and others.
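Pointing the standard OpenAI client at a compatible endpoint is all it takes; the traced client behaves the same. A minimal sketch, assuming Groq's OpenAI-compatible endpoint (the `base_url`, model name, and `GROQ_API_KEY` variable are illustrative — check your provider's docs):

```python
import os

import visibe
from openai import OpenAI

visibe.init()

# Standard OpenAI client pointed at an OpenAI-compatible provider.
client = OpenAI(
    base_url="https://api.groq.com/openai/v1",
    api_key=os.environ["GROQ_API_KEY"],
)
response = client.chat.completions.create(
    model="llama-3.1-8b-instant",
    messages=[{"role": "user", "content": "Hello!"}],
)
```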
### OpenAI

```python
import visibe
from openai import OpenAI

visibe.init()

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)
# Automatically traced — cost, tokens, duration, and content captured.
```
### LangChain / LangGraph

```python
import visibe
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI

visibe.init()

llm = ChatOpenAI(model="gpt-4o-mini")
tools = []  # add your LangChain tools here
graph = create_react_agent(llm, tools)
result = graph.invoke({"messages": [("user", "Your prompt here")]})
# Automatically traced — agent steps, LLM calls, and tool calls captured.
```

Dynamic pipe chains (`prompt | llm | parser`) are also automatically traced. Nested sub-graphs are tracked with hierarchical agent names.
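For example, a minimal pipe chain (the prompt text and input are illustrative; `visibe.init()` is assumed to have been called as above):

```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_template("Summarize in one sentence: {text}")
llm = ChatOpenAI(model="gpt-4o-mini")
parser = StrOutputParser()

chain = prompt | llm | parser  # the pipe chain is traced automatically
summary = chain.invoke({"text": "LangChain pipe chains compose runnables."})
```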
### CrewAI

```python
import visibe
from crewai import Agent, Task, Crew

visibe.init()

architect = Agent(role="Plot Architect", goal="Design mystery plots", backstory="...")
designer = Agent(role="Character Designer", goal="Create characters", backstory="...")
narrator = Agent(role="Narrator", goal="Write the story", backstory="...")

task1 = Task(description="Create a plot outline", agent=architect, expected_output="...")
task2 = Task(description="Design characters", agent=designer, expected_output="...", context=[task1])
task3 = Task(description="Write the story", agent=narrator, expected_output="...", context=[task1, task2])

crew = Crew(agents=[architect, designer, narrator], tasks=[task1, task2, task3])
result = crew.kickoff()
# Single trace with all agents, LLM calls, and per-task cost breakdown.
```

Training and testing runs (`crew.train()`, `crew.test()`) are traced too.
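For instance (a hedged sketch on the crew from the example above; the keyword arguments follow CrewAI's documented parameters, so check your CrewAI version for the exact `train()`/`test()` signatures):

```python
# Training and evaluation runs are traced the same way as kickoff().
crew.train(n_iterations=2, filename="trained_agents.pkl")
crew.test(n_iterations=2, eval_llm="gpt-4o-mini")
```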
### AutoGen

```python
import asyncio

import visibe
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_agentchat.agents import AssistantAgent

visibe.init()

async def main():
    model_client = OpenAIChatCompletionClient(model="gpt-4o-mini")
    assistant = AssistantAgent("assistant", model_client=model_client)
    result = await assistant.run(task="Help me with this task")
    # Automatically traced.

asyncio.run(main())
```
### AWS Bedrock

```python
import visibe
import boto3

visibe.init()

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
response = bedrock.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    messages=[{"role": "user", "content": [{"text": "Hello!"}]}],
)
# Automatically traced.
```

Supports `converse`, `converse_stream`, `invoke_model`, and `invoke_model_with_response_stream`. Works with all models available via Bedrock — Claude, Nova, Llama, Mistral, and more.
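For the streaming case, the same instrumented client is used; a minimal sketch with boto3's `converse_stream` (event handling follows the standard Bedrock response shape):

```python
# Streaming with the already-instrumented client from the example above.
stream = bedrock.converse_stream(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    messages=[{"role": "user", "content": [{"text": "Hello!"}]}],
)
for event in stream["stream"]:
    # Text chunks arrive as contentBlockDelta events.
    if "contentBlockDelta" in event:
        print(event["contentBlockDelta"]["delta"].get("text", ""), end="")
```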
## ⚙️ Configuration

```python
import visibe

visibe.init(
    api_key="sk_live_abc123",            # or set VISIBE_API_KEY env var
    frameworks=["openai", "langgraph"],  # limit to specific frameworks
    content_limit=500,                   # max chars for LLM content in traces
    debug=True,                          # enable debug logging
)
```
### Environment Variables

| Variable | Description | Default |
|---|---|---|
| `VISIBE_API_KEY` | Your API key (required) | — |
| `VISIBE_API_URL` | Override API endpoint | `https://api.visibe.ai` |
| `VISIBE_AUTO_INSTRUMENT` | Comma-separated frameworks to auto-instrument | All detected |
| `VISIBE_CONTENT_LIMIT` | Max chars for LLM/tool content in spans | `1000` |
| `VISIBE_DEBUG` | Enable debug logging (`1` to enable) | `0` |
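For example, a shell setup that mirrors the table above (the key value is a placeholder):

```bash
export VISIBE_API_KEY=sk_live_your_api_key_here
export VISIBE_AUTO_INSTRUMENT=openai,langgraph  # only instrument these two frameworks
export VISIBE_CONTENT_LIMIT=500                 # truncate captured content at 500 chars
export VISIBE_DEBUG=1                           # verbose logging
```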
## 📊 What Gets Tracked
| Metric | Description |
|---|---|
| Cost | Total spend + per-agent and per-task cost breakdown |
| Tokens | Input/output tokens per LLM call |
| Duration | Total time + time per step |
| Tools | Which tools were used, duration, success/failure |
| Errors | When and where things failed |
| Spans | Full execution timeline with LLM calls, tool calls, and agent events |
## 🔧 Manual Instrumentation

For cases where you need explicit control — instrumenting a specific client, grouping calls into a named trace, or using Visibe without `init()`.
### Instrument a specific client

```python
from visibe import Visibe

tracer = Visibe(api_key="sk_live_abc123")
tracer.instrument(graph, name="my-agent")  # e.g. the graph from the LangGraph example

result = graph.invoke({"messages": [("user", "Hello")]})
```
### Group multiple calls into one trace

```python
from visibe import Visibe

tracer = Visibe()

with tracer.track(client, name="my-conversation"):
    r1 = client.chat.completions.create(model="gpt-4o-mini", messages=[...])
    r2 = client.chat.completions.create(model="gpt-4o-mini", messages=[...])
# Both calls sent as one grouped trace.
```
### Remove instrumentation

```python
tracer.uninstrument(client)

# Or use as a context manager for automatic cleanup:
with tracer.instrument(graph, name="my-agent"):
    graph.invoke(...)
# Instrumentation removed automatically on exit.
```
## 📚 Documentation
- OpenAI integration
- LangChain integration
- CrewAI integration
- AutoGen integration
- AWS Bedrock integration
## 🔗 Resources
- visibe.ai — Product website
- app.visibe.ai — Dashboard (sign up, manage API keys, view traces)
- PyPI Package — Latest version on PyPI
## 📃 License
MIT — see LICENSE for details.