
AgentBill LangChain Integration

Automatic usage tracking and billing for LangChain applications.


Installation

Install via pip:

pip install agentbill-langchain

With OpenAI support:

pip install agentbill-langchain[openai]

With Anthropic support:

pip install agentbill-langchain[anthropic]

Quick Start

from agentbill_langchain import AgentBillCallback
from langchain_openai import ChatOpenAI
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate

# 1. Initialize AgentBill callback
callback = AgentBillCallback(
    api_key="agb_your_api_key_here",  # Get from AgentBill dashboard
    base_url="https://bgwyprqxtdreuutzpbgw.supabase.co",
    customer_id="customer-123",
    debug=True
)

# 2. Create LangChain chain with callback
llm = ChatOpenAI(model="gpt-4o-mini")
prompt = PromptTemplate.from_template("Tell me a joke about {topic}")
chain = LLMChain(llm=llm, prompt=prompt)

# 3. Run - everything is auto-tracked!
result = chain.invoke(
    {"topic": "programming"},
    config={"callbacks": [callback]}
)

print(result["text"])

# ✅ Automatically captured:
# - Prompt text (hashed for privacy)
# - Model name (gpt-4o-mini)
# - Provider (openai)
# - Token usage (prompt + completion)
# - Latency (ms)
# - Costs (calculated automatically)
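The "hashed for privacy" step above can be sketched with a small helper. This is a hypothetical illustration using SHA-256; the library's actual hashing scheme may differ:

```python
import hashlib

def hash_prompt(prompt: str) -> str:
    """Hash a prompt so usage can be tracked without storing the raw text.
    Hypothetical helper for illustration; the real implementation may differ."""
    return hashlib.sha256(prompt.encode("utf-8")).hexdigest()

digest = hash_prompt("Tell me a joke about programming")
print(len(digest))  # SHA-256 hex digest is always 64 characters
```

The same prompt always produces the same digest, so identical prompts can still be grouped and analyzed without exposing their contents.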

Features

  • Zero-config instrumentation - Just add the callback
  • Automatic token tracking - Captures all LLM calls
  • Multi-provider support - OpenAI, Anthropic, any LangChain LLM
  • Chain tracking - Tracks entire chain executions
  • Cost calculation - Auto-calculates costs per model
  • Prompt profitability - Compare costs vs revenue
  • OpenTelemetry compatible - Standard observability

Advanced Usage

Track Custom Revenue

# Track revenue for profitability analysis
callback.track_revenue(
    event_name="chat_completion",
    revenue=0.50,  # What you charged the customer
    metadata={"subscription_tier": "pro"}
)
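Profitability analysis then compares the tracked revenue against the auto-calculated model cost. A hypothetical margin calculation (not part of the library's API):

```python
def margin(revenue: float, cost: float) -> float:
    """Gross margin fraction for one tracked event (hypothetical helper)."""
    return (revenue - cost) / revenue if revenue else 0.0

# $0.50 charged to the customer, $0.00045 in model cost for the call
print(round(margin(0.50, 0.00045), 4))  # → 0.9991
```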

Use with Agents

from langchain.agents import initialize_agent, load_tools

tools = load_tools(["serpapi", "llm-math"], llm=llm)
agent = initialize_agent(
    tools,
    llm,
    agent="zero-shot-react-description",
    callbacks=[callback]  # Add callback here
)

# All agent steps auto-tracked!
response = agent.run("What is 25% of 300?")

Use with Sequential Chains

from langchain.chains import SimpleSequentialChain

# All chain steps tracked automatically
overall_chain = SimpleSequentialChain(
    chains=[chain1, chain2, chain3],
    callbacks=[callback]
)

result = overall_chain.run(input_text)

Configuration

callback = AgentBillCallback(
    api_key="agb_...",           # Required - get from dashboard
    base_url="https://...",      # Required - your AgentBill instance
    customer_id="customer-123",  # Optional - for multi-tenant apps
    account_id="account-456",    # Optional - for account-level tracking
    debug=True,                  # Optional - enable debug logging
    batch_size=10,               # Optional - batch signals before sending
    flush_interval=5.0           # Optional - flush interval in seconds
)
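The batch_size and flush_interval options imply a buffered sender: signals accumulate locally and are shipped when the batch fills or the interval elapses. A standalone sketch of that behavior (hypothetical SignalBuffer class, not the library's internals):

```python
import time

class SignalBuffer:
    """Buffers signals; flushes when the batch fills or the interval elapses.
    Hypothetical sketch of batch_size / flush_interval semantics."""

    def __init__(self, batch_size=10, flush_interval=5.0, send=print):
        self.batch_size = batch_size
        self.flush_interval = flush_interval
        self.send = send  # callable that ships a list of signals
        self.buffer = []
        self.last_flush = time.monotonic()

    def add(self, signal):
        self.buffer.append(signal)
        if (len(self.buffer) >= self.batch_size
                or time.monotonic() - self.last_flush >= self.flush_interval):
            self.flush()

    def flush(self):
        if self.buffer:
            self.send(self.buffer)
            self.buffer = []
        self.last_flush = time.monotonic()

sent = []
buf = SignalBuffer(batch_size=3, flush_interval=60.0, send=sent.append)
for i in range(7):
    buf.add({"event": i})
buf.flush()  # ship the remainder on shutdown
print(len(sent))  # → 3 batches: two full batches of 3, then the final 1
```

Batching trades a little latency for far fewer HTTP requests, which matters in high-throughput chains.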

How It Works

The callback hooks into LangChain's lifecycle:

  1. on_llm_start - Captures prompt, model, provider
  2. on_llm_end - Captures tokens, latency, response
  3. on_llm_error - Captures errors and retries
  4. on_chain_start - Tracks chain execution start
  5. on_chain_end - Tracks chain completion

All data is sent to AgentBill via the record-signals API endpoint with proper authentication.
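The lifecycle above can be sketched as a simplified, dependency-free handler. The method names follow LangChain's BaseCallbackHandler convention; the bodies and the dict-shaped response are illustrative, not the library's actual code (real handlers receive an LLMResult object):

```python
import time

class UsageTrackingHandler:
    """Simplified sketch of the callback lifecycle. A real handler would
    subclass langchain_core.callbacks.BaseCallbackHandler."""

    def __init__(self):
        self.events = []
        self._start = None

    def on_llm_start(self, serialized, prompts, **kwargs):
        self._start = time.monotonic()
        self.events.append({"type": "llm_start", "prompts": len(prompts)})

    def on_llm_end(self, response, **kwargs):
        latency_ms = (time.monotonic() - self._start) * 1000
        usage = response.get("token_usage", {})  # illustrative dict shape
        self.events.append({"type": "llm_end",
                            "latency_ms": latency_ms,
                            "tokens": usage.get("total_tokens", 0)})

    def on_llm_error(self, error, **kwargs):
        self.events.append({"type": "llm_error", "error": str(error)})

# Simulated call sequence (no real LLM involved):
handler = UsageTrackingHandler()
handler.on_llm_start({}, ["Tell me a joke"])
handler.on_llm_end({"token_usage": {"total_tokens": 42}})
print([e["type"] for e in handler.events])  # → ['llm_start', 'llm_end']
```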

Supported Models

Auto-cost calculation for:

  • OpenAI: GPT-4, GPT-4o, GPT-3.5-turbo, etc.
  • Anthropic: Claude 3.5 Sonnet, Claude 3 Opus, etc.
  • Any LangChain-compatible LLM
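Per-model cost calculation typically maps a model name to per-token prices. A minimal sketch, with illustrative prices that should be verified against each provider's current pricing (this is not the library's actual price table):

```python
# USD per 1M tokens as (prompt, completion) - illustrative, verify current pricing
PRICES = {
    "gpt-4o-mini": (0.15, 0.60),
    "gpt-4o": (2.50, 10.00),
    "claude-3-5-sonnet": (3.00, 15.00),
}

def calc_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Return the USD cost of one call, or 0.0 for unknown models."""
    if model not in PRICES:
        return 0.0
    p_in, p_out = PRICES[model]
    return (prompt_tokens * p_in + completion_tokens * p_out) / 1_000_000

print(round(calc_cost("gpt-4o-mini", 1000, 500), 6))  # → 0.00045
```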

Troubleshooting

Not seeing data in dashboard?

  1. Check that the API key is correct
  2. Enable debug=True to see request logs
  3. Verify base_url matches your AgentBill instance
  4. Check network connectivity to AgentBill

Token counts are zero?

  • Some LLMs don't return token usage in their responses
  • The callback then falls back to estimating counts from response length
  • OpenAI and Anthropic report accurate counts
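The length-based fallback can be sketched with the common rule of thumb of roughly 4 characters per token for English text. This is a hypothetical estimator; the library's actual fallback may differ:

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4 chars/token heuristic for English.
    Hypothetical sketch of the fallback used when no usage data is returned."""
    return max(1, len(text) // 4)

print(estimate_tokens("Why do programmers prefer dark mode? Because light attracts bugs."))
```

Estimates like this are adequate for trend monitoring, but billing-grade accuracy requires the provider's own counts.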

License

MIT
