
AgentBill LangChain Integration

Automatic usage tracking and billing for LangChain applications.


Installation

Install via pip:

pip install agentbill-langchain

With OpenAI support:

pip install agentbill-langchain[openai]

With Anthropic support:

pip install agentbill-langchain[anthropic]

Quick Start

from agentbill_langchain import AgentBillCallback
from langchain_openai import ChatOpenAI
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate

# 1. Initialize AgentBill callback
callback = AgentBillCallback(
    api_key="agb_your_api_key_here",  # Get from AgentBill dashboard
    base_url="https://bgwyprqxtdreuutzpbgw.supabase.co",
    customer_id="customer-123",
    debug=True
)

# 2. Create LangChain chain with callback
llm = ChatOpenAI(model="gpt-4o-mini")
prompt = PromptTemplate.from_template("Tell me a joke about {topic}")
chain = LLMChain(llm=llm, prompt=prompt)

# 3. Run - everything is auto-tracked!
result = chain.invoke(
    {"topic": "programming"},
    config={"callbacks": [callback]}
)

print(result["text"])

# ✅ Automatically captured:
# - Prompt text (hashed for privacy)
# - Model name (gpt-4o-mini)
# - Provider (openai)
# - Token usage (prompt + completion)
# - Latency (ms)
# - Costs (calculated automatically)

Features

  • Zero-config instrumentation - Just add the callback
  • Automatic token tracking - Captures all LLM calls
  • Multi-provider support - OpenAI, Anthropic, any LangChain LLM
  • Chain tracking - Tracks entire chain executions
  • Cost calculation - Auto-calculates costs per model
  • Prompt profitability - Compare costs vs revenue
  • OpenTelemetry compatible - Standard observability

Advanced Usage

Track Custom Revenue

# Track revenue for profitability analysis
callback.track_revenue(
    event_name="chat_completion",
    revenue=0.50,  # What you charged the customer
    metadata={"subscription_tier": "pro"}
)

Use with Agents

from langchain.agents import initialize_agent, load_tools

tools = load_tools(["serpapi", "llm-math"], llm=llm)
agent = initialize_agent(
    tools,
    llm,
    agent="zero-shot-react-description",
    callbacks=[callback]  # Add callback here
)

# All agent steps auto-tracked!
response = agent.run("What is 25% of 300?")

Use with Sequential Chains

from langchain.chains import SimpleSequentialChain

# All chain steps tracked automatically
overall_chain = SimpleSequentialChain(
    chains=[chain1, chain2, chain3],
    callbacks=[callback]
)

result = overall_chain.run(input_text)

Configuration

callback = AgentBillCallback(
    api_key="agb_...",           # Required - get from dashboard
    base_url="https://...",      # Required - your AgentBill instance
    customer_id="customer-123",  # Optional - for multi-tenant apps
    account_id="account-456",    # Optional - for account-level tracking
    debug=True,                  # Optional - enable debug logging
    batch_size=10,               # Optional - batch signals before sending
    flush_interval=5.0           # Optional - flush interval in seconds
)

How It Works

The callback hooks into LangChain's lifecycle:

  1. on_llm_start - Captures prompt, model, provider
  2. on_llm_end - Captures tokens, latency, response
  3. on_llm_error - Captures errors and retries
  4. on_chain_start - Tracks chain execution start
  5. on_chain_end - Tracks chain completion
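The hook sequence above follows LangChain's callback-handler interface. As an illustration only (the real `AgentBillCallback` subclasses `BaseCallbackHandler` and does much more), a stripped-down handler recording the same events might look like:

```python
import time

# Illustrative only: a stripped-down handler showing the hook shape.
# The real AgentBillCallback subclasses LangChain's BaseCallbackHandler.
class MiniHandler:
    def __init__(self):
        self.events = []
        self._start = None

    def on_llm_start(self, serialized, prompts, **kwargs):
        self._start = time.monotonic()  # start the latency clock
        self.events.append(("llm_start", serialized.get("name"), prompts))

    def on_llm_end(self, response, **kwargs):
        latency_ms = (time.monotonic() - self._start) * 1000
        self.events.append(("llm_end", response, round(latency_ms, 1)))

    def on_llm_error(self, error, **kwargs):
        self.events.append(("llm_error", str(error)))
```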

All data is sent to AgentBill via the record-signals API endpoint with proper authentication.

Supported Models

Auto-cost calculation for:

  • OpenAI: GPT-4, GPT-4o, GPT-3.5-turbo, etc.
  • Anthropic: Claude 3.5 Sonnet, Claude 3 Opus, etc.
  • Any LangChain-compatible LLM
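Per-model cost calculation generally multiplies token counts by per-million-token prices. The prices below are placeholders for illustration, not AgentBill's actual rate table:

```python
# Placeholder prices (USD per 1M tokens) for illustration only --
# not AgentBill's actual rate table.
PRICES = {
    "gpt-4o-mini": {"prompt": 0.15, "completion": 0.60},
    "claude-3-5-sonnet": {"prompt": 3.00, "completion": 15.00},
}

def estimate_cost(model, prompt_tokens, completion_tokens):
    p = PRICES[model]
    return (prompt_tokens * p["prompt"]
            + completion_tokens * p["completion"]) / 1_000_000

print(estimate_cost("gpt-4o-mini", 1_000, 500))  # 0.00045
```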

Troubleshooting

Not seeing data in dashboard?

  1. Check API key is correct
  2. Enable debug=True to see logs
  3. Verify base_url matches your instance
  4. Check network connectivity to AgentBill

Token counts are zero?

  • Some LLMs don't return token usage
  • When usage is missing, the callback estimates counts from response length
  • OpenAI and Anthropic return accurate counts
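A common fallback when the provider omits usage is a characters-per-token heuristic (roughly four characters per token for English text). A sketch of what such an estimate might look like (the package's actual heuristic may differ):

```python
# Rough fallback: ~4 characters per token for English text.
# The package's actual heuristic may differ.
def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)

print(estimate_tokens("Tell me a joke about programming"))  # 8
```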

License

MIT
