AgentBill LangChain Integration

Automatic usage tracking and billing for LangChain applications.

Installation

Install via pip:

pip install agentbill-langchain

With OpenAI support:

pip install agentbill-langchain[openai]

With Anthropic support:

pip install agentbill-langchain[anthropic]

Quick Start

from agentbill_langchain import AgentBillCallback
from langchain_openai import ChatOpenAI
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate

# 1. Initialize AgentBill callback
callback = AgentBillCallback(
    api_key="agb_your_api_key_here",  # Get from AgentBill dashboard
    base_url="https://bgwyprqxtdreuutzpbgw.supabase.co",
    customer_id="customer-123",
    debug=True
)

# 2. Create LangChain chain with callback
llm = ChatOpenAI(model="gpt-4o-mini")
prompt = PromptTemplate.from_template("Tell me a joke about {topic}")
chain = LLMChain(llm=llm, prompt=prompt)

# 3. Run - everything is auto-tracked!
result = chain.invoke(
    {"topic": "programming"},
    config={"callbacks": [callback]}
)

print(result["text"])

# ✅ Automatically captured:
# - Prompt text (hashed for privacy)
# - Model name (gpt-4o-mini)
# - Provider (openai)
# - Token usage (prompt + completion)
# - Latency (ms)
# - Costs (calculated automatically)
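The "hashed for privacy" behavior above can be illustrated with a short sketch. Note this is a hypothetical helper: AgentBill's actual hashing scheme (algorithm, salting, truncation) is not documented here.

```python
import hashlib

def hash_prompt(prompt: str) -> str:
    """Return a SHA-256 fingerprint of the prompt text.

    Illustrative only: the real callback's hashing scheme may differ.
    """
    return hashlib.sha256(prompt.encode("utf-8")).hexdigest()

# The same prompt always maps to the same digest, so prompts can be
# grouped and compared in the dashboard without sending the raw text.
digest = hash_prompt("Tell me a joke about programming")
print(digest[:16])
```

The key property is that the digest is stable per prompt but irreversible, so usage can be aggregated by prompt without the prompt text ever leaving your application.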

Features

  • Zero-config instrumentation - Just add the callback
  • Automatic token tracking - Captures all LLM calls
  • Multi-provider support - OpenAI, Anthropic, any LangChain LLM
  • Chain tracking - Tracks entire chain executions
  • Cost calculation - Auto-calculates costs per model
  • Prompt profitability - Compare costs vs revenue
  • OpenTelemetry compatible - Standard observability

Advanced Usage

Track Custom Revenue

# Track revenue for profitability analysis
callback.track_revenue(
    event_name="chat_completion",
    revenue=0.50,  # What you charged the customer
    metadata={"subscription_tier": "pro"}
)
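To see how tracked revenue pairs with tracked costs for profitability analysis, here is a sketch of the margin arithmetic. This is a hypothetical helper for illustration; AgentBill computes profitability server-side from the signals it receives.

```python
def prompt_margin(revenue: float, llm_cost: float) -> dict:
    """Compare what you charged a customer against what the LLM call cost.

    Hypothetical helper for illustration only.
    """
    profit = revenue - llm_cost
    return {
        "revenue": revenue,
        "cost": llm_cost,
        "profit": round(profit, 6),
        "margin_pct": round(100 * profit / revenue, 2) if revenue else None,
    }

# e.g. charging $0.50 for a completion that cost $0.0012 in tokens
print(prompt_margin(revenue=0.50, llm_cost=0.0012))
```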

Use with Agents

from langchain.agents import AgentType, initialize_agent, load_tools

tools = load_tools(["serpapi", "llm-math"], llm=llm)
agent = initialize_agent(
    tools,
    llm,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    callbacks=[callback]  # Add callback here
)

# All agent steps auto-tracked!
response = agent.run("What is 25% of 300?")

Use with Sequential Chains

from langchain.chains import SimpleSequentialChain

# All chain steps tracked automatically
overall_chain = SimpleSequentialChain(
    chains=[chain1, chain2, chain3],
    callbacks=[callback]
)

result = overall_chain.run(input_text)

Configuration

callback = AgentBillCallback(
    api_key="agb_...",           # Required - get from dashboard
    base_url="https://...",      # Required - your AgentBill instance
    customer_id="customer-123",  # Optional - for multi-tenant apps
    account_id="account-456",    # Optional - for account-level tracking
    debug=True,                  # Optional - enable debug logging
    batch_size=10,               # Optional - batch signals before sending
    flush_interval=5.0           # Optional - flush interval in seconds
)
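The batch_size and flush_interval options describe a buffered sender: signals accumulate locally and are sent together. A minimal sketch of that pattern (not AgentBill's actual implementation, and the `send` callable stands in for the real HTTP request):

```python
import time

class SignalBuffer:
    """Buffer events and flush when the batch fills or the interval
    elapses. Illustrative sketch of a batch-and-flush sender."""

    def __init__(self, batch_size=10, flush_interval=5.0, send=print):
        self.batch_size = batch_size
        self.flush_interval = flush_interval
        self.send = send  # stand-in for the HTTP call to the API
        self.buffer = []
        self.last_flush = time.monotonic()

    def add(self, signal: dict) -> None:
        self.buffer.append(signal)
        if (len(self.buffer) >= self.batch_size
                or time.monotonic() - self.last_flush >= self.flush_interval):
            self.flush()

    def flush(self) -> None:
        if self.buffer:
            self.send(self.buffer)  # one request for the whole batch
            self.buffer = []
        self.last_flush = time.monotonic()

buf = SignalBuffer(batch_size=3, send=lambda b: print(f"sent {len(b)} signals"))
for i in range(7):
    buf.add({"event": i})
buf.flush()  # drain the remainder on shutdown
```

Batching trades a little latency for far fewer network round-trips, which matters when a single chain run can emit many signals.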

How It Works

The callback hooks into LangChain's lifecycle:

  1. on_llm_start - Captures prompt, model, provider
  2. on_llm_end - Captures tokens, latency, response
  3. on_llm_error - Captures errors and retries
  4. on_chain_start - Tracks chain execution start
  5. on_chain_end - Tracks chain completion

All data is sent to AgentBill via the record-signals API endpoint with proper authentication.
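The lifecycle bookkeeping above can be sketched in plain Python. This is an illustration of the pattern only, with simplified method signatures; the real callback subclasses LangChain's BaseCallbackHandler, whose hooks receive LangChain-specific arguments.

```python
import time

class UsageTracker:
    """Sketch of start/end lifecycle bookkeeping: record the start time
    on llm_start, then compute latency and store usage on llm_end."""

    def __init__(self):
        self.records = []
        self._start = None

    def on_llm_start(self, model: str, prompts: list) -> None:
        self._start = time.monotonic()
        self._model = model

    def on_llm_end(self, prompt_tokens: int, completion_tokens: int) -> None:
        latency_ms = (time.monotonic() - self._start) * 1000
        self.records.append({
            "model": self._model,
            "prompt_tokens": prompt_tokens,
            "completion_tokens": completion_tokens,
            "latency_ms": latency_ms,
        })

tracker = UsageTracker()
tracker.on_llm_start("gpt-4o-mini", ["Tell me a joke"])
tracker.on_llm_end(prompt_tokens=12, completion_tokens=30)
print(tracker.records[0]["model"])  # gpt-4o-mini
```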

Supported Models

Auto-cost calculation for:

  • OpenAI: GPT-4, GPT-4o, GPT-3.5-turbo, etc.
  • Anthropic: Claude 3.5 Sonnet, Claude 3 Opus, etc.
  • Any LangChain-compatible LLM
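Per-model cost calculation generally means multiplying token counts by a per-model price table. A sketch of the arithmetic, using illustrative per-1M-token prices; this is not AgentBill's actual rate table, and real provider pricing changes over time:

```python
# Illustrative prices per 1M tokens (input, output) -- assumptions, not
# AgentBill's rate table.
PRICES = {
    "gpt-4o-mini": {"input": 0.15, "output": 0.60},
    "claude-3-5-sonnet": {"input": 3.00, "output": 15.00},
}

def estimate_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Cost in USD: tokens times per-token price, input and output priced
    separately."""
    p = PRICES[model]
    return (prompt_tokens * p["input"] + completion_tokens * p["output"]) / 1_000_000

print(f"{estimate_cost('gpt-4o-mini', 1000, 500):.6f}")  # 0.000450
```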

Troubleshooting

Not seeing data in dashboard?

  1. Check API key is correct
  2. Enable debug=True to see logs
  3. Verify base_url matches your instance
  4. Check network connectivity to AgentBill

Token counts are zero?

  • Some LLMs don't return token usage
  • Callback will estimate based on response length
  • OpenAI/Anthropic provide accurate counts
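A common length-based fallback is a characters-per-token heuristic, roughly 4 characters per token for English text. A sketch of what such an estimate might look like; the callback's actual heuristic may differ:

```python
def estimate_tokens(text: str, chars_per_token: float = 4.0) -> int:
    """Rough token estimate for when the provider reports no usage.

    ~4 chars/token is a common rule of thumb for English text; it is
    inaccurate for code, non-Latin scripts, and short strings.
    """
    return max(1, round(len(text) / chars_per_token))

print(estimate_tokens("Tell me a joke about programming"))  # 8
```

Treat such estimates as approximate: exact counts require the provider's reported usage or the model's own tokenizer.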

License

MIT
