AI infrastructure monitoring for tracking usage, cost, and performance across OpenAI, Claude, ElevenLabs, Pinecone, and more. Supports SQLite (default) and PostgreSQL databases.
AI cost monitoring for Python. Two lines of code. Thirteen providers.
Quickstart • Providers • Features • Middleware • Config • Contributing
## The Problem
You're shipping AI features. Costs are invisible until the invoice hits.
| Month | Bill | Reaction |
|---|---|---|
| 1 | $12 | "No big deal." |
| 3 | $480 | "Wait, what?" |
| 5 | $2,100 | "Which call is doing this??" |
StackSense wraps your existing AI clients and tracks every call — tokens, latency, cost — with zero config and zero code changes to your business logic.
## Quickstart

```bash
pip install stacksense
```
```python
from stacksense import StackSense
import openai

ss = StackSense()
client = ss.monitor(openai.OpenAI())

# Use client exactly as before — every call is now tracked
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
)

print(ss.get_metrics())
# {'total_calls': 1, 'total_tokens': 28, 'total_cost': 0.0004, ...}
```
That's it. No dashboards to configure. No agents to deploy. Just monitor() and go.
## Supported Providers

Pass any supported client to `ss.monitor()` — the provider is auto-detected.

| Provider | Coverage |
|---|---|
| OpenAI | GPT-4o • o1 • o3 • Embeddings |
| Anthropic | Opus 4 • Sonnet 4 • Haiku |
| Google | Gemini 2.0 Flash • 1.5 Pro |
| Mistral | Large • Small • Codestral |
| Cohere | Command R/R+ • Embed v4 |
| DeepSeek | Chat • Reasoner |
| AI21 Labs | Jamba 1.5 Large/Mini |
| Together AI | Llama 3.1 • Mixtral |
| Groq | Llama 3.3 • Mixtral • Gemma2 |
| Perplexity | Sonar Pro • Reasoning |
| Replicate | Llama • SDXL • any model |
| ElevenLabs | Voice models • per-character |
| Pinecone | Vector ops • per-query |

More coming soon — request a provider.
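Auto-detection can be pictured as keying off the wrapped client's module name. The sketch below is purely illustrative (StackSense's real detection logic may differ, and `FakeClient` is a stand-in for testing the idea):

```python
# Illustrative only: how monitor() might identify the provider.
# Not StackSense's actual implementation.
def detect_provider(client) -> str:
    """Guess the provider from the client's top-level module name."""
    module = type(client).__module__.split(".")[0]
    known = {"openai", "anthropic", "mistralai", "cohere", "groq", "replicate"}
    if module in known:
        return module
    raise ValueError(f"Unsupported client type: {type(client)!r}")

# Stand-in class pretending to come from the openai package.
class FakeClient:
    pass

FakeClient.__module__ = "openai.resources"
print(detect_provider(FakeClient()))  # → openai
```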
## Features

### Multi-Provider Cost Breakdown
Track spend across providers from a single StackSense instance.
```python
ss = StackSense()
oai = ss.monitor(openai.OpenAI())
claude = ss.monitor(anthropic.Anthropic())

oai.chat.completions.create(model="gpt-4o", messages=[...])
claude.messages.create(model="claude-sonnet-4-20250514", messages=[...])

ss.get_cost_breakdown()
# {'openai': 0.003, 'anthropic': 0.002}
```
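Conceptually, wrapping a client this way is a recording proxy: attribute access is forwarded to the real client while calls are counted. A minimal sketch of the pattern (not StackSense's actual code; `Dummy` and the metrics dict are stand-ins):

```python
import functools

class RecordingProxy:
    """Forward attribute access to the wrapped client and count calls.
    Illustrative sketch only, not StackSense's implementation."""
    def __init__(self, client, metrics):
        self._client = client
        self._metrics = metrics

    def __getattr__(self, name):
        attr = getattr(self._client, name)
        if callable(attr):
            @functools.wraps(attr)
            def wrapped(*args, **kwargs):
                self._metrics["total_calls"] += 1
                return attr(*args, **kwargs)
            return wrapped
        # Nested namespaces (e.g. client.chat.completions) get proxied too.
        return RecordingProxy(attr, self._metrics)

class Dummy:
    def ping(self):
        return "pong"

metrics = {"total_calls": 0}
proxy = RecordingProxy(Dummy(), metrics)
proxy.ping()
print(metrics)  # → {'total_calls': 1}
```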
### Decorator API
Track any function without wrapping a client:
```python
import stacksense

@stacksense.track(provider="openai", model="gpt-4o")
def generate(prompt):
    return client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
    )
```
Works with async functions too.
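The usual pattern for a decorator that handles both sync and async callables looks like this. It is an illustrative sketch of the technique, not the source of `stacksense.track`:

```python
import asyncio
import functools
import inspect
import time

def track_calls(fn):
    """Record call count and wall-clock latency on the wrapped function.
    Illustrative pattern only -- not stacksense.track itself."""
    fn.calls = 0
    if inspect.iscoroutinefunction(fn):
        @functools.wraps(fn)
        async def awrapper(*args, **kwargs):
            fn.calls += 1
            start = time.perf_counter()
            try:
                return await fn(*args, **kwargs)
            finally:
                fn.latency = time.perf_counter() - start
        return awrapper

    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        fn.calls += 1
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            fn.latency = time.perf_counter() - start
    return wrapper

@track_calls
async def generate(prompt):
    return f"echo: {prompt}"

print(asyncio.run(generate("hi")))  # → echo: hi
print(generate.__wrapped__.calls)   # → 1
```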
### Alerts & Webhooks
Get notified when costs spike.
```python
from stacksense.alerts import AlertManager, AlertRule

alerts = AlertManager(tracker=ss.tracker)
alerts.add_rule(AlertRule(
    name="Cost spike",
    metric="cost",
    threshold=5.0,
    window="1h",
))
alerts.add_webhook("https://hooks.slack.com/services/...")
alerts.check()
```
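The exact payload posted to the webhook is defined by StackSense; as an assumption-laden sketch, a Slack-style message body for a fired rule might be built like this (every field name here is an assumption, not StackSense's schema):

```python
import json

def slack_payload(rule_name, metric, value, threshold):
    """Build a Slack incoming-webhook body for a fired alert.
    Field names and message shape are assumptions, not StackSense's payload."""
    return json.dumps({
        "text": (
            f":rotating_light: {rule_name}: {metric} hit {value:.2f} "
            f"(threshold {threshold:.2f})"
        )
    })

body = slack_payload("Cost spike", "cost", 7.31, 5.0)
print(body)
```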
### Export
```python
from stacksense.exporters import Exporter

exporter = Exporter(ss.tracker)
exporter.to_csv("metrics.csv")
exporter.to_json("metrics.json")
```
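The exported CSV can then be analyzed with standard tooling. A sketch that totals spend per provider, assuming `provider` and `cost` columns (the real export's column names may differ; the inline sample stands in for `metrics.csv`):

```python
import csv
import io

# Stand-in for metrics.csv; the actual export may use different columns.
sample = """provider,model,cost
openai,gpt-4o,0.003
anthropic,claude-sonnet-4-20250514,0.002
openai,gpt-4o,0.001
"""

totals = {}
for row in csv.DictReader(io.StringIO(sample)):
    totals[row["provider"]] = totals.get(row["provider"], 0.0) + float(row["cost"])

# Round to avoid floating-point noise in the display.
totals = {k: round(v, 6) for k, v in totals.items()}
print(totals)  # → {'openai': 0.004, 'anthropic': 0.002}
```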
## Framework Middleware
Drop-in middleware for popular frameworks — automatically tracks all AI calls per request.
**FastAPI**

```python
from stacksense.middleware import FastAPIMiddleware

app.add_middleware(FastAPIMiddleware, stacksense=ss)
```

**Flask**

```python
from stacksense.middleware import FlaskMiddleware

FlaskMiddleware(app, stacksense=ss)
```

**Django**

```python
# settings.py
MIDDLEWARE = [
    ...,
    'stacksense.middleware.DjangoMiddleware',
]
```
## CLI
```bash
stacksense status                 # View current metrics
stacksense dashboard              # Launch web dashboard
stacksense export csv -o out.csv
stacksense db init                # Initialize database
```
## Configuration
SQLite by default — zero config. PostgreSQL for production:
```bash
pip install "stacksense[postgresql]"
```

```bash
# Environment variables
STACKSENSE_PROJECT_ID=my-project
STACKSENSE_ENABLE_DB=true
STACKSENSE_DB_URL=postgresql://user:pass@host:5432/stacksense
STACKSENSE_ENVIRONMENT=production
STACKSENSE_DEBUG=false
```
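If you keep these settings in a `.env` file, a library like python-dotenv can load them, or a few lines of stdlib will do. A minimal, illustrative loader (not part of StackSense):

```python
import os

def load_env(text):
    """Parse KEY=VALUE lines into os.environ without overriding existing
    values. Minimal illustrative loader; python-dotenv is the usual choice."""
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        os.environ.setdefault(key.strip(), value.strip())

load_env("""
STACKSENSE_PROJECT_ID=my-project
STACKSENSE_ENVIRONMENT=production
""")
print(os.environ["STACKSENSE_PROJECT_ID"])  # → my-project
```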
## Contributing
```bash
git clone https://github.com/Abdulkvng/stacksense.git
cd stacksense
pip install -e ".[dev]"
pytest tests/ -v
```
PRs welcome. Please open an issue first for large changes.
MIT License © 2025 StackSense Contributors
## File details: stacksense-0.1.0.tar.gz

- Size: 85.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.9

| Algorithm | Hash digest |
|---|---|
| SHA256 | `de605c7fb9ab9abe79694e8b5ae4374eee01b69ef335f081d5c6b82d05a977da` |
| MD5 | `3b0745ac7d98b763db7dbf8b33091847` |
| BLAKE2b-256 | `e94b6df61381c13b5a3727f48e9f0c29c621194b778ac1eea9f4f886e6a1263c` |
## File details: stacksense-0.1.0-py3-none-any.whl

- Size: 93.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.9

| Algorithm | Hash digest |
|---|---|
| SHA256 | `721dc71de66e3ceca94577c3a11c10dd427f0048d0aef9f66b102a70e4bc4830` |
| MD5 | `643b996aa8327babcec8c447181e4794` |
| BLAKE2b-256 | `7673c6a9da2d13635cbef36863c9748c90685216b3fd809649ad688b0142d810` |