LLM cost tracking and budget enforcement. Zero config — with budget(max_usd=1.00): run_agent(). Works with LangGraph, CrewAI, raw OpenAI/Anthropic.
```python
with budget(max_usd=1.00):
    run_my_agent()  # raises BudgetExceededError if spend exceeds $1.00
```
I spent $47 debugging a LangGraph retry loop. The agent kept failing, LangGraph kept retrying, and OpenAI kept charging — all while I slept. I built shekel so you don't have to learn that lesson yourself.
Install
```
pip install shekel[openai]     # OpenAI only
pip install shekel[anthropic]  # Anthropic only
pip install shekel[all]        # both SDKs
```
Advanced: `pip install shekel` installs the core with no SDK deps (track-only mode).
Quick start
```python
from shekel import budget, BudgetExceededError
import openai

client = openai.OpenAI()

try:
    with budget(max_usd=1.00, warn_at=0.8) as b:
        response = client.chat.completions.create(
            model="gpt-4o",
            messages=[{"role": "user", "content": "Hello!"}],
        )
    print(f"Spent: ${b.spent:.4f} of ${b.limit:.2f}")
except BudgetExceededError as e:
    print(e)
```
Track-only mode
Omit `max_usd` to track spend without enforcing a limit. Great for profiling agents.
```python
with budget() as b:
    run_my_agent()

print(f"That run cost: ${b.spent:.4f}")
```
How it works
- Monkey-patching: When you enter a `budget()` context, shekel wraps `openai.ChatCompletions.create` and `anthropic.Messages.create` at the class level. Your code calls the real SDK; shekel intercepts the response, reads the token counts, and calculates cost. On context exit, the original methods are restored.
- ContextVar isolation: Each `budget()` context tracks its own spend using Python's `contextvars.ContextVar`. Two concurrent agent runs never share a budget counter, even in async or multi-threaded code.
- Zero config: No API keys, no external services, no config files. `pip install` plus `with budget(...)` is all you need.
Model support
| Model | Input (per 1k tokens) | Output (per 1k tokens) |
|---|---|---|
| gpt-4o | $0.00250 | $0.01000 |
| gpt-4o-mini | $0.000150 | $0.000600 |
| o1 | $0.01500 | $0.06000 |
| o1-mini | $0.00300 | $0.01200 |
| gpt-3.5-turbo | $0.000500 | $0.001500 |
| claude-3-5-sonnet-20241022 | $0.00300 | $0.01500 |
| claude-3-haiku-20240307 | $0.000250 | $0.001250 |
| claude-3-opus-20240229 | $0.01500 | $0.07500 |
| gemini-1.5-flash | $0.0000750 | $0.000300 |
| gemini-1.5-pro | $0.00125 | $0.00500 |
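Per-call cost is just the table's rates applied to the token counts the API reports. A quick sketch (rates copied from the gpt-4o-mini row above; the function name is ours, not part of shekel's API):

```python
# gpt-4o-mini rates from the table: $0.000150 per 1k input, $0.000600 per 1k output
PRICE_IN, PRICE_OUT = 0.000150, 0.000600

def call_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """USD cost of one call, given the token counts in the API response."""
    return prompt_tokens / 1000 * PRICE_IN + completion_tokens / 1000 * PRICE_OUT

# 2,000 prompt tokens + 800 completion tokens:
# 2.0 * 0.00015 + 0.8 * 0.0006 = 0.0003 + 0.00048 = $0.00078
print(f"${call_cost(2000, 800):.5f}")  # $0.00078
```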
Model not listed? Pass a price override:
```python
with budget(max_usd=1.00, price_per_1k_tokens={"input": 0.001, "output": 0.003}):
    run_my_agent()
```
Works with LangGraph, CrewAI, and any framework
shekel is framework-agnostic. It intercepts at the SDK level, so it works with anything that calls OpenAI or Anthropic under the hood:
```python
# LangGraph
with budget(max_usd=2.00, warn_at=0.8) as b:
    result = langgraph_app.invoke({"input": "..."})
print(f"Graph run cost: ${b.spent:.4f}")

# CrewAI
with budget(max_usd=5.00) as b:
    crew.kickoff()

# Raw SDK
with budget(max_usd=0.50) as b:
    for _ in range(100):
        client.chat.completions.create(...)  # stops when the budget is hit
```
License
MIT