cachellm
Auto-optimize LLM prompt caching. One line of code, 60-90% savings on your API bill.
Install
pip install cachellm
Quick Start
Anthropic (Claude) — saves up to 90%
from anthropic import Anthropic
from cachellm import optimize_anthropic

client = optimize_anthropic(Anthropic())

response = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    system="You are a helpful cooking assistant...",
    messages=[{"role": "user", "content": "How do I make biryani?"}],
)

client.print_stats()
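Caching only pays off when the same prompt prefix is sent again while it is still cached, and the prefix has to be long enough for Anthropic to cache it at all (roughly 1,024 tokens on most Claude models). A minimal sketch for sanity-checking this, reusing the wrapped client above and the cache usage fields the Anthropic API itself reports on each response:

# Illustrative sketch: repeat the same request and compare cache usage.
# cache_creation_input_tokens / cache_read_input_tokens are reported by the
# Anthropic API when prompt caching is in effect; getattr() guards against
# responses where they are absent.
for attempt in range(2):
    response = client.messages.create(
        model="claude-sonnet-4-20250514",
        max_tokens=1024,
        system="You are a helpful cooking assistant...",  # long system prompts benefit most
        messages=[{"role": "user", "content": "How do I make biryani?"}],
    )
    usage = response.usage
    print(
        f"attempt {attempt + 1}: "
        f"cache_creation={getattr(usage, 'cache_creation_input_tokens', None)} "
        f"cache_read={getattr(usage, 'cache_read_input_tokens', None)}"
    )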
OpenAI (GPT) — saves up to 50%
from openai import OpenAI
from cachellm import optimize_openai

client = optimize_openai(OpenAI())

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "You are a helpful assistant..."},
        {"role": "user", "content": "Hello"},
    ],
)

client.print_stats()
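OpenAI applies prompt caching automatically once a prompt prefix passes roughly 1,024 tokens and reports the cached portion in the response usage. As an independent check alongside print_stats(), you can read that field directly; a small sketch (the field names are OpenAI's, and the guards handle responses where they are missing):

# Illustrative sketch: read OpenAI's own count of cached prompt tokens.
usage = response.usage
details = getattr(usage, "prompt_tokens_details", None)
cached = getattr(details, "cached_tokens", None) if details else None
print(f"prompt_tokens={usage.prompt_tokens} cached_tokens={cached}")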
Configuration
from anthropic import Anthropic
from cachellm import optimize_anthropic
from cachellm.types import AnthropicCacheOptions

client = optimize_anthropic(Anthropic(), AnthropicCacheOptions(
    strategy="auto",
    max_breakpoints=4,
    ttl="5m",
    min_tokens=1024,
    debug=False,
))
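For context, these values track Anthropic's documented prompt-caching limits: a request can carry at most four cache breakpoints, cached prefixes expire after five minutes by default, and prompts shorter than roughly 1,024 tokens (2,048 on Haiku-class models) are not cached at all.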
Standalone Analysis
from cachellm import PromptAnalyzer

analyzer = PromptAnalyzer()

analysis = analyzer.analyze_anthropic_params({
    "system": "Your long system prompt here...",
    "tools": [{"name": "search", "description": "Search the web", "input_schema": {"type": "object"}}],
    "messages": [{"role": "user", "content": "Hello"}],
})

print(f"Cacheable: {analysis.cacheable_tokens} tokens")
print(f"Estimated savings: ~{analysis.estimated_savings_percent}%")
Requirements
- Python >= 3.9
- Zero dependencies (provider SDKs are optional)
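Because the provider SDKs are optional, install whichever ones you actually call alongside cachellm, e.g. pip install cachellm anthropic or pip install cachellm openai.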
License
Project details
Release history
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file cachellm_py-0.2.0.tar.gz.
File metadata
- Download URL: cachellm_py-0.2.0.tar.gz
- Upload date:
- Size: 14.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 564f457da0fb6d099aab7b08f6cfbcd9b1a8d88680a566e6455e510278f6b879 |
| MD5 | f089ca02e7cd82590f782447372f6e40 |
| BLAKE2b-256 | d854b38c1e59ec0f8b12e0886cdb4b4c6907147d920ccec83515a26fd43ab46d |
Provenance
The following attestation bundles were made for cachellm_py-0.2.0.tar.gz:
- Publisher: pypi-release.yml on sahilempire/cachellm
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: cachellm_py-0.2.0.tar.gz
- Subject digest: 564f457da0fb6d099aab7b08f6cfbcd9b1a8d88680a566e6455e510278f6b879
- Sigstore transparency entry: 1367823961
- Sigstore integration time:
- Permalink: sahilempire/cachellm@5e8e8175d85210ef7597d7a90bcdb579fb7888d5
- Branch / Tag: refs/tags/v0.2.1
- Owner: https://github.com/sahilempire
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: pypi-release.yml@5e8e8175d85210ef7597d7a90bcdb579fb7888d5
- Trigger Event: push
File details
Details for the file cachellm_py-0.2.0-py3-none-any.whl.
File metadata
- Download URL: cachellm_py-0.2.0-py3-none-any.whl
- Upload date:
- Size: 18.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 0754d60caea24a9d5a8d3138638b4ff39fa1734513e3484282e9d248dc9bc27a |
| MD5 | fc39954db539101fdfdf08bbb2221927 |
| BLAKE2b-256 | c611f92b166c121a7bbf6c2fbc3efdde4c661a6d8b2ab29e9ccbbce28f2e10ca |
Provenance
The following attestation bundles were made for cachellm_py-0.2.0-py3-none-any.whl:
- Publisher: pypi-release.yml on sahilempire/cachellm
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: cachellm_py-0.2.0-py3-none-any.whl
- Subject digest: 0754d60caea24a9d5a8d3138638b4ff39fa1734513e3484282e9d248dc9bc27a
- Sigstore transparency entry: 1367823976
- Sigstore integration time:
- Permalink: sahilempire/cachellm@5e8e8175d85210ef7597d7a90bcdb579fb7888d5
- Branch / Tag: refs/tags/v0.2.1
- Owner: https://github.com/sahilempire
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: pypi-release.yml@5e8e8175d85210ef7597d7a90bcdb579fb7888d5
- Trigger Event: push