Durable file-backed caching for JSON-like data with pluggable storage backends
PyperCache is a durable, file-backed cache for JSON-like Python data.
Features
- API wrapper base class: build synchronous `requests` clients with URL joining, optional caching, response decoding, file downloads, SSE parsing, and typed response casting
- File-backed storage backends: choose Pickle, JSON, chunked-manifest, or SQLite storage by file extension (`.pkl`, `.json`, `.manifest`, `.db`)
- Expiry-aware cache records: store records with optional TTLs, check freshness, and refetch stale `ApiWrapper` GET/JSON responses instead of serving expired data
- Typed API models: decorate classes with `@apimodel` for dict constructors, nested hydration, raw field aliases, timestamp parsing, lazy fields, and optional validation
- JSON navigation: query loaded dict/list payloads with `JsonInjester` selectors for dotted paths, existence checks, filters, plucks, defaults, and casting
- Request logging: append thread-safe JSONL request records and inspect recent entries by time window
Installation
Install from PyPI:

```shell
pip install pypercache
```

Or install from source:

```shell
git clone https://github.com/BrandonBahret/PyperCache.git
cd PyperCache
pip install .
```
Quick Start
See the project's documentation site for the full documentation, API reference, and API wrapper examples.
At a glance
At the center is Cache, which stores keyed records to disk and returns CacheRecord objects. Each record exposes .query, which gives you a JsonInjester over the loaded payload so you can navigate nested data without writing long chains of dict.get(...) calls.
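To make the selector idea concrete, here is a plain-Python sketch of dotted-path navigation with a filter and a pluck. This is illustrative only: the `query` function and its simplified path grammar are invented for this example and are not JsonInjester's actual implementation or syntax.

```python
def query(data, path, default=None):
    """Walk a dotted path through nested dicts/lists.

    Supports two simplified selector forms:
      "a.b.c"          -> nested key lookup
      "items?k=v.name" -> filter a list of dicts, then pluck a field
    This mimics the spirit of JsonInjester, not its actual grammar.
    """
    node = data
    for part in path.split("."):
        if node is None:
            return default
        if "?" in part:  # e.g. "hits?role=staff": filter a list under a key
            key, _, cond = part.partition("?")
            field, _, want = cond.partition("=")
            node = [item for item in node.get(key, []) if str(item.get(field)) == want]
        elif isinstance(node, list):  # pluck one field from each element
            node = [item.get(part, default) for item in node]
        else:
            node = node.get(part, default)
    return node

payload = {"hits": [{"name": "Alice", "role": "staff"},
                    {"name": "Bob", "role": "guest"}]}
print(query(payload, "hits?role=staff.name"))  # ['Alice']
```

The point of the pattern is that one selector string replaces a chain of `dict.get(...)` calls and an explicit list comprehension at each filtering step.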
@apimodel sits on top of that. Decorate a class and it gains a dict-accepting constructor, nested hydration, aliases, timestamp parsing, and lazy fields. Store with cast=MyModel, then retrieve a typed object later with cache.get_object().
ApiWrapper composes requests, Cache, and optionally RequestLogger into a higher-level base class for HTTP clients. Subclass it, add thin endpoint methods, and let the wrapper handle URL joining, cache lookup, response decoding, and model hydration.
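At its core, the cache lookup the wrapper performs is a TTL freshness check followed by a conditional refetch. A minimal plain-Python sketch of that fetch-or-cache flow (in-memory only, names invented, not PyperCache's storage layer):

```python
import time

class TTLRecord:
    """A cached value plus its expiry bookkeeping (sketch of the idea only)."""

    def __init__(self, value, ttl=None):
        self.value = value
        self.expires_at = None if ttl is None else time.time() + ttl

    def is_fresh(self) -> bool:
        return self.expires_at is None or time.time() < self.expires_at

store: dict[str, TTLRecord] = {}

def fetch_or_cache(key: str, fetch, ttl: float = 3600):
    # Serve the cached value while it is fresh; otherwise refetch and re-store.
    rec = store.get(key)
    if rec is None or not rec.is_fresh():
        store[key] = rec = TTLRecord(fetch(), ttl=ttl)
    return rec.value

print(fetch_or_cache("search:v1:python", lambda: {"total": 3}))  # {'total': 3}
```

PyperCache persists these records to disk through the chosen backend, so the cache survives process restarts, unlike this in-memory dict.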
API Wrapper
pypercache.api_wrapper.ApiWrapper provides a base class for building small synchronous API clients on top of requests, Cache, and RequestLogger.
```python
from pypercache.api_wrapper import ApiWrapper
from pypercache.models.apimodel import apimodel

@apimodel
class Widget:
    id: int
    name: str

class WidgetClient(ApiWrapper):
    ...

    def list_widgets(self) -> list[Widget]:
        return self.request("GET", "/widgets", expected="json", cast=list[Widget])
```
Example
The snippet below demonstrates every major feature in one pass: choosing a backend, TTL, typed objects, query navigation, and request logging.
```python
from datetime import datetime
from typing import Annotated

from pypercache import Cache, RequestLogger
from pypercache.models.apimodel import Alias, Timestamp, apimodel

# ── 1. Backend is chosen by file extension ──────────────────────────────────
cache = Cache(filepath="api-cache.db")  # .pkl / .json / .manifest / .db
log = RequestLogger("api_requests.log")

# ── 2. Define typed models ──────────────────────────────────────────────────
@apimodel
class StaffMember:
    name: str
    role: str
    score: int

@apimodel
class SearchResult:
    total: int
    next_page: Annotated[str | None, Alias("nextPage")]
    fetched_at: Annotated[datetime, Alias("fetchedAt"), Timestamp()]
    hits: list

# ── 3. Fetch-or-cache pattern ───────────────────────────────────────────────
KEY = "search:v1:python"

if not cache.is_data_fresh(KEY):
    payload = {
        "total": 3,
        "nextPage": None,
        "fetchedAt": "2026-04-19T12:34:56Z",
        "hits": [
            {"name": "Alice", "role": "staff", "score": 92},
            {"name": "Bob", "role": "guest", "score": 74},
            {"name": "Carol", "role": "staff", "score": 88},
        ],
    }
    cache.store(KEY, payload, expiry=3600, cast=SearchResult)
    log.log(uri="/api/search?q=python", status=200)

# ── 4. Retrieve a typed object ──────────────────────────────────────────────
result: SearchResult = cache.get_object(KEY)  # SearchResult instance
print(result.total)                   # 3
print(result.next_page)               # None
print(result.fetched_at.isoformat())  # 2026-04-19T12:34:56+00:00

# ── 5. Query without mutating the payload ───────────────────────────────────
q = cache.get(KEY).query
print(q.get("total"))                 # 3
print(q.get("hits?role=staff.name"))  # ['Alice', 'Carol']
print(q.get("hits?name*"))            # ['Alice', 'Bob', 'Carol']
print(q.get("hits?role=staff", select_first=True)["name"])  # 'Alice'
member: StaffMember = q.get("hits?role=staff", select_first=True, cast=StaffMember)

# ── 6. Inspect the request log ──────────────────────────────────────────────
for entry in log.get_logs_from_last_seconds(60):
    print(entry.data["uri"], entry.data["status"])
```
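The request log above is an append-only JSONL file. As a rough plain-Python analogue of that pattern, the sketch below writes one JSON record per line under a lock and filters entries by a time window; the class, method names, and file name here are invented for illustration and are not RequestLogger's actual implementation.

```python
import json
import threading
import time
from pathlib import Path

class MiniRequestLog:
    """Append-only JSONL log with a lock for thread safety (illustrative sketch)."""

    def __init__(self, path: str):
        self.path = Path(path)
        self._lock = threading.Lock()

    def log(self, **fields) -> None:
        record = {"ts": time.time(), "data": fields}
        with self._lock:  # one writer at a time keeps each JSONL line intact
            with self.path.open("a", encoding="utf-8") as fh:
                fh.write(json.dumps(record) + "\n")

    def from_last_seconds(self, seconds: float) -> list[dict]:
        cutoff = time.time() - seconds
        with self.path.open(encoding="utf-8") as fh:
            return [rec for rec in map(json.loads, fh) if rec["ts"] >= cutoff]

log = MiniRequestLog("mini_requests.log")
log.log(uri="/api/search?q=python", status=200)
for rec in log.from_last_seconds(60):
    print(rec["data"]["uri"], rec["data"]["status"])
```

JSONL keeps appends cheap and crash-tolerant: each record is a complete line, so a partial write can corrupt at most the final entry.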
License
MIT