
diskdantic

Instead of having an ORM on top of a database, why not have a collection on top of a folder?

Disk-backed collections powered by Pydantic models. This is pretty much the whole API:

from datetime import date
from pydantic import BaseModel
from diskdantic import Collection


class BlogPost(BaseModel):
    title: str
    date: date
    tags: list[str]
    draft: bool = False
    content: str


posts = Collection(
    BlogPost,
    path="./blog/posts",
    format="markdown",    # required when the folder is empty
    body_field="content", # required when format is markdown (for the body)
)

recent = posts.filter(lambda post: not post.draft).order_by("-date").head(3)
for post in recent:
    print(post.title)

new_post = BlogPost(
    title="Hello World",
    date=date.today(),
    tags=["intro"],
    content="# Hello\n\nIt works!",
)
posts.add(new_post)
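For reference, `new_post` above would presumably end up on disk as a markdown file with YAML frontmatter holding the non-body fields and `content` as the body. The exact filename and frontmatter layout shown here are assumptions, not documented behavior:

```markdown
---
title: Hello World
date: 2025-01-01   # whatever date.today() returned
tags:
  - intro
draft: false
---
# Hello

It works!
```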

There are also loads of utility functions:

# Get all published posts
published = posts.filter(lambda post: not post.draft)

# Get posts with specific tag
intro_posts = posts.filter(lambda post: "intro" in post.tags)

# Chain filters
recent_published = posts.filter(lambda p: not p.draft).filter(lambda p: p.date.year == 2025)

It's meant to work with markdown files, but it should also work with YAML and JSON.

API Reference

Query Methods

  • filter(predicate) - Filter items by a predicate function
  • order_by(field) - Sort by field (prefix with - for descending)
  • head(n=5) - Get first n items
  • tail(n=5) - Get last n items
  • to_list() - Materialize query to list
  • count() - Count matching items
  • first() - Get first item or None
  • last() - Get last item or None
  • exists(predicate=None) - Check if any items match
  • get(filename) - Load specific file by name
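
To make the chaining semantics concrete, here is a minimal re-implementation sketch of `filter`/`order_by`/`head` over an in-memory list. Everything here (the `Query` class, the `Post` dataclass) is an illustrative stand-in, not diskdantic's actual internals:

```python
from dataclasses import dataclass
from typing import Callable, Iterable, List


class Query:
    """Illustrative sketch only -- not diskdantic's actual internals."""

    def __init__(self, items: Iterable):
        self._items = list(items)

    def filter(self, predicate: Callable) -> "Query":
        # Keep only items for which the predicate is truthy.
        return Query(item for item in self._items if predicate(item))

    def order_by(self, field: str) -> "Query":
        # A leading "-" requests descending order, mirroring the API above.
        reverse = field.startswith("-")
        key_name = field[1:] if reverse else field
        ordered = sorted(self._items, key=lambda i: getattr(i, key_name), reverse=reverse)
        return Query(ordered)

    def head(self, n: int = 5) -> List:
        return self._items[:n]


@dataclass
class Post:
    title: str
    year: int
    draft: bool


posts = Query([Post("a", 2024, False), Post("b", 2025, True), Post("c", 2025, False)])
top = posts.filter(lambda p: not p.draft).order_by("-year").head(2)
print([p.title for p in top])  # ['c', 'a']
```

Each call returns a fresh `Query`, which is what makes the chained style in the examples above work.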

Lifecycle Methods

  • add(model, path=None) - Add new model to collection (returns Path)
  • update(model) - Update existing model on disk (returns Path)
  • upsert(model) - Add if new, update if exists (returns Path)
  • delete(target) - Delete by model, filename, or Path
  • refresh(model) - Reload model from disk
  • path_for(model) - Get disk path for a model
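
The `add`/`update`/`upsert` distinction can be sketched with a toy dict-backed store. Everything here — the `Store` class, its dict storage, the `.md` naming scheme — is a hypothetical stand-in; diskdantic itself writes real files:

```python
from pathlib import Path


class Store:
    """Toy stand-in for a collection: a dict keyed by Path instead of real files."""

    def __init__(self) -> None:
        self._files: dict[Path, str] = {}

    def path_for(self, name: str) -> Path:
        # Hypothetical naming scheme: one markdown file per item.
        return Path(f"{name}.md")

    def add(self, name: str, content: str) -> Path:
        path = self.path_for(name)
        if path in self._files:
            raise FileExistsError(path)  # add() refuses to clobber an existing item
        self._files[path] = content
        return path

    def update(self, name: str, content: str) -> Path:
        path = self.path_for(name)
        if path not in self._files:
            raise FileNotFoundError(path)  # update() requires an existing item
        self._files[path] = content
        return path

    def upsert(self, name: str, content: str) -> Path:
        # upsert = add if new, update if it already exists.
        path = self.path_for(name)
        self._files[path] = content
        return path


store = Store()
first = store.upsert("hello-world", "v1")   # behaves like add()
second = store.upsert("hello-world", "v2")  # behaves like update()
print(first == second, store._files[first])  # True v2
```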

Iteration

Collections are iterable and yield model instances:

for post in posts:
    print(post.title)

Why?

It makes it easier to write a custom CMS on top of your disk, which is nice. But it also feels like a fun thing that should exist. It's mainly a fun experiment for now, but there are a few areas where it could get better:

  1. Figure out a nice API for nested collections. The library has one now, but it's undocumented for a reason.
  2. Maybe make it more performant by seeing how far I can push the lazy loading. Though I doubt this will be worth it.
