# redis-queen

Redis schema migration tool with Pydantic model tracking.
Schema migration tool for Redis-backed Pydantic models. Track model changes, generate versioned migration scripts, and apply them using async SCAN-based key iteration.
## Installation

```toml
# pyproject.toml
dependencies = ["redis-queen"]
```
## Quick start

### 1. Decorate your models
```python
from typing import Annotated

from pydantic import BaseModel, RootModel

from redis_queen import redis_queen, migrates_from


@redis_queen(key="user:{user_id}:profile:")
class UserProfile(BaseModel):
    name: str
    email: str


class DisplayMessage(BaseModel):
    """Illustrative element model; define your own fields."""
    role: str
    content: str


@redis_queen(key="agent:session:{session_id}:display")
class DisplayMessages(RootModel[list[DisplayMessage]]):
    """Stored as a JSON array."""


# Field renames
@redis_queen(key="item:{item_id}")
class Item(BaseModel):
    title: Annotated[str, migrates_from("name")]  # tracks rename from "name"
```
Key patterns use f-string-style placeholders. The SCAN `match` pattern is auto-derived by replacing each `{...}` placeholder with `*`.
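As an illustration, the placeholder-to-glob derivation can be sketched with the standard library (this is an assumption about the behavior; `scan_match` is not part of the library's API):

```python
import re


def scan_match(template: str) -> str:
    """Replace each {placeholder} with * to build a glob for SCAN MATCH."""
    return re.sub(r"\{[^{}]*\}", "*", template)


scan_match("user:{user_id}:profile:")  # -> "user:*:profile:"
scan_match("item:{item_id}")           # -> "item:*"
```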
### 2. Configure in `pyproject.toml`

```toml
[tool.redis-queen]
migrations_dir = "migrations"
deletion_protection = false  # or true, or an integer (TTL in seconds)

[tool.redis-queen.profiles.default]
default = true
host = "localhost"  # defaults to "localhost"
port = 6379  # defaults to 6379
db = 0  # defaults to 0
migrations_collection = "my_collection"
model_search_path = ["myapp.models"]
```
All string values support `$ENV_VAR` expansion.
Profile resolution order: the `--profile` CLI flag, then the `REDIS_QUEEN_PROFILE` environment variable, then the profile with `default = true`.
Multiple profiles (e.g. local + Docker):

```toml
[tool.redis-queen.profiles.default]
default = true
migrations_collection = "agent"
model_search_path = ["agent.types"]

[tool.redis-queen.profiles.docker]
host = "redis"
migrations_collection = "agent"
model_search_path = ["agent.types"]
```
### 3. Generate and apply migrations

```shell
# Create the initial schema snapshot
redis-queen revision -m "initial"

# Show current state
redis-queen show

# After changing models, generate a new revision
redis-queen revision -m "add email field"

# Check for pending changes (exit 1 if none, useful in CI)
redis-queen revision --check

# Show what changed
redis-queen diff

# Apply pending migrations (stages first, prompts for confirmation)
redis-queen up

# Apply without confirmation
redis-queen up --auto-apply

# Downgrade to a specific revision
redis-queen down 0001
redis-queen down --root  # revert all

# Reset to a specific revision (upgrading or downgrading as needed)
redis-queen reset 0002
redis-queen reset --root
```
All mutating commands (`up`, `down`, `reset`) support `--auto-apply` to skip the staging confirmation prompt.
## Model discovery

`model_search_path` accepts:

- Directories: `"src/myapp/models"` -- walks all `.py` files
- Single files: `"src/myapp/models.py"`
- Dotted module paths: `"myapp.models"` -- imports the module and recursively walks all submodules
## Schema tracking

Snapshots use `model_json_schema()`, which captures the full recursive schema, including nested models. A change to any nested model triggers a new revision.
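One way to picture the snapshot comparison (a stdlib-only sketch, not necessarily the library's actual mechanism): hash the canonicalized JSON schema, so a change anywhere in the nested structure yields a different digest.

```python
import hashlib
import json


def schema_fingerprint(schema: dict) -> str:
    """Stable digest of a JSON schema; any nested change alters it."""
    canonical = json.dumps(schema, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()


v1 = {"properties": {"name": {"type": "string"}}}
v2 = {"properties": {"name": {"type": "string"}, "email": {"type": "string"}}}
schema_fingerprint(v1) != schema_fingerprint(v2)  # True -> a new revision is needed
```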
## Generated migrations

Revisions auto-generate upgrade and downgrade functions:

- Field added with a default -- sets the default value
- Field added, optional (factory) -- infers a zero value from the type (`[]`, `{}`, `""`, `0`, etc.)
- Field added, required, no default -- generates a `# TODO` comment
- Field removed -- backs up values, then deletes them from the data
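The zero-value inference for optional fields might look roughly like this (a hypothetical sketch of the rule described above, not the library's implementation):

```python
def zero_value(tp: type):
    """Map a field's type to the zero value used when the field is added."""
    zeros = {list: [], dict: {}, str: "", int: 0, float: 0.0, bool: False}
    return zeros.get(tp)  # None for types with no obvious zero value


zero_value(list)  # -> []
zero_value(str)   # -> ""
zero_value(int)   # -> 0
```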
## Deletion protection

Controls what happens when fields are removed:

```toml
[tool.redis-queen]
deletion_protection = false  # backup without TTL (default)
deletion_protection = true   # generate a TODO, don't delete
deletion_protection = 3600   # backup with a 1-hour TTL
```
With an integer TTL, backups expire after that many seconds. If a downgrade runs after expiry, a warning is emitted for each key whose backup has expired and whose fields therefore cannot be restored.
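The backup-on-removal behavior can be sketched in pure Python (a dict stands in for Redis here, and the backup key shape is invented for illustration):

```python
import json


def remove_field(store: dict, key: str, field: str, deletion_protection) -> None:
    """Drop `field` from the JSON value at `key`, honoring the setting."""
    data = json.loads(store[key])
    if field not in data:
        return
    if deletion_protection is True:
        return  # a TODO is left for the operator; nothing is deleted
    # hypothetical backup key shape; the real tool may differ
    store[f"backup:{key}:{field}"] = json.dumps(data.pop(field))
    store[key] = json.dumps(data)
    # with an integer setting, the real tool would also EXPIRE the backup


store = {"item:1": json.dumps({"name": "widget", "legacy": 42})}
remove_field(store, "item:1", "legacy", deletion_protection=False)
store["item:1"]  # -> '{"name": "widget"}'
```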
## Python API

```python
from redis_queen import (
    redis_queen,
    migrates_from,
    auto_migrate_up,
    migrate_up,
    migrate_down,
    apply_plan,
    find_one,
    get_one,
)
```
### Auto-migrate on startup

```python
from redis_queen import auto_migrate_up

# Resolves config, connects to Redis, and applies all pending migrations.
# Uses the REDIS_QUEEN_PROFILE env var or the default profile.
await auto_migrate_up()
```
### FastAPI lifespan example

```python
from contextlib import asynccontextmanager

from fastapi import FastAPI

from redis_queen import auto_migrate_up


@asynccontextmanager
async def lifespan(app: FastAPI):
    await auto_migrate_up()  # apply pending migrations before serving
    yield


app = FastAPI(lifespan=lifespan)
```
### Query utilities

```python
# find_one: format the key pattern with args/kwargs, then GET
profile = await find_one(UserProfile, "123")
profile = await find_one(UserProfile, user_id="123")
raw = await find_one(UserProfile, "123", return_raw=True)

# get_one: GET by full literal key
profile = await get_one(UserProfile, "user:123:profile:")
raw = await get_one(UserProfile, "user:123:profile:", return_raw=True)
```
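The key construction step of `find_one` can be pictured like this (a hypothetical sketch: the assumption is that positional args fill the pattern's placeholders in order, and keyword args fill them by name):

```python
import string


def build_key(pattern: str, *args, **kwargs) -> str:
    """Fill a key pattern like "user:{user_id}:profile:" before a GET."""
    if args:
        # extract placeholder names in order, then zip the positional args in
        fields = [f for _, f, _, _ in string.Formatter().parse(pattern) if f]
        kwargs = dict(zip(fields, args))
    return pattern.format(**kwargs)


build_key("user:{user_id}:profile:", "123")        # -> "user:123:profile:"
build_key("user:{user_id}:profile:", user_id="123")  # -> "user:123:profile:"
```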
### Low-level API

For full control, use `migrate_up` / `migrate_down` directly with an explicit Redis client, migrations directory, and model list. See the CLI command implementations for examples.
## Project details
### File details: `redis_queen-0.2.0.tar.gz`

- Size: 46.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
#### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `4f705c4c0cd2f001e109d5b3221425d24585f1b902604fddc32a652f8fe5412c` |
| MD5 | `cf2da99ae0253bb09e9696338bd23206` |
| BLAKE2b-256 | `aa34b3fe5d449c8c6246fdcb751774c53195e3e289d6679cf240424165d47d90` |
#### Provenance

The following attestation bundle was made for `redis_queen-0.2.0.tar.gz`:

- Publisher: `release-please.yml` on mahdilamb/redis-queen
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: redis_queen-0.2.0.tar.gz
- Subject digest: `4f705c4c0cd2f001e109d5b3221425d24585f1b902604fddc32a652f8fe5412c`
- Sigstore transparency entry: 1280662217
- Permalink: mahdilamb/redis-queen@844dce31df3ebf7f79c4ba582bdcae0d7c960245
- Branch / Tag: refs/heads/main
- Owner: https://github.com/mahdilamb
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release-please.yml@844dce31df3ebf7f79c4ba582bdcae0d7c960245
- Trigger Event: push
### File details: `redis_queen-0.2.0-py3-none-any.whl`

- Size: 27.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
#### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `713353cfe1c3c0ab1189cc2b34fc6968c58f1bef8c3225ca91dd04bc65a3f31c` |
| MD5 | `99cedcbbb594c8a14068446ef8de06e0` |
| BLAKE2b-256 | `d8b9dbf37f55e38447fa7fcb01bdd0f4adc8c1361ca0725a0dce828efec0100e` |
#### Provenance

The following attestation bundle was made for `redis_queen-0.2.0-py3-none-any.whl`:

- Publisher: `release-please.yml` on mahdilamb/redis-queen
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: redis_queen-0.2.0-py3-none-any.whl
- Subject digest: `713353cfe1c3c0ab1189cc2b34fc6968c58f1bef8c3225ca91dd04bc65a3f31c`
- Sigstore transparency entry: 1280662221
- Permalink: mahdilamb/redis-queen@844dce31df3ebf7f79c4ba582bdcae0d7c960245
- Branch / Tag: refs/heads/main
- Owner: https://github.com/mahdilamb
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release-please.yml@844dce31df3ebf7f79c4ba582bdcae0d7c960245
- Trigger Event: push