
Crochet

Versioned schema & data migrations for neomodel Neo4j graphs.

Crochet is a Git-backed, migration-driven framework that makes neomodel-defined Neo4j graphs evolvable, auditable, and rollback-safe without relying on database introspection.

Problem It Solves

  • neomodel has no native schema diff or migration system
  • Neo4j is schemaless, so schema drift is silent
  • Data loading and schema evolution are often intertwined but unmanaged
  • Rollbacks are usually impossible or unsafe
  • Git history and database state frequently diverge

Crochet enforces alignment between neomodel code, data ingests, and the live graph.

Installation

pip install crochet-migration

For development:

pip install -e ".[dev]"

Quick Start

1. Initialize a project

crochet new-project --name my-graph

This creates:

my-graph/
  crochet.toml          # project config
  models/               # neomodel definitions
  migrations/           # migration files
  .crochet/ledger.db    # SQLite ledger

2. Create node and relationship models

crochet create-node Person
crochet create-relationship Friendship --rel-type FRIENDS_WITH

Each model gets an immutable __kgid__ identifier. Models can be renamed or moved across files without losing identity, because the __kgid__ is what Crochet tracks — not class names or file paths.

# models/person.py
from neomodel import StructuredNode, StringProperty, IntegerProperty

class Person(StructuredNode):
    __kgid__ = "person_v1"
    name = StringProperty(required=True, unique_index=True)
    age = IntegerProperty(index=True)

3. Create a migration

crochet create-migration "add person node"

Crochet snapshots the current schema IR, diffs it against the previous snapshot, and scaffolds a migration file with detected changes as comments:

# migrations/0001_add_person_node.py

revision_id = "0001_add_person_node"
parent_id = None
schema_hash = "a1b2c3..."
rollback_safe = True

def upgrade(ctx):
    ctx.add_unique_constraint("Person", "name")
    ctx.add_index("Person", "age")

def downgrade(ctx):
    ctx.drop_index("Person", "age")
    ctx.drop_unique_constraint("Person", "name")

4. Apply migrations

crochet upgrade              # apply all pending
crochet upgrade --dry-run    # preview without executing
crochet upgrade --target 0001_add_person_node  # apply up to a specific revision

5. Revert migrations

crochet downgrade            # revert the most recent migration
crochet downgrade --target 0001_add_person_node  # revert down to a target

Migrations declared with rollback_safe = False will refuse to downgrade and raise an error.

6. Check status and verify

crochet status     # show applied/pending migrations, head, batches
crochet verify     # check ledger chain, file presence, schema hash consistency

Core Concepts

Intermediate Representation (IR)

neomodel files are parsed into an intermediate schema representation. IR snapshots can be hashed, serialized, and diffed. No Neo4j connection is required for schema comparison.
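Conceptually, a hashed, diffable snapshot works like the sketch below. This is an illustration of the idea, not Crochet's actual internals; the snapshot layout (labels mapped to property lists) is an assumption made for the example.

```python
import hashlib
import json

def schema_hash(snapshot: dict) -> str:
    """Deterministically hash a serialized schema snapshot."""
    # Canonical JSON (sorted keys) keeps the hash stable across dict ordering.
    canonical = json.dumps(snapshot, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()

def diff_snapshots(old: dict, new: dict) -> dict:
    """Compare two snapshots keyed by node label."""
    old_labels, new_labels = set(old), set(new)
    return {
        "added": sorted(new_labels - old_labels),
        "removed": sorted(old_labels - new_labels),
        "changed": sorted(
            label for label in old_labels & new_labels if old[label] != new[label]
        ),
    }

old = {"Person": {"props": ["name"]}}
new = {"Person": {"props": ["name", "age"]}, "Company": {"props": ["name"]}}
print(diff_snapshots(old, new))
# {'added': ['Company'], 'removed': [], 'changed': ['Person']}
```

Because both hashing and diffing operate on the serialized IR alone, neither needs a live database connection.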

Hash-Chained Migrations

Migrations are ordered by a parent chain (Alembic-style). Each migration records the schema hash at the time it was created, so drift between code and migrations is detectable.
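The chain itself is straightforward to validate: walking from the root (parent_id = None) forward must visit every migration exactly once. A minimal checker, sketched here for illustration (the record fields mirror the migration file shown earlier, but this is not Crochet's actual verifier):

```python
def verify_chain(migrations: list[dict]) -> list[str]:
    """Return revision ids in apply order, or raise if the chain is broken."""
    by_parent = {m["parent_id"]: m for m in migrations}
    order, parent = [], None  # the root migration has parent_id = None
    while parent in by_parent:
        current = by_parent[parent]
        order.append(current["revision_id"])
        parent = current["revision_id"]
    if len(order) != len(migrations):
        # A missing parent or a fork leaves some migrations unreachable.
        raise ValueError("broken or forked migration chain")
    return order

chain = [
    {"revision_id": "0002_add_age_index", "parent_id": "0001_add_person_node"},
    {"revision_id": "0001_add_person_node", "parent_id": None},
]
print(verify_chain(chain))
# ['0001_add_person_node', '0002_add_age_index']
```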

SQLite Ledger

A local SQLite database (.crochet/ledger.db) is the authoritative record of:

  • Applied migrations and their order
  • Dataset batches with file checksums and loader versions
  • Schema snapshots for diffing

Deterministic Data Ingest

Data loading is a first-class migration operation. The MigrationContext provides helpers for batch-tracked ingests:

def upgrade(ctx):
    batch_id = ctx.begin_batch()
    ctx.create_nodes("Person", [
        {"name": "Alice", "age": 30},
        {"name": "Bob", "age": 25},
    ])

Every node and relationship created through a batch is tagged with _crochet_batch, enabling delete-by-batch rollback.

Rollback Semantics

Rollbacks are explicitly declared, not assumed:

  • Append-only ingests support delete_nodes_by_batch / delete_relationships_by_batch
  • Destructive transforms must set rollback_safe = False
  • Unsafe downgrades are prevented by policy
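Putting the contract together: an append-only ingest can declare an explicit batch id on the way up and delete by that id on the way down. The migration below is a hypothetical sketch built from the context operations Crochet documents; the RecordingContext stub stands in for the real MigrationContext so the example runs without a database.

```python
# migrations/0002_seed_people.py (hypothetical example)
revision_id = "0002_seed_people"
parent_id = "0001_add_person_node"
rollback_safe = True

BATCH = "seed-people-0002"  # explicit id so downgrade can target it

def upgrade(ctx):
    ctx.begin_batch(BATCH)
    ctx.create_nodes("Person", [
        {"name": "Alice", "age": 30},
        {"name": "Bob", "age": 25},
    ])

def downgrade(ctx):
    # Append-only ingest, so rollback is a delete-by-batch.
    ctx.delete_nodes_by_batch("Person", BATCH)

class RecordingContext:
    """Stand-in for MigrationContext that records calls instead of
    touching Neo4j (illustration only)."""
    def __init__(self):
        self.calls = []
    def __getattr__(self, name):
        def op(*args):
            self.calls.append((name,) + args)
        return op

ctx = RecordingContext()
upgrade(ctx)
downgrade(ctx)
print([name for name, *_ in ctx.calls])
# ['begin_batch', 'create_nodes', 'delete_nodes_by_batch']
```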

Migration Context Operations

The MigrationContext passed to upgrade() and downgrade() provides:

Operation                                             Description
add_unique_constraint(label, prop)                    Create a uniqueness constraint
drop_unique_constraint(label, prop)                   Drop a uniqueness constraint
add_node_property_existence_constraint(label, prop)   Create a NOT NULL constraint
drop_node_property_existence_constraint(label, prop)  Drop a NOT NULL constraint
add_index(label, prop)                                Create an index
drop_index(label, prop)                               Drop an index
rename_label(old, new)                                Rename a node label
rename_relationship_type(old, new)                    Rename a relationship type
add_node_property(label, prop, default)               Add a property with an optional default
remove_node_property(label, prop)                     Remove a property
rename_node_property(label, old, new)                 Rename a property
run_cypher(cypher, params)                            Execute raw Cypher
begin_batch(batch_id)                                 Start a tracked data batch
create_nodes(label, data)                             Batch-create nodes
create_relationships(src, tgt, type, data)            Batch-create relationships
delete_nodes_by_batch(label, batch_id)                Delete nodes by batch
delete_relationships_by_batch(type, batch_id)         Delete relationships by batch
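As a worked example of these operations, a reversible property rename plus a backfilled property might look like the hypothetical migration below; a MagicMock stands in for the real context so the calls can be inspected without a database.

```python
from unittest.mock import MagicMock

# migrations/0003_rename_name_property.py (hypothetical example)
revision_id = "0003_rename_name_property"
parent_id = "0001_add_person_node"
rollback_safe = True

def upgrade(ctx):
    ctx.rename_node_property("Person", "name", "full_name")
    ctx.add_node_property("Person", "nickname", "")  # default of ""

def downgrade(ctx):
    # Reverse the operations in the opposite order.
    ctx.remove_node_property("Person", "nickname")
    ctx.rename_node_property("Person", "full_name", "name")

ctx = MagicMock()
upgrade(ctx)
ctx.rename_node_property.assert_called_once_with("Person", "name", "full_name")
```

Note that downgrade() undoes the operations in reverse order, matching the 0001 example earlier.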

Configuration

crochet.toml:

[project]
name = "my-graph"
models_path = "models"
migrations_path = "migrations"

[neo4j]
uri = "bolt://localhost:7687"
username = "neo4j"

[ledger]
path = ".crochet/ledger.db"

Neo4j credentials can be overridden with environment variables:

  • CROCHET_NEO4J_URI
  • CROCHET_NEO4J_USERNAME
  • CROCHET_NEO4J_PASSWORD

CLI Reference

Command                           Description
crochet new-project               Initialize a new Crochet project
crochet create-node NAME          Scaffold a StructuredNode model
crochet create-relationship NAME  Scaffold a StructuredRel model
crochet create-migration DESC     Create a new migration file
crochet upgrade                   Apply pending migrations
crochet downgrade                 Revert the most recent migration
crochet status                    Show migration status
crochet verify                    Run verification checks

Design Principles

  • No hidden magic — all changes are explicit migration files
  • Code > database state — neomodel files are the source of truth
  • Determinism over convenience — schema IR is hashed and diffed
  • Rollback is a contract, not a guess — explicitly declared per migration
  • Git history and graph state must agree — ledger + hash chains enforce this

Development

pip install -e ".[dev]"
pytest

License

MIT
