
GALAHAD-TemporalSlicer

CI PyPI License: MIT

A unified temporal reasoning core for AI systems: bitemporal storage, causal DAG, Allen interval algebra, branching projections, lifecycle, a first-class LLM tool-call adapter, binary persistence, and Python bindings. C++20 core, usable from C++ or Python. Research-grade.

GALAHAD exists because time is the weakest part of most AI systems. LLMs hallucinate dates. Robots conflate "now" with "last observed." Planners treat the future as an untyped blob. Bitemporal databases can tell you what was known when, but they cannot reason about causation. Causal graph libraries can trace "why" but do not understand intervals. Allen's interval algebra can say "during" but cannot branch into competing futures. There is no open-source project that unifies all four. GALAHAD is that unification.

Long-term target: the temporal substrate for serious AI — LLM tool-call surfaces, robotics world models, simulation and planning systems, agent reasoning loops. Time becomes a first-class capability, not an afterthought.

The three questions GALAHAD answers

Every interesting temporal question an AI asks reduces to one of three shapes:

  1. "Why did that happen?" — causal ancestry with Allen-relation constraints, filtered by what the system could have known at the time.
  2. "What did the system believe at that moment?" — bitemporal as-of queries over events, separating valid time (when the event was true in the world) from transaction time (when the system learned of it).
  3. "What should happen next?" — branching projections. Multiple competing futures coexist in the same store without contaminating ground truth, each with its own confidence, each promotable to reality when observation confirms it.

A single engine answers all three. The same event store holds ground truth and hypotheticals. The same query path filters by as-of time, causal graph position, and branch identity. This is the structural moat.
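The bitemporal distinction behind question 2 can be sketched in a few lines of plain Python. This is an illustration only — the field names mirror GALAHAD's, but this is not the C++ implementation:

```python
from dataclasses import dataclass

@dataclass
class Event:
    id: str
    valid_from: int   # world time: when the event started being true
    valid_to: int     # world time: when it stopped being true
    recorded_at: int  # transaction time: when the system learned of it

def as_of_query(events, window_start, window_end, as_of):
    """Events true in the world during the window that the system
    had already recorded by `as_of`."""
    return [
        e for e in events
        if e.recorded_at <= as_of       # known by then
        and e.valid_from < window_end   # interval overlaps the window
        and e.valid_to > window_start
    ]

events = [
    Event("perceive", 0, 5, 0),
    Event("act", 20, 30, 200),  # late-arriving: recorded long after it happened
]
# "act" was true in the world at t=20-30, but a query replaying what the
# system knew at t=100 cannot see it yet.
known_at_100 = as_of_query(events, 0, 500, as_of=100)
```

The two time axes filter independently: the window constrains valid time, `as_of` constrains transaction time.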

Architecture

Five components, listed from the outermost binding down to the core:

python/      galahad             — pybind11 bindings: full C++ surface from Python
adapters/    LLMToolAdapter      — JSON tool-call surface for any LLM framework
persistence/ TemporalPersistence — binary save/load with full round-trip
engine/      TemporalEngine      — high-level reasoning: explain(), whatHappenedDuring(), knobs
core/        TemporalCore        — the substrate: bitemporal events, causal DAG, Allen algebra,
                                    branching projections, lifecycle ops, auto-maintained indices

TemporalCore is a self-contained C++ library. TemporalEngine wraps it with higher-level reasoning primitives and is the intended consumer API. LLMToolAdapter exposes the reasoning surface as a vendor-neutral JSON tool-call API so any agent framework (Anthropic, OpenAI, LangChain, custom loops) can reach for GALAHAD with zero custom glue.

What's in the core

  • Bitemporal event model. Every event has valid_from/valid_to (world time) and recorded_at (transaction time). Queries can ask "what was true at time T" and "what did the system know at time T" independently.
  • Causal DAG. Events cite their causes via causal_links. The core auto-maintains both the forward causal graph and its reverse, detects cycles, and exposes transitive closure (getAncestors, getDescendants) and direct-edge queries (getCauses, getEffects).
  • Allen's 13 interval relations as first-class predicates. getAllenRelation(a,b) returns one of Precedes, Meets, Overlaps, FinishedBy, Contains, Starts, Equals, StartedBy, During, Finishes, OverlappedBy, MetBy, PrecededBy. findRelated(id, relation) returns every event standing in that relation to a source. Dispatches on the relation family to walk only the relevant slice of the time index.
  • Branching projections. addProjection(event) stores a hypothetical on a named branch. Queries filter by branch: nullopt = all non-refuted branches, "main" = ground truth only, "X" = branch X only. Branches are fully isolated — two competing futures cannot see each other.
  • Branch lifecycle. promoteBranch(name) appends main-branch copies of every event on that branch with a monotonic recorded_at, preserving the originals so as-of queries before the promotion still see the projection form. refuteBranch marks a branch as falsified without destroying its events (queryable by explicit name, hidden from default queries). pruneBranch destructively removes events on a branch. Main branch is protected from promote/refute/prune.
  • Monotonic transaction time. TemporalCore::now() returns a strictly-increasing TimePoint even under burst calls. computePromoteTime() additionally guarantees the new timestamp exceeds every existing event's recorded_at so bitemporal ordering stays consistent through lifecycle operations.
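The monotonic-clock guarantee in the last bullet is a small but load-bearing invariant. A hedged Python sketch of the idea (integer nanoseconds stand in for the core's C++ TimePoints; `MonotonicClock` is an illustrative name, not the real class):

```python
import time

class MonotonicClock:
    """Strictly increasing timestamps, even when the wall clock
    returns the same value twice under burst calls."""
    def __init__(self):
        self._last = 0

    def now(self) -> int:
        t = time.time_ns()
        if t <= self._last:
            t = self._last + 1          # bump past the previous stamp
        self._last = t
        return t

    def compute_promote_time(self, events) -> int:
        """Promotion timestamps must also exceed every existing
        recorded_at so bitemporal ordering survives lifecycle ops."""
        t = self.now()
        latest = max((e["recorded_at"] for e in events), default=0)
        if t <= latest:
            t = latest + 1
            self._last = t
        return t

clock = MonotonicClock()
stamps = [clock.now() for _ in range(1000)]   # no duplicates even in a tight loop
```

Without the second method, a promoted event could carry a recorded_at earlier than an event already in the store, and as-of replays would silently reorder history.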

What's in the engine

  • explain(id) — full causal ancestry of an event, returned as vector<TemporalEvent> sorted by valid_from. Includes a require_completed_before knob (default true) that excludes ancestors whose valid_to extends past the target's valid_from. This lets callers choose between "causal history" (completed causes) and "causal context" (ongoing conditions). Fully bitemporal: as_of parameter replays the system's past belief state.
  • whatHappenedDuring(window) — events overlapping a time window, branch-filtered, returned as full TemporalEvent objects.
  • explainWith(target, mutation) — hypothetical explain: clones the core, applies a mutation, then runs explain on the fork without touching the original. Ask "if this event had happened, what would its causal explanation look like?" (see Counterfactual queries below).
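The require_completed_before knob can be illustrated with a toy ancestry filter. This is a sketch under assumed dict field names, not the engine's code:

```python
def explain_filter(ancestors, target, require_completed_before=True):
    """Sort causal ancestors by valid_from; optionally drop any whose
    interval extends past the target's start ("ongoing conditions"
    vs "completed causes")."""
    causes = sorted(ancestors, key=lambda e: e["valid_from"])
    if require_completed_before:
        causes = [e for e in causes if e["valid_to"] <= target["valid_from"]]
    return causes

target = {"id": "act", "valid_from": 20, "valid_to": 30}
ancestors = [
    {"id": "decide",  "valid_from": 15, "valid_to": 20},
    {"id": "weather", "valid_from": 0,  "valid_to": 100},  # still ongoing at t=20
]
history = explain_filter(ancestors, target)   # completed causes only
context = explain_filter(ancestors, target, require_completed_before=False)
```

With the knob on, "weather" is excluded because its valid_to extends past the target's valid_from; with it off, ongoing conditions appear in the explanation.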

Counterfactual queries (new in 0.2.0)

GALAHAD is the only OSS temporal engine with first-class counterfactual primitives because it already has what they require: an event-level refutation set and bitemporal semantics. Two new operations, plus the clone primitive that enables them, expose this:

  • TemporalCore::whyNot(id) — "why did this event NOT happen?" Returns one entry per refuted projection branch that contained a prediction of the given event id. Each entry includes the refuted branch name, the predicted event (with its metadata and confidence), and the would-have-been causal ancestry walked across all branches so you see the full hypothetical chain — not just the subset that survived refutation. Empty if the event actually happened on main, if no prediction of it was ever made, or if every predicting branch was promoted rather than refuted.

  • TemporalEngine::explainWith(target, mutation) — hypothetical explain. Clone the core, apply the mutation as a fresh addEvent, run explain(target) on the fork. Original core untouched. Use this to probe causal hypotheses without committing them — "if I had added this event, what would its ancestry have been?"

  • TemporalCore::clone() — the enabling primitive. Deep copies the full core state (events, indices, pools, refutation set, clock) into an independent TemporalCore. All members are value types, so the implicit copy constructor does a true deep copy with zero shared state. explainWith is a thin wrapper on top of this.
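The clone-then-mutate pattern behind explainWith is easy to mimic in plain Python. A sketch using copy.deepcopy over a toy dict store (`Store` and `explain_with` are illustrative names; the real core deep-copies value-typed C++ members):

```python
import copy

class Store:
    def __init__(self):
        self.events = {}  # id -> {"causal_links": [...]}

    def add_event(self, eid, causal_links=()):
        self.events[eid] = {"causal_links": list(causal_links)}

    def ancestors(self, eid):
        """Transitive causal ancestry via a simple graph walk."""
        seen, stack = set(), list(self.events[eid]["causal_links"])
        while stack:
            cur = stack.pop()
            if cur not in seen and cur in self.events:
                seen.add(cur)
                stack.extend(self.events[cur]["causal_links"])
        return seen

def explain_with(store, target, mutation_id, mutation_links):
    """Hypothetical explain: fork the store, add the mutation,
    re-run ancestry. The original store is untouched."""
    fork = copy.deepcopy(store)
    fork.add_event(mutation_id, mutation_links)
    return fork.ancestors(target)

store = Store()
store.add_event("perceive")
store.add_event("act", ["perceive"])
# "If an alarm response caused by 'act' had happened, what would
# its causal ancestry have been?"
hypo = explain_with(store, "alarm_response", "alarm_response", ["act"])
```

The fork sees the full hypothetical chain; the original store never learns the mutation existed.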

The LLM adapter exposes two new JSON tools for counterfactuals: why_not and explain_with, bringing the adapter's vendor-neutral tool count to 19. An agent can now ask "what refuted branches predicted this event?" and "what would the causal chain look like if this mutation had happened?" with one tool call each, in addition to all the existing forward queries.

What's in the adapter

  • LLMToolAdapter — a vendor-neutral JSON tool-call surface over the full reasoning API. Two entry points:
    • getToolSchemas() returns an Anthropic-style array of {name, description, input_schema} descriptors any LLM framework can register. Nineteen tools: now, add_event, add_projection, get_event, query_range, explain, explain_with, what_happened_during, get_ancestors, get_descendants, get_causes, get_effects, find_related, why_not, promote_branch, refute_branch, prune_branch, is_refuted, and list_tools.
    • handleToolCall(name, args) dispatches and returns a structured envelope: {"ok": true, "result": ...} on success or {"ok": false, "error": "..."} on failure. Every exception path is caught and surfaced as JSON — the LLM never sees a crash.
  • Timestamp round-tripping. Every result carries both ISO 8601 UTC strings ("2026-04-14T12:00:00.000Z") and int64 nanoseconds since epoch. Inputs accept either. LLMs read ISO fluently; the int64 form guarantees exact round-trip for agent tool-chaining.
  • Allen relation names are exposed as snake_case strings (precedes, meets, during, met_by, etc.) so a model reaches for them by their natural names.
  • Dependency: nlohmann/json v3.11.3, pulled via CMake FetchContent. Zero manual setup.
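The dual-timestamp convention is worth seeing concretely. A Python sketch of the format described above — ISO 8601 with millisecond precision alongside exact int64 nanoseconds (function names are illustrative, not the adapter's API):

```python
from datetime import datetime, timezone

def ns_to_iso(ns: int) -> str:
    """int64 nanoseconds since epoch -> "2026-04-14T12:00:00.000Z" form."""
    secs, rem = divmod(ns, 1_000_000_000)
    millis = rem // 1_000_000
    dt = datetime.fromtimestamp(secs, tz=timezone.utc)
    return dt.strftime("%Y-%m-%dT%H:%M:%S") + f".{millis:03d}Z"

def iso_to_ns(s: str) -> int:
    """ISO string back to int64 nanoseconds."""
    dt = datetime.strptime(s, "%Y-%m-%dT%H:%M:%S.%fZ").replace(tzinfo=timezone.utc)
    whole = int(dt.replace(microsecond=0).timestamp())
    return whole * 1_000_000_000 + dt.microsecond * 1_000

ns = 1_700_000_000_123_000_000
iso = ns_to_iso(ns)   # millisecond-aligned values round-trip exactly
```

Note why both forms exist: the ISO string carries only millisecond precision, so sub-millisecond values cannot survive a string round-trip — the int64 channel is what guarantees exact round-tripping for agent tool-chaining.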

Quick start

#include "temporal_core.h"
#include "temporal_engine.h"

using namespace galahad;
using namespace std::chrono;

int main() {
    TemporalCore core;
    TemporalEngine engine(core);
    const auto t0 = Clock::now();
    auto ms = [](int n) { return milliseconds(n); };

    // Ground truth: an agent perceives, infers, decides, acts.
    core.addEvent({"perceive", t0,        t0+ms(5),  t0,
                   "perception", {{"obs","door_open"}}, {}});
    core.addEvent({"infer",    t0+ms(5),  t0+ms(15), t0+ms(5),
                   "inference", {{"why","wind"}}, {"perceive"}});
    core.addEvent({"decide",   t0+ms(15), t0+ms(20), t0+ms(15),
                   "decision",  {{"choose","close"}}, {"infer"}});
    core.addEvent({"act",      t0+ms(20), t0+ms(30), t0+ms(200),
                   "action",    {{"do","close"}}, {"decide"}});

    // "Why did the agent act?"
    auto why = engine.explain("act");
    // why.causes == [perceive, infer, decide] in temporal order

    // "What did the system believe at t0+100ms?" The action was only
    // recorded at t0+200ms, so it is invisible to this past query.
    auto past = core.queryRange({t0, t0+ms(500)}, t0+ms(100));
    // past contains perceive, infer, decide — but not act

    // "What should happen next?" Two competing projected futures.
    TemporalEvent futClose;
    futClose.id = "fut_close";
    futClose.valid_from = t0+ms(30);
    futClose.valid_to   = t0+ms(40);
    futClose.recorded_at = t0+ms(20);
    futClose.type = "projected_action";
    futClose.causal_links = {"act"};
    futClose.branch_id = "close_door";
    futClose.confidence = 0.7;
    core.addProjection(futClose);

    TemporalEvent futIgnore = futClose;
    futIgnore.id = "fut_ignore";
    futIgnore.branch_id = "ignore";
    futIgnore.confidence = 0.3;
    core.addProjection(futIgnore);

    // Branch-scoped query: only the "close_door" future.
    auto close_only = core.queryRange(
        {t0, t0+ms(500)}, std::nullopt,
        std::optional<std::string>{"close_door"});

    // Observation confirms one future. Promote it to ground truth and
    // refute the alternative.
    core.promoteBranch("close_door");
    core.refuteBranch("ignore");

    // Bitemporal honesty preserved: as-of before the promotion, the
    // system's past belief still sees the projection form, not the
    // promoted main-branch copy.
}

LLM integration

Any tool-calling LLM framework can drive GALAHAD with a few lines of glue. The adapter speaks vendor-neutral JSON, so the same code wires up to Anthropic SDK tool_use, OpenAI function calling, LangChain, or a custom agent loop.

#include "llm_tool_adapter.h"
using namespace galahad;
using nlohmann::json;

TemporalCore core;
TemporalEngine engine(core);
LLMToolAdapter adapter(core, engine);

// 1. Register tools with your LLM framework.
json schemas = adapter.getToolSchemas();
// schemas is an array of 19 {name, description, input_schema} objects.
// Hand it to your framework's tool registry.

// 2. Dispatch the LLM's tool calls.
//    When the model picks a tool, forward the JSON args:
json result = adapter.handleToolCall(
    "explain",
    json{{"id", "act"}}
);
// result == {"ok": true, "result": {"causes": [...], "completed_before_target": true}}

An agent loop using GALAHAD walks the full perceive → project → observe → reconcile cycle through tool calls alone:

  1. now — get a monotonic transaction-time anchor
  2. add_event — record a perception with causal links and metadata
  3. explain — ask the core why a downstream event happened, read the chain back
  4. add_projection — stash a hypothetical future on a named branch with confidence
  5. query_range with branch: "<name>" — inspect a specific projected future
  6. promote_branch / refute_branch — reconcile projection with observation
  7. Ask explain again with as_of: <earlier time> — replay the system's past belief state, honestly

No custom C++ integration, no bespoke serialization, no timestamp hallucination. That full loop did not exist in any open-source project before GALAHAD.
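The contract that makes this loop safe is the error envelope: every dispatch returns structured JSON, never a raised exception. A mock dispatcher showing the shape (handler names and bodies here are hypothetical stand-ins, not GALAHAD's tools):

```python
# Toy handlers standing in for the real tool surface.
HANDLERS = {
    "now": lambda args: {"ns": 1},
    "explain": lambda args: {"causes": ["perceive", "infer", "decide"]},
}

def handle_tool_call(name, args):
    """Dispatch and wrap: every exception path becomes a structured
    error envelope, so the LLM never sees a crash."""
    try:
        if name not in HANDLERS:
            raise KeyError(f"unknown tool: {name}")
        return {"ok": True, "result": HANDLERS[name](args)}
    except Exception as e:
        return {"ok": False, "error": str(e)}

good = handle_tool_call("explain", {"id": "act"})
bad = handle_tool_call("frobnicate", {})   # model picked a nonexistent tool
```

An agent loop can branch on `"ok"` alone: feed the result back on success, feed the error string back on failure, and let the model recover.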

Python

pybind11 bindings expose the full C++ surface — TemporalCore, TemporalEngine, LLMToolAdapter, TemporalPersistence, plus the value types TemporalEvent, TimeWindow, AllenRelation, Explanation — with idiomatic Python ergonomics (snake_case methods, Python datetime for time points, dict for event data, None for optional arguments, list[dict] returned from the adapter).

Install from PyPI:

pip install galahad-temporal                  # or: pip install "galahad-temporal[anthropic]"
python -c "import galahad; print(galahad.__version__)"

Live at https://pypi.org/project/galahad-temporal/. Current release is 0.1.1 — the first install compiles from source (~1 minute) because prebuilt wheels via cibuildwheel are on the near-term roadmap. Or install straight from the repo:

git clone https://github.com/JacobFlorio/GALAHAD-TemporalSlicer
cd GALAHAD-TemporalSlicer
pip install .                     # or: pip install .[anthropic]
python -c "import galahad; print(galahad.__version__)"

pip install . uses scikit-build-core with pybind11 as a build-system dependency. It compiles a single extension module (galahad.*.so) and installs it directly — no PYTHONPATH dance, no test binaries pulled into the wheel, no bench harness compiled. The install step takes roughly a minute on a commodity machine (FetchContent pulls nlohmann/json once; pybind11 comes from the build env).

The optional [anthropic] extra installs the Anthropic SDK so examples/anthropic_demo.py runs end-to-end.

You can still build everything (C++ tests, bench, the Python module, adapter/persistence) from the repo without pip:

cmake -B build && cmake --build build -j
PYTHONPATH=build python3 python/test_galahad.py

Quick-start mirror of the C++ example above:

from datetime import datetime, timedelta, timezone
import galahad

core = galahad.TemporalCore()
engine = galahad.TemporalEngine(core)
t0 = datetime.now(timezone.utc)

def mk(id_, start, end, recorded, type_, data=None, links=None, branch="main"):
    e = galahad.TemporalEvent()
    e.id, e.valid_from, e.valid_to, e.recorded_at = id_, start, end, recorded
    e.type, e.data, e.causal_links, e.branch_id = type_, data or {}, links or [], branch
    return e

ms = lambda n: timedelta(milliseconds=n)
core.add_event(mk("perceive", t0,         t0+ms(5),  t0,        "perception", {"obs": "door_open"}))
core.add_event(mk("infer",    t0+ms(5),   t0+ms(15), t0+ms(5),  "inference",  {"why": "wind"}, ["perceive"]))
core.add_event(mk("decide",   t0+ms(15),  t0+ms(20), t0+ms(15), "decision",   {"choose": "close"}, ["infer"]))
core.add_event(mk("act",      t0+ms(20),  t0+ms(30), t0+ms(200),"action",     {"do": "close"}, ["decide"]))

# Why did the agent act?
why = engine.explain("act")
print([c.id for c in why.causes])   # ['perceive', 'infer', 'decide']

# What did the system believe at t0+100ms?
past = engine.explain("act", t0 + ms(100))
assert len(past.causes) == 0  # the action had not been recorded yet

# Drive the full agent surface via the LLM adapter from Python:
adapter = galahad.LLMToolAdapter(core, engine)
schemas = adapter.get_tool_schemas()          # list of dicts: register with any LLM framework
result = adapter.handle_tool_call("explain", {"id": "act"})
assert result["ok"] and len(result["result"]["causes"]) == 3

# Persist state across runs:
galahad.TemporalPersistence(core).save("state.gtp")
restored = galahad.TemporalCore()
galahad.TemporalPersistence(restored).load("state.gtp")

Lifetime is handled automatically — TemporalEngine, LLMToolAdapter, and TemporalPersistence each pin their backing TemporalCore (and each other where relevant) via py::keep_alive, so the Python garbage collector will not reclaim a core out from under a derived object. Pass a TemporalCore to a dozen wrappers and they will all keep it alive until the last one is dropped.

python/test_galahad.py exercises the full surface end-to-end: causal chain, bitemporal as-of replay, Allen relations, branching projections, refute + explicit-name override, LLM tool-call round-trip (including the nlohmann::json ↔ Python dict caster), and persistence save/load with state preservation. Run with PYTHONPATH=build python3 python/test_galahad.py.

Example: Claude reasoning through GALAHAD

examples/anthropic_demo.py wires the adapter straight into the Anthropic SDK. It pre-populates the canonical four-step agent trace (perceive → infer → decide → act, with a late-arriving action) and asks Claude two questions:

  1. Why did the action happen? Walk the causal chain.
  2. What would the system have believed at t0+100ms, when the action had not yet been recorded? Use as_of.

Claude is handed all 19 GALAHAD tools via get_tool_schemas() and runs an agent loop: for each tool_use in the response, the script calls adapter.handle_tool_call(name, args), serializes the JSON result back as a tool_result block, and continues until Claude emits end_turn. The second question specifically exercises the bitemporal distinction that makes GALAHAD structurally different from a plain event store — "what was true in the world" vs "what the system knew."

# Dry run: verifies the full GALAHAD + adapter + Python path
# without an API key (smoke-tests explain, as_of replay, find_related).
PYTHONPATH=build python3 examples/anthropic_demo.py

# Real run:
pip install anthropic
export ANTHROPIC_API_KEY=sk-...
PYTHONPATH=build python3 examples/anthropic_demo.py

The dry-run mode is the important part for CI and for anyone who wants to verify the demo works before spending tokens on it — no API key needed, prints the three smoke assertions, exits cleanly.

API surface

Mutation

void addEvent(TemporalEvent e);                  // ground truth
void addProjection(TemporalEvent e);             // hypothetical, non-main branch
void promoteBranch(const std::string& branch);   // projection -> ground truth
void refuteBranch(const std::string& branch);    // mark as falsified (non-destructive)
void pruneBranch(const std::string& branch);     // destructive removal

Query — bitemporal

std::optional<TemporalEvent> get(const std::string& id,
                                 std::optional<TimePoint> as_of = {},
                                 std::optional<std::string> branch = {}) const;

std::vector<TemporalEvent> queryRange(TimeWindow window,
                                      std::optional<TimePoint> as_of = {},
                                      std::optional<std::string> branch = {}) const;

Query — causal

std::vector<std::string> getCauses(const std::string& id, ...);      // direct parents
std::vector<std::string> getEffects(const std::string& id, ...);     // direct children
std::vector<std::string> getAncestors(const std::string& id, ...);   // transitive
std::vector<std::string> getDescendants(const std::string& id, ...); // transitive
bool hasCycle() const;

Query — Allen

AllenRelation getAllenRelation(const TemporalEvent& a, const TemporalEvent& b) const;
bool holds(AllenRelation r, const TemporalEvent& a, const TemporalEvent& b) const;
std::vector<std::string> findRelated(const std::string& id,
                                     AllenRelation r,
                                     std::optional<TimePoint> as_of = {},
                                     std::optional<std::string> branch = {}) const;
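For intuition, all 13 relations can be classified from interval endpoints alone. An illustrative Python sketch (names mirror the AllenRelation enum; this is not the C++ dispatch, which walks the time index rather than comparing pairwise):

```python
def allen_relation(a_start, a_end, b_start, b_end):
    """Classify the Allen relation of interval a with respect to b."""
    if a_end < b_start:   return "Precedes"
    if b_end < a_start:   return "PrecededBy"
    if a_end == b_start:  return "Meets"
    if b_end == a_start:  return "MetBy"
    if a_start == b_start and a_end == b_end:
        return "Equals"
    if a_start == b_start:
        return "Starts" if a_end < b_end else "StartedBy"
    if a_end == b_end:
        return "Finishes" if a_start > b_start else "FinishedBy"
    if b_start < a_start and a_end < b_end:
        return "During"
    if a_start < b_start and b_end < a_end:
        return "Contains"
    # Only proper partial overlap remains.
    return "Overlaps" if a_start < b_start else "OverlappedBy"
```

Every pair of intervals satisfies exactly one relation, which is what makes the 13 relations usable as mutually exclusive first-class predicates.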

Engine

struct Explanation {
    std::vector<TemporalEvent> causes;     // sorted by valid_from
    bool completed_before_target = true;
};

Explanation explain(const std::string& id,
                    std::optional<TimePoint> as_of = {},
                    bool require_completed_before = true,
                    std::optional<std::string> branch = {}) const;

std::vector<TemporalEvent> whatHappenedDuring(TimeWindow window,
                                              std::optional<TimePoint> as_of = {},
                                              std::optional<std::string> branch = {}) const;

Explanation explainWith(const std::string& target_id,
                        const TemporalEvent& mutation,
                        std::optional<TimePoint> as_of = {},
                        bool require_completed_before = true,
                        std::optional<std::string> branch = {}) const;

Counterfactuals

struct CounterfactualExplanation {
    std::string branch;
    TemporalEvent hypothetical_event;
    std::vector<TemporalEvent> would_have_been_causes;
};

TemporalCore TemporalCore::clone() const;
std::vector<CounterfactualExplanation>
    TemporalCore::whyNot(const std::string& id) const;

Current status

This is an early research core. It is not a database, not yet concurrent, and not yet benchmarked against production systems. It is a complete, correct, internally optimized skeleton that answers every question in the unified model.

Done:

  • Bitemporal event model with valid + transaction time
  • Causal DAG with cycle detection, transitive closure, auto-maintained forward + reverse indices
  • Allen's 13 interval relations + findRelated dispatched per relation family
  • Branching projections with full isolation
  • Branch lifecycle: promote (bitemporally honest), refute (non-destructive), prune (destructive). Main branch protected.
  • Monotonic transaction time clock
  • Engine layer with explain, whatHappenedDuring, bitemporal honesty, causal ordering, completed_before knob
  • uint32-interned event and branch ids — all hot-path comparisons are integer-only
  • Per-branch time indices for fast range and Allen queries on any single branch
  • Flat, sorted-vector event data field — one allocation per event's metadata
  • Lazy-rebuilt indices behind dirty flags; no stale-graph footguns
  • LLM tool-call adapter: 19 JSON tools, getToolSchemas + handleToolCall, ISO-8601 and int64 timestamp round-trip, Allen relations as snake_case strings, structured error envelopes. Vendor-neutral.
  • Persistence: single-file binary format, full save/load round-trip. Events, causal links, branches, refutations, and bitemporal ordering all survive the round-trip losslessly. Load bumps the monotonic clock so subsequent writes can't regress before loaded events.
  • Python bindings via pybind11: the full C++ surface exposed as idiomatic Python (snake_case methods, datetime time points, dict event data, automatic lifetime management via keep_alive). Includes an inline nlohmann::json ↔ Python dict/list type caster so the LLM adapter round-trips JSON naturally from Python.
  • Four C++ test binaries plus one Python test: correctness stress at 10k events, 100-branch isolation, adapter round-trip, persistence round-trip, and the full Python surface end-to-end
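The persistence round-trip guarantee above can be illustrated with a minimal length-prefixed binary format. This is a sketch only — the real .gtp format carries causal links, branches, refutations, and metadata as well:

```python
import io
import struct

def save(events, buf):
    """events: list of (id: str, valid_from, valid_to, recorded_at) tuples."""
    buf.write(struct.pack("<I", len(events)))            # event count
    for eid, vf, vt, ra in events:
        raw = eid.encode("utf-8")
        buf.write(struct.pack("<I", len(raw)))           # length-prefixed id
        buf.write(raw)
        buf.write(struct.pack("<qqq", vf, vt, ra))       # three int64 timestamps

def load(buf):
    (n,) = struct.unpack("<I", buf.read(4))
    out = []
    for _ in range(n):
        (ln,) = struct.unpack("<I", buf.read(4))
        eid = buf.read(ln).decode("utf-8")
        vf, vt, ra = struct.unpack("<qqq", buf.read(24))
        out.append((eid, vf, vt, ra))
    # Like the real loader, bump the clock floor past every loaded
    # recorded_at so subsequent writes cannot regress.
    clock_floor = max((e[3] for e in out), default=0) + 1
    return out, clock_floor

events = [("perceive", 0, 5, 0), ("act", 20, 30, 200)]
buf = io.BytesIO()
save(events, buf)
buf.seek(0)
restored, clock_floor = load(buf)
```

The clock-floor line is the part that preserves bitemporal ordering across a save/load cycle, mirroring the load-time clock bump described above.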

Not yet:

  • Incremental/append-only persistence (current v1 format writes full snapshots)
  • Concurrency (single-threaded)
  • Real benchmark harness with published numbers
  • Topological sort in explain (currently sorts by valid_from; equivalent for DAGs that respect causal-temporal ordering, not general)
  • Confidence propagation (stored, not composed)
  • Interval-tree time index (current sorted-vector is fine until bench says otherwise)
  • Nested/hierarchical branches
  • Non-Python language bindings (direct C API, Rust, etc.)

Benchmarks: bench/bench_temporal runs seven workloads end-to-end. See the Performance section below. Numbers are honest — single-threaded, single machine, Release build, sanity-checked on every run.

Build

cmake -B build
cmake --build build -j
./build/test_temporal
./build/test_engine
./build/test_adapter
./build/bench_temporal

The first cmake -B build pulls nlohmann/json via FetchContent (pinned, shallow). Subsequent configures are cache hits.

Requires CMake 3.20+ and a C++20 compiler. The project defaults to Release when you don't pick a build type; Debug still works with cmake -B build -DCMAKE_BUILD_TYPE=Debug.

Performance

Numbers from ./build/bench_temporal on a single-threaded commodity machine, Release build (-O3). The harness warms every index before measuring and sanity-checks every result. Reproduce with:

cmake -B build && cmake --build build -j && ./build/bench_temporal
Workload                               p50       p99       Notes
addEvent bulk (1M events)              749 ns    —         ~1.33M events/sec
get() over 100k events                 640 ns    1.2 µs
queryRange 100 µs window / 100k        92 µs     300 µs    ~100 events materialized per call
queryRange all 100 branches (100k)     12.4 ms   21.1 ms   full scan + materialize
queryRange one branch (1k of 100k)     33 µs     66 µs     ~380× faster — per-branch time index
explain() at causal depth 1000         135 µs    176 µs    1000-step ancestry with full reconstruction
findRelated(Overlaps) / 100k           89 µs     379 µs    family dispatch, walks only overlapping slice
findRelated(Meets) / 100k              119 µs    459 µs
findRelated(Precedes) / 100k           435 µs    647 µs    ~50k results, cost linear in result size
findRelated(PrecededBy) / 100k         437 µs    861 µs
save 100k events (binary v1)           25 ms     —         ~4M events/sec, 75 bytes/event
load 100k events (binary v1)           48 ms     —         ~2M events/sec, full reindex on load

What these numbers mean:

  • Branch isolation is not a filter, it is a structural skip. Asking "what happened on branch X" takes ~33 µs because the per-branch time index iterates only that branch's 1000 events. The same window across all 100 branches takes 12.4 ms because it materializes all 100k. This is the difference between "branch filter applied at query time" and "branch-scoped index consulted by name."
  • explain() at depth 1000 is 135 µs — ~135 ns per ancestor for full causal traversal plus materialization. An agent can answer "why did this happen" over a thousand-step chain faster than a network round-trip.
  • findRelated algorithmic cost scales with the result set, not the corpus. Selective relations (Meets, Overlaps, a handful of results) are an order of magnitude faster than scan-everything relations (Precedes, PrecededBy, tens of thousands of results). The time index dispatch walks only the relevant slice; final cost is dominated by resolveEventId per result.
  • Persistence v1 is full-snapshot only. Save is pure serialization (4M events/s); load is intern + reindex (2M events/s) because it runs every event through the normal addEvent path. An append-only incremental format is on the near-term roadmap.

The bench harness is single-threaded and the core is single-threaded; concurrent readers and a real interval-tree time index are not v0.1 goals. Nothing in this section is compared against other systems yet — see the "What makes GALAHAD unique" section for why comparative benchmarks against XTDB / PyReason / etc. are the wrong frame. Numbers here are ourselves-against-ourselves, and they will move as we land incremental persistence, per-call materialization opt-outs, and a better time index.

Design notes

  • Bitemporal first. Every event carries both world-time (valid_from/valid_to) and transaction-time (recorded_at) from day one. This is the single most important modeling choice in the project. Without it, causal reasoning leaks foreknowledge from the future and projections cannot be replayed honestly.
  • Unified query path. as_of, branch, and Allen relations all compose. You can ask "events during window W that are ancestors of X, on branch Y, as the system knew them at time T." No special-case code paths — every query method takes the same filters.
  • No auto-magic. Lifecycle operations are explicit: promoting one branch does not auto-refute its siblings. If the caller wants both, they make two calls. Predictable beats convenient.
  • Branch isolation by construction. Branch filters gate both traversal and inclusion in causal queries, and the per-branch time indices make branch-scoped reads structurally fast, not just filtered-fast. Two projected futures referencing the same main-branch cause cannot see each other, even though they share that cause.
  • Path B interning. Public API uses human-readable strings. Internal storage uses uint32 handles with string-pool round-trip on read. The public ergonomics are preserved, the hot path is pointer-equality fast, and future bindings (Python, FFI) will not need to re-invent the translation layer.
  • Lazy index materialization. Time indices, branch time indices, and causal graphs are all rebuilt on first read after mutation. Bulk inserts amortize to one rebuild across the batch. Callers never have to call buildCausalGraph() explicitly; it still exists for backward compatibility but is a no-op in the steady state.
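The lazy-index note above is a standard dirty-flag scheme. A minimal sketch (illustrative class, not the core's index code):

```python
class LazyIndex:
    """Rebuild the sorted time index on first read after any
    mutation; bulk inserts amortize to a single rebuild."""
    def __init__(self):
        self._events = []
        self._index = []
        self._dirty = False
        self.rebuilds = 0              # instrumentation for the sketch

    def add(self, eid, valid_from):
        self._events.append((valid_from, eid))
        self._dirty = True             # mutation is cheap: just mark stale

    def index(self):
        if self._dirty:                # rebuild only when actually read
            self._index = sorted(self._events)
            self._dirty = False
            self.rebuilds += 1
        return self._index

idx = LazyIndex()
for i in range(1000):
    idx.add(f"e{i}", 1000 - i)         # bulk insert: zero rebuilds so far
first = idx.index()[0]                 # one rebuild covers the whole batch
idx.index()                            # steady state: no further rebuilds
```

The payoff is exactly the one named above: callers never sequence an explicit rebuild call, and a thousand inserts cost one sort instead of a thousand.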

What makes GALAHAD unique

There are excellent projects in each of the adjacent categories:

  • Bitemporal DBs (XTDB, SirixDB, BarbelHisto) — great at as-of queries, but have no notion of causation or interval algebra, and no branching projections.
  • Causal / temporal reasoning libraries (PyReason, ETIA, tgLib) — great at graphs and relations, but are not bitemporal and treat the future as uniform with the past.
  • Allen interval algebra libraries — standalone implementations of the 13 relations, but without an event store or causal model.
  • Planning and simulation engines — handle branching futures, but as a separate concern from observation and belief update.

GALAHAD's goal is to be the one substrate where all of these compose. A query like "find every event that is a causal ancestor of X, happened during Y, was known at time T, and lives on branch B" is one line of code here and is not directly expressible in any single library in the categories above.
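As a plain-Python sketch of that composed query, with toy event dicts and a precomputed ancestor set standing in for the core's causal index (all names here are illustrative):

```python
def composed_query(events, ancestors_of_x, window, as_of, branch):
    """Ancestors of X, true during `window`, known by `as_of`, on `branch`."""
    lo, hi = window
    return [
        e["id"] for e in events
        if e["id"] in ancestors_of_x                     # causal ancestor of X
        and e["valid_from"] < hi and e["valid_to"] > lo  # overlaps window Y
        and e["recorded_at"] <= as_of                    # known at time T
        and e["branch_id"] == branch                     # lives on branch B
    ]

events = [
    {"id": "perceive", "valid_from": 0, "valid_to": 5,  "recorded_at": 0,   "branch_id": "main"},
    {"id": "infer",    "valid_from": 5, "valid_to": 15, "recorded_at": 5,   "branch_id": "main"},
    {"id": "late",     "valid_from": 6, "valid_to": 9,  "recorded_at": 999, "branch_id": "main"},
]
hits = composed_query(events, {"perceive", "infer", "late"}, (0, 10), as_of=100, branch="main")
```

The point is not the filtering itself but that all four constraints compose over one event store — in the adjacent-category libraries, at least one of these four predicates has no representation at all.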

Roadmap

Near term

  • Windows support: port the adapter's POSIX time APIs (strptime, gmtime_r, timegm) to MSVC equivalents so cibuildwheel can add Windows wheels to the release matrix
  • Incremental persistence: append-only log alongside the snapshot format, so long-lived agents don't have to rewrite the full state on every checkpoint
  • Topological sort in explain for general DAGs

Medium term

  • Confidence propagation through causal chains
  • PyPI release (source install works via pip install . today; prebuilt wheels for Linux/macOS/Windows via cibuildwheel next)
  • More framework-specific examples alongside examples/anthropic_demo.py (LangChain, OpenAI function-calling, tool-use harness for local models)

Longer term

  • Concurrent readers / writer-exclusion
  • Interval-tree time index for true logarithmic range queries on long-lived events
  • Hierarchical branches ("close_door.option_A" nested under "close_door")
  • First robotics integration — plug into an active agent loop as the belief-and-plan substrate

Release process

CI and wheel publishing are driven by two GitHub Actions workflows in .github/workflows/.

Continuous integration

ci.yml runs on every push to main and every pull request. It builds the full C++ tree (core, engine, adapter, persistence, bench, all four test binaries), runs the four C++ test binaries end-to-end, runs the bench as a smoke check, installs the package with pip install ., and runs the Python test suite against the installed module from a neutral working directory so import galahad hits site-packages and not the build tree. Matrix covers Python 3.10 / 3.11 / 3.12 on Ubuntu plus one run on macOS. Windows is intentionally skipped until the adapter's POSIX time APIs (strptime, gmtime_r, timegm) are ported to the MSVC CRT equivalents.

Cutting a release

  1. Bump the version in both pyproject.toml and python/galahad.cpp (the m.attr("__version__") line).
  2. Commit with a descriptive message (don't skip the Co-Authored-By trailer — it matches the rest of the project's history).
  3. Tag and push:
    git tag -a v0.1.2 -m "release v0.1.2"
    git push origin v0.1.2
    

Automated wheel building

The v* tag triggers wheels.yml, which:

  • Uses cibuildwheel to build CPython 3.8–3.12 wheels on two native runners: Linux manylinux_2_28 x86_64 (via gcc-toolset-14, new enough for C++20) and macOS arm64 (macos-14, Apple Silicon). Intel macOS is source-install only for now — the GitHub Actions macos-13 runner label was retired during v0.1.2, and Intel macOS users can still pip install galahad-temporal to compile from the sdist locally. When a verified Intel macOS runner label lands, it will be added back to the matrix.
  • Runs python/test_galahad.py against every built wheel via CIBW_TEST_COMMAND before the wheel leaves the runner. The timezone regression that cost us 0.1.1 would have been caught here.
  • Builds the sdist via python -m build --sdist on a separate runner so pip install galahad-temporal from source still works on any platform, including Windows, via user-side compilation.
  • Publishes every wheel plus the sdist to PyPI via trusted publishing — no API token stored in GitHub secrets.

One-time PyPI trusted-publishing setup

Before the first automated release, you need to tell PyPI to trust this workflow. This is a one-time click-through:

  1. Go to https://pypi.org/manage/project/galahad-temporal/settings/publishing/
  2. Add a new "Trusted publisher" with:
    • Owner: JacobFlorio
    • Repository name: GALAHAD-TemporalSlicer
    • Workflow name: wheels.yml
    • Environment name: pypi
  3. Save.

After that, every pushed v* tag triggers a full wheel build and publish without any manual intervention. Future releases become a single git tag -a vX.Y.Z -m … && git push --tags.

Until the trusted publisher is configured, the publish_to_pypi job will fail, but the build_wheels and build_sdist jobs still run and upload their artifacts to the GitHub Actions run page. You can download those manually and twine upload them if needed.

License

MIT. See LICENSE. Copyright (c) 2026 Jacob T. Florio.

Third-party dependency: nlohmann/json v3.11.3, pulled via CMake FetchContent, also MIT-licensed.

Project scope

This is a one-author research project. Contributions, discussion, and use cases are welcome but there is no formal contribution process yet. If you are building a system that needs temporal reasoning and this is interesting to you, open an issue describing the use case — the API is young enough that concrete workloads can still shape it.

Download files

Source Distribution

  • galahad_temporal-0.2.1.tar.gz (78.9 kB, Source)

Built Distributions

  • galahad_temporal-0.2.1-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (321.2 kB, CPython 3.12, manylinux glibc 2.27+/2.28+ x86-64)
  • galahad_temporal-0.2.1-cp312-cp312-macosx_11_0_arm64.whl (251.3 kB, CPython 3.12, macOS 11.0+ ARM64)
  • galahad_temporal-0.2.1-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (321.9 kB, CPython 3.11, manylinux glibc 2.27+/2.28+ x86-64)
  • galahad_temporal-0.2.1-cp311-cp311-macosx_11_0_arm64.whl (250.4 kB, CPython 3.11, macOS 11.0+ ARM64)
  • galahad_temporal-0.2.1-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (320.6 kB, CPython 3.10, manylinux glibc 2.27+/2.28+ x86-64)
  • galahad_temporal-0.2.1-cp310-cp310-macosx_11_0_arm64.whl (249.3 kB, CPython 3.10, macOS 11.0+ ARM64)
  • galahad_temporal-0.2.1-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (320.5 kB, CPython 3.9, manylinux glibc 2.27+/2.28+ x86-64)
  • galahad_temporal-0.2.1-cp39-cp39-macosx_11_0_arm64.whl (249.4 kB, CPython 3.9, macOS 11.0+ ARM64)
  • galahad_temporal-0.2.1-cp38-cp38-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (320.2 kB, CPython 3.8, manylinux glibc 2.27+/2.28+ x86-64)
  • galahad_temporal-0.2.1-cp38-cp38-macosx_11_0_arm64.whl (249.2 kB, CPython 3.8, macOS 11.0+ ARM64)

File details

Details for the file galahad_temporal-0.2.1.tar.gz.

File metadata

  • Download URL: galahad_temporal-0.2.1.tar.gz
  • Upload date:
  • Size: 78.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for galahad_temporal-0.2.1.tar.gz
Algorithm Hash digest
SHA256 a03a0ae03c41258e920f5781c54aa074a054e87db2059a7c16d98d28b3681b46
MD5 dce94184b6952b2201ca978c6e2fb317
BLAKE2b-256 44776b14e7a4ef78a115590bbc3c47387fc94db8374233d85b52b4190a3f7e51

See more details on using hashes here.

Provenance

The following attestation bundles were made for galahad_temporal-0.2.1.tar.gz:

Publisher: wheels.yml on JacobFlorio/GALAHAD-TemporalSlicer

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

