Tensor Logic: a named-index tensor language that unifies logic and tensor computation.

Project description

Tensor Logic

A vibe-coded implementation of Tensor Logic, the language proposed by Pedro Domingos.

Tensor Logic unifies neural and symbolic AI in a single, tiny core: a program is a set of tensor equations, where each right-hand side is a join (an implicit `einsum`) plus a projection (a sum over the indices that do not appear on the LHS) plus an optional nonlinearity.

```
pip install tensorlogic
```
This repository provides a lightweight Python framework with swappable backends (NumPy built in; PyTorch and JAX optional) behind a thin einsum-driven abstraction.
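For intuition, here is a minimal sketch of what such an einsum-driven backend shim can look like; the class and method names are illustrative, not the package's actual API:

```python
import numpy as np

# Hypothetical backend shim: every join + projection lowers to one einsum call.
class NumpyBackend:
    def einsum(self, spec, *operands):
        return np.einsum(spec, *operands)

    def step(self, x):
        return (x > 0).astype(x.dtype)  # Heaviside step nonlinearity

backend = NumpyBackend()
W = np.array([[2.0, -1.0], [0.3, 0.7]])
x = np.array([1.0, 3.0])
print(backend.step(backend.einsum("ij,j->i", W, x)))  # [0. 1.]
```

Swapping in `torch.einsum` or `jax.numpy.einsum` behind the same interface is what makes backends interchangeable.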
Highlights
- 🧮 Named indices: write equations with symbolic indices instead of raw axis numbers.
- ➕ Joins & projection: implicit
einsumto multiply tensors on shared indices and sum the rest. - 🧠 Neuro + Symbolic: includes helper utilities for relations (Datalog-like facts), attention, kernels, and small graphical models.
- 🔁 Forward chaining (fixpoint) and backward evaluation of queries.
- 🔌 Backends: `numpy` built in; `torch` and `jax` if installed.
- 🧪 Tests: cover each section of the paper with compact, didactic examples.
Learning / gradients are supported when the backend has autograd (Torch/JAX). With the NumPy backend, you can evaluate programs but not differentiate them.
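To see what the autograd backends enable, here is a standalone PyTorch sketch that differentiates through a join + projection with plain `torch.einsum` (it bypasses tensorlogic entirely):

```python
import torch

W = torch.tensor([[2.0, -1.0], [0.3, 0.7]], requires_grad=True)
x = torch.tensor([1.0, 3.0])

y = torch.einsum("ij,j->i", W, x)  # join over 'j', project onto 'i'
y.sum().backward()                 # gradients flow through the einsum

print(W.grad)  # d(sum y)/dW[i,j] = x[j] -> [[1., 3.], [1., 3.]]
```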
Quick peek
```python
from tensorlogic import Tensor

# Minimal tensor logic - just like writing math equations!
W = Tensor([[2., -1.], [0.3, 0.7]], ["i", "j"], name="W")  # 2x2 weights
X = Tensor([1., 3.], ["j"], name="X")                      # 2 inputs
Y = Tensor([0., 0.], ["i"], name="Y")                      # output

Y["i"] = (W["i", "j"] * X["j"]).step()  # einsum 'ij,j->i' + step
result = Y["i"].eval()                  # evaluate eagerly
print(result.indices, result.data)      # ('i',) [0. 1.]
```
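For comparison, the same computation in raw NumPy, which is what the default backend runs under the hood:

```python
import numpy as np

W = np.array([[2.0, -1.0], [0.3, 0.7]])
X = np.array([1.0, 3.0])

Y = (np.einsum("ij,j->i", W, X) > 0).astype(float)  # 'ij,j->i' then step
print(Y)  # [0. 1.] - matches the tensorlogic result above
```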
Write equations just like math
```python
from tensorlogic import Tensor
import numpy as np

# Kernel computation (squared dot-product kernel)
X = Tensor([[1., 2.], [3., 4.]], ["i", "j"], name="X")
K = Tensor(np.zeros((2, 2)), ["i", "i2"], name="K")
K["i", "i2"] = (X["i", "j"] * X["i2", "j"]) ** 2
print("Kernel:", K["i", "i2"].eval().numpy())
```
```python
from tensorlogic import Tensor
import numpy as np

# Attention mechanism (single pass, no intermediate numpy computation)
X = Tensor(np.array([[0.1, 0.2], [0.3, 0.4], [0.1, 0.8]]), ["p", "d"], name="X")
WQ = Tensor(np.eye(2), ["dk", "d"], name="WQ")
WK = Tensor(np.eye(2), ["dk", "d"], name="WK")
WV = Tensor(np.eye(2), ["dv", "d"], name="WV")
Query = Tensor(np.zeros((3, 2)), ["p", "dk"], name="Query")
Key = Tensor(np.zeros((3, 2)), ["p", "dk"], name="Key")
Val = Tensor(np.zeros((3, 2)), ["p", "dv"], name="Val")
Comp = Tensor(np.zeros((3, 3)), ["p", "p2"], name="Comp")
Attn = Tensor(np.zeros((3, 2)), ["p", "dv"], name="Attn")

Query["p", "dk"] = WQ["dk", "d"] * X["p", "d"]
Key["p", "dk"] = WK["dk", "d"] * X["p", "d"]
Val["p", "dv"] = WV["dv", "d"] * X["p", "d"]

# Raw attention scores via deferred evaluation (no intermediate numpy operations)
Comp["p", "p2"] = Query["p", "dk"] * Key["p2", "dk"]  # einsum over shared 'dk'
scores = Comp["p", "p2"].eval().numpy()
print("Raw scores:", scores)
```
This compiles to efficient backend einsum on NumPy / PyTorch / JAX.
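The snippet above stops at the raw comparison scores. Here is a hedged NumPy sketch of the remaining steps (scaling by `1/sqrt(dk)` and a softmax along the comparison axis `p2`, as the notes below describe), using the fact that `WQ`, `WK`, and `WV` are identity matrices in this example:

```python
import numpy as np

X = np.array([[0.1, 0.2], [0.3, 0.4], [0.1, 0.8]])
dk = X.shape[1]

scores = X @ X.T                               # Query = Key = X since WQ = WK = I
weights = np.exp(scores / np.sqrt(dk))         # scaled dot-product
weights /= weights.sum(axis=1, keepdims=True)  # softmax along the 'p2' axis
attn = weights @ X                             # Val = X since WV = I
print("Attention output:", attn)
```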
- Native symbolic / Datalog style via `Relation`:

```python
from tensorlogic import Domain, Relation

People = Domain(["Alice", "Bob", "Charlie"])
Parent = Relation("Parent", People, People)
Sister = Relation("Sister", People, People)
Aunt = Relation("Aunt", People, People)

Parent["Bob", "Charlie"] = 1  # facts
Sister["Alice", "Bob"] = 1
Aunt["x", "z"] = (Sister["x", "y"] * Parent["y", "z"]).step()  # rule
```
Facts are stored in the program as Boolean tensors; rules are equations (join + projection + step), and the final relation is the OR of facts and rules. Forward chaining iterates the rules to a fixpoint (see the sketch after this list).
- Learnable parameters through `Tensor`:

```python
import numpy as np
from tensorlogic import Tensor

# Data tensor
X = Tensor(np.random.randn(3, 5), ["i", "j"], name="X")

# Learnable parameter (Xavier init), marked learnable by default when no init is provided
W = Tensor(idxs=["o", "i"], sizes=[8, 5], name="W")

# Non-learnable parameter (explicit init)
C = Tensor(idxs=["i", "j"], sizes=[3, 5], name="C", init="zeros", learnable=False)
```
- Attention correctness in examples: scaled dot-product (`1/sqrt(dk)`), normalized along the comparison axis (`softmax(..., axis="p2")`).
- Examples updated: `examples/attention.py`, `examples/symbolic_aunt.py`.
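Forward chaining to a fixpoint (mentioned above for `Relation` rules) has a simple Boolean-tensor reading. A plain-NumPy sketch, not the package's API, iterating a transitive-closure rule `Ancestor(x,z) <- Ancestor(x,y), Parent(y,z)` until nothing changes:

```python
import numpy as np

parent = np.zeros((3, 3))        # indices: Alice=0, Bob=1, Charlie=2
parent[0, 1] = parent[1, 2] = 1  # Parent(Alice,Bob), Parent(Bob,Charlie)

ancestor = parent.copy()  # facts seed the relation
while True:
    # join on 'y', project onto (x, z), step back to Boolean, OR with known facts
    derived = (np.einsum("xy,yz->xz", ancestor, parent) > 0).astype(float)
    new = np.maximum(ancestor, derived)
    if np.array_equal(new, ancestor):
        break  # fixpoint reached: no rule adds a new fact
    ancestor = new

print(ancestor)  # Ancestor(Alice,Charlie) has been derived
```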
Development

The repository is under active development. Run the test suite with:

```
uv run pytest
```
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file tensorlogic-0.0.4.tar.gz.
File metadata
- Download URL: tensorlogic-0.0.4.tar.gz
- Upload date:
- Size: 27.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `98d5b09f68484afcb8c9857c876d4c0878785edcce0a5fe995902cd960e2eac0` |
| MD5 | `9bea09583b73ca9adc5ba7c8fa7ef769` |
| BLAKE2b-256 | `bf7563a1e3094992dbdfaa75c34a7161ef972ec9fefeab7e5412ae50edcb3127` |
File details
Details for the file tensorlogic-0.0.4-py3-none-any.whl.
File metadata
- Download URL: tensorlogic-0.0.4-py3-none-any.whl
- Upload date:
- Size: 24.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `85c0375db545570ab6c8be2ad040f270ffd20508e58f823ba2a74a9f232c9deb` |
| MD5 | `3427ec5854b7594bbc3832d6cb6336df` |
| BLAKE2b-256 | `0ffddf784c3b2483ba75d412849052049164d8e7ac6ddaa311b2a70c5578b093` |