mlx-transformer-vm

MLX Python port of Percepta's transformer-vm.
mlx-transformer-vm is a standalone Python port of Percepta's transformer-vm, with the compiler stack kept in Python and the eventual model runtime targeted at MLX. The current focus is semantic parity for the computation graph, the exact evaluator, and WASM machine construction.
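To make the "exact evaluator" goal concrete, here is a toy, self-contained sketch of the idea: a tiny expression graph evaluated with exact rational arithmetic. None of these names (`Node`, `const`, `evaluate`) are the project's actual API; the real graph DSL and evaluator live in `mlx_transformer_vm/graph/core.py` and `mlx_transformer_vm/evaluator.py`.

```python
# Illustrative sketch only -- not the mlx-transformer-vm API.
from dataclasses import dataclass
from fractions import Fraction


@dataclass(frozen=True)
class Node:
    op: str                        # "const", "add", or "mul"
    args: tuple = ()               # child nodes for non-const ops
    value: Fraction | None = None  # payload for const nodes


def const(v) -> Node:
    return Node("const", value=Fraction(v))


def add(a: Node, b: Node) -> Node:
    return Node("add", (a, b))


def mul(a: Node, b: Node) -> Node:
    return Node("mul", (a, b))


def evaluate(node: Node) -> Fraction:
    """Exact evaluation: rational arithmetic, no floating-point rounding."""
    if node.op == "const":
        return node.value
    a, b = (evaluate(arg) for arg in node.args)
    return a + b if node.op == "add" else a * b


# 1/3 + 2 * 1/6 == 2/3, exactly.
print(evaluate(add(const("1/3"), mul(const(2), const("1/6")))))
```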
Status
Ported now:
- graph DSL in `mlx_transformer_vm/graph/core.py`
- exact graph evaluator in `mlx_transformer_vm/evaluator.py`
- WASM machine graph builder in `mlx_transformer_vm/wasm/interpreter.py`
- reference WASM interpreter in `mlx_transformer_vm/wasm/reference.py`
- upstream parity harness in `mlx_transformer_vm/parity.py`
Still missing:
- scheduler/allocation parity
- analytical weight construction
- MLX transformer runtime
- standard and hull KV caches
- specialization path
- full CLI parity for build/run/compile/specialize
Upstream Mapping
The port stays structurally close to upstream:
| Upstream | This repo |
|---|---|
| `transformer_vm/graph/core.py` | `mlx_transformer_vm/graph/core.py` |
| `transformer_vm/evaluator.py` | `mlx_transformer_vm/evaluator.py` |
| `transformer_vm/wasm/interpreter.py` | `mlx_transformer_vm/wasm/interpreter.py` |
| `transformer_vm/wasm/reference.py` | `mlx_transformer_vm/wasm/reference.py` |
| `transformer_vm/model/weights.py` | `mlx_transformer_vm/model/weights.py` (planned) |
| `transformer_vm/model/transformer.py` | `mlx_transformer_vm/model/transformer.py` (planned) |
| `transformer_vm/scheduler/milp.py` | `mlx_transformer_vm/scheduler/` (planned) |
| `transformer_vm/compilation/*` | `mlx_transformer_vm/compilation/` (planned) |
Development
Install dependencies:

`uv sync --extra dev`

Run the fast unit tests:

`uv run pytest mlx_transformer_vm/tests/test_graph_core.py`

Run the parity tests against the upstream repository:

`uv run pytest mlx_transformer_vm/tests/test_parity.py`

Run the evaluator on a compiled token program:

`uv run wasm-eval-mlx /path/to/program.txt`

Diff this port against the upstream examples:

`uv run tvm-mlx-parity --examples hello addition collatz fibonacci`
The parity harness expects the upstream repository at `/Users/tmc/go/src/github.com/Percepta-Core/transformer-vm`. Set `TRANSFORMER_VM_UPSTREAM_ROOT` to override that path.
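For example, to run the parity diff against a checkout somewhere else (the path below is a placeholder, not a real location):

`TRANSFORMER_VM_UPSTREAM_ROOT=/path/to/transformer-vm uv run tvm-mlx-parity --examples hello`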
Download files
File details
Details for the file mlx_transformer_vm-0.1.0.tar.gz.
File metadata
- Download URL: mlx_transformer_vm-0.1.0.tar.gz
- Upload date:
- Size: 19.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.14
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 5bf894e17efee9652e5e890a4eff9afa3700f59dc8c230ab4ca2a852a145c898 |
| MD5 | bcd883b52a88b8e83d7c25d4900bbb50 |
| BLAKE2b-256 | 55c1b143e7ef0602c7628d71135c2c1fba7f572c6a6835589c4648e1bba6f689 |
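If you want to verify a downloaded archive against the SHA256 digest above, a minimal stdlib check looks like this (filename and digest copied from this table):

```python
import hashlib
from pathlib import Path

# SHA256 digest published above for the source distribution.
EXPECTED = "5bf894e17efee9652e5e890a4eff9afa3700f59dc8c230ab4ca2a852a145c898"

digest = hashlib.sha256(Path("mlx_transformer_vm-0.1.0.tar.gz").read_bytes()).hexdigest()
assert digest == EXPECTED, f"hash mismatch: {digest}"
```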
File details
Details for the file mlx_transformer_vm-0.1.0-py3-none-any.whl.
File metadata
- Download URL: mlx_transformer_vm-0.1.0-py3-none-any.whl
- Upload date:
- Size: 23.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.14
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 43659f57b8ddd3f2c38d4b541d778364abc365b4c7c951f4d34e869ae9b9cefd |
| MD5 | 025b39d416454dba7db69a981134807b |
| BLAKE2b-256 | 20a7c6f70561c6dd0b19e27b94f627e86e2aa7f5a3b2307d8904c318b0951d05 |