Reverse engineer transformer models.

Experiments with modular neural networks by patching pre-trained LLMs.

Original Model

All experiments use the open-source model qwen/qwen3.5-9b.

Relevant configuration:

  • hidden size: 4096
  • number of layers: 32
  • vocabulary size: 248320
  • embedding size: 4096
  • positional encoding: rotary
  • embedding weights are not tied with the output head

The embedding layer and output head each contain approximately:

4096 × 248320 ~ 1.02B parameters
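The count above is easy to verify:

```python
hidden_size = 4096
vocab_size = 248320

params = hidden_size * vocab_size
print(f"{params:,}")  # 1,017,118,720, i.e. roughly 1.02B parameters
```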

Patching Layers

Composite Embedding (Prefix Patch)

Frozen components

  • tokenization (Qwen BPE tokenizer)
  • transformer trunk (all the hidden transformer layers)
  • positional encoding
  • output head

Replaced component

The original token embedding Embedding(V=248320, D=4096) is replaced with a composite embedding layer with:

  • a group dimension of 32
  • an input dimension of 256 (byte values)
  • an embedding dimension of 128
  • for a total of 256 * 128 = 32768 parameters

A byte tensor of shape (B, 32 * S) is thus embedded into a tensor of shape (B, S, 32 * 128) = (B, S, 4096).

A regular transformer block then maps these composite embeddings onto the original embeddings of the Qwen model.
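The reshape logic behind the composite embedding can be sketched as follows (a minimal NumPy illustration with hypothetical names; the actual patch is a trained module, and the table below is random):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical embedding table: 256 byte values x 128 dims = 32768 parameters.
table = rng.normal(size=(256, 128)).astype(np.float32)

def composite_embed(byte_ids, group=32):
    """Map a (B, group * S) array of byte values to (B, S, group * 128)."""
    batch, length = byte_ids.shape
    seq = length // group
    embedded = table[byte_ids]                        # (B, group * S, 128)
    return embedded.reshape(batch, seq, group * 128)  # (B, S, 4096)
```

Each block of 32 byte embeddings is concatenated into a single 4096-dimensional vector, matching Qwen's hidden size.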

Input representation

Input text is tokenized using the original Qwen tokenizer.

Each token string is encoded as UTF-8 bytes.

Each token's byte sequence is then padded to a fixed block of 32 bytes; longer sequences are truncated to 32 bytes.
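A sketch of this encoding step, with a hypothetical helper name and an assumed padding byte of 0:

```python
def token_to_bytes(token, block=32, pad=0):
    # Encode the token string as UTF-8, truncate to the block size,
    # and right-pad shorter sequences with the padding byte.
    raw = token.encode("utf-8")[:block]
    return list(raw) + [pad] * (block - len(raw))
```

Note that truncation can cut a multi-byte codepoint in the middle; how the original patch handles that case is not specified here.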

Patch Training

Training was performed on a multilingual corpus with a custom loss:

$$L_{k} = \| H_{k, patch}(x) - H_{k, qwen}(x) \|^{2}$$

Where:

  • $k$ is the depth inside the original Qwen 3.5 model
  • $H_{k, qwen}$ is the hidden state at depth $k$ in the original model
  • $H_{k, patch}$ is the hidden state at depth $k$ when the original embedding layer is replaced with the patch
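Under these definitions, the per-depth loss is just a squared L2 distance between hidden states. A minimal NumPy sketch:

```python
import numpy as np

def layer_matching_loss(h_patch, h_qwen):
    # L_k = || H_{k,patch}(x) - H_{k,qwen}(x) ||^2, summed over all elements
    diff = h_patch - h_qwen
    return float(np.sum(diff * diff))
```

This is a distillation-style objective: the patched model is trained to reproduce the frozen trunk's activations rather than to predict tokens directly.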

Hierarchical Softmax Head

Frozen components

  • tokenizer
  • embedding layer
  • positional encoding
  • transformer trunk (all the hidden layers)

Replaced component

The original output head Linear(4096 => 248320) is replaced by a hierarchical softmax tree.

Tokens are organized in a binary tree of depth 18 = ⌈log_2(248320)⌉ (since 2^18 = 262144 ≥ 248320).

Each token corresponds to a unique path from root to leaf.
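Assuming tokens are laid out so that a token's id is its leaf index (one possible arrangement; the actual tree construction is not specified), the root-to-leaf path is simply the id's binary expansion:

```python
DEPTH = 18  # smallest depth with 2**18 = 262144 >= 248320 leaves

def token_path(token_id, depth=DEPTH):
    # Most-significant bit first: 0 = go left, 1 = go right at each node.
    return [(token_id >> (depth - 1 - i)) & 1 for i in range(depth)]
```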

Patch Training

Here, a standard cross-entropy loss over the binary decisions along each token's path is sufficient.
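With a hierarchical softmax, the cross entropy decomposes into one binary decision per tree node along the target path. A sketch of the per-path negative log-likelihood (hypothetical names; the real head computes node logits from the 4096-dim hidden state):

```python
import math

def path_nll(node_logits, path_bits):
    # One logit per node on the root-to-leaf path; sigmoid gives P(go right).
    nll = 0.0
    for logit, bit in zip(node_logits, path_bits):
        p_right = 1.0 / (1.0 + math.exp(-logit))
        p = p_right if bit == 1 else 1.0 - p_right
        nll -= math.log(p)
    return nll
```

A maximally uncertain model (logit 0 at every node) assigns each of the 18 decisions probability 0.5, giving a NLL of 18 · log 2 ≈ 12.5 nats, versus evaluating all 248320 output logits for a flat softmax.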

License

Licensed under the AGPLv3.
