
Reverse engineer transformer models.

Project description


Experiments with modular neural networks by patching pre-trained LLMs.

Original Model

All experiments use the open-source model qwen/qwen3.5-9b.

Relevant configuration:

  • hidden size: 4096
  • number of layers: 32
  • vocabulary size: 248320
  • embedding size: 4096
  • positional encoding: rotary
  • embedding weights are not tied with the output head

The embedding layer and output head each contain approximately:

4096 × 248320 ≈ 1.02B parameters
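The figure above is quick to verify:

```python
# Arithmetic check of the parameter counts quoted above.
hidden_size = 4096
vocab_size = 248320

params_per_matrix = hidden_size * vocab_size
print(params_per_matrix)  # 1017118720, i.e. ~1.02B parameters per matrix
```

Since the embedding weights are not tied with the output head, the two matrices together account for roughly 2.03B parameters.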

Patching Layers

Composite Embedding (Prefix Patch)

Frozen components

  • tokenization (Qwen BPE tokenizer)
  • transformer trunk (all the hidden transformer layers)
  • positional encoding
  • output head

Replaced component

The original token embedding Embedding(V=248320, D=4096) is replaced with a composite embedding layer with:

  • a group dimension of 32
  • an input dimension of 256 (byte values)
  • an embedding dimension of 128
  • for a total of 256 * 128 = 32768 parameters

A tensor of shape (B, 32 * S) is thus processed into (B, S, 32 * 128) = (B, S, 4096).
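As a minimal sketch of this lookup and regrouping (NumPy for clarity; the names and the single shared byte table are assumptions, not the project's actual API):

```python
import numpy as np

GROUP, BYTE_VALUES, EMBED_DIM = 32, 256, 128

rng = np.random.default_rng(0)
table = rng.normal(size=(BYTE_VALUES, EMBED_DIM))  # 256 * 128 = 32768 parameters

def embed_bytes(byte_ids: np.ndarray) -> np.ndarray:
    """Map (B, GROUP * S) byte ids to (B, S, GROUP * EMBED_DIM) embeddings."""
    B, flat = byte_ids.shape
    S = flat // GROUP
    vecs = table[byte_ids]                        # (B, GROUP * S, 128)
    return vecs.reshape(B, S, GROUP * EMBED_DIM)  # (B, S, 4096)

x = rng.integers(0, 256, size=(2, 32 * 5))  # batch of 2, sequence of 5 tokens
print(embed_bytes(x).shape)                 # (2, 5, 4096)
```

The output dimension matches the hidden size of the frozen trunk, which is what allows the patch to be dropped in place of the original embedding layer.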

A regular transformer block then maps these composite embeddings onto the original embeddings of the Qwen model.

Input representation

Input text is tokenized using the original Qwen tokenizer.

Each token string is encoded as UTF-8 bytes.

Each byte sequence is then padded to a fixed block of 32 bytes; longer sequences are truncated.
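The byte encoding above can be sketched in a few lines (the pad value 0 is an assumption; the project may reserve a different padding byte):

```python
BLOCK = 32  # fixed byte block per token

def token_to_bytes(token: str, pad: int = 0) -> list[int]:
    """UTF-8 encode a token string, then pad or truncate to BLOCK bytes."""
    raw = list(token.encode("utf-8"))
    return (raw + [pad] * BLOCK)[:BLOCK]

print(token_to_bytes("Hello")[:6])       # [72, 101, 108, 108, 111, 0]
print(len(token_to_bytes("a" * 100)))    # 32
```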

Patch Training

The training was performed on a multilingual corpus with a custom loss:

$$L_{k} = \| H_{k, patch}(x) - H_{k, qwen}(x) \|^{2}$$

Where:

  • $k$ is the depth inside the original Qwen 3.5 model
  • $H_{k, qwen}$ is the hidden state at depth $k$ in the original model
  • $H_{k, patch}$ is the hidden state obtained when replacing the embedding layer
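This loss can be sketched directly (NumPy for clarity; the function name is illustrative):

```python
import numpy as np

def hidden_state_loss(h_patch: np.ndarray, h_qwen: np.ndarray) -> float:
    """L_k = ||H_{k,patch}(x) - H_{k,qwen}(x)||^2, summed over all elements."""
    return float(((h_patch - h_qwen) ** 2).sum())

h_qwen = np.ones((2, 5, 4096))
print(hidden_state_loss(h_qwen, h_qwen))  # 0.0 when the patch matches exactly
```

Matching hidden states at intermediate depths, rather than only the final logits, keeps the patched embeddings compatible with the frozen trunk at every layer.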

Hierarchical Softmax Head

Frozen components

  • tokenizer
  • embedding layer
  • positional encoding
  • transformer trunk (all the hidden layers)

Replaced component

The original output head Linear(4096 => 248320) is replaced by a hierarchical softmax tree.

Tokens are organized in a binary tree of depth 18 = ⌈log_2(248320)⌉.

Each token corresponds to a unique path from root to leaf.
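One simple way to assign such paths is to read each token id's bits from most significant to least significant; this is a hypothetical layout (the project's actual tree construction may differ):

```python
import math

VOCAB = 248320
DEPTH = math.ceil(math.log2(VOCAB))  # 18 levels of left/right decisions

def token_path(token_id: int, depth: int = DEPTH) -> list[int]:
    """Left(0)/right(1) decisions from the root to the leaf holding token_id."""
    return [(token_id >> (depth - 1 - level)) & 1 for level in range(depth)]

print(DEPTH)               # 18
print(token_path(5)[-3:])  # [1, 0, 1] — the binary digits of 5
```

Each prediction then costs 18 binary decisions instead of one softmax over 248320 logits.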

Patch Training

Here, a standard cross-entropy loss over the binary decisions along each token's path is sufficient.
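With a hierarchical softmax, the cross entropy of a token factors into one binary decision per tree level, one logit per internal node. A minimal sketch (names assumed):

```python
import math

def path_cross_entropy(logits: list[float], path: list[int]) -> float:
    """Sum of binary cross entropies along the token's root-to-leaf path."""
    loss = 0.0
    for s, bit in zip(logits, path):
        p_right = 1.0 / (1.0 + math.exp(-s))  # sigmoid of the node's logit
        loss -= math.log(p_right if bit == 1 else 1.0 - p_right)
    return loss

print(path_cross_entropy([0.0, 0.0], [1, 0]))  # 2 * log(2): maximal uncertainty
```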

License

Licensed under the AGPLv3.

Project details


Download files

Download the file for your platform.

Source Distribution

deformers-0.4.0.tar.gz (20.9 kB)

Uploaded Source

Built Distribution


deformers-0.4.0-py3-none-any.whl (24.0 kB)

Uploaded Python 3

File details

Details for the file deformers-0.4.0.tar.gz.

File metadata

  • Download URL: deformers-0.4.0.tar.gz
  • Size: 20.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.11.7 (publish, on Arch Linux)

File hashes

Hashes for deformers-0.4.0.tar.gz
  • SHA256: 806a2aeb4d14bf0702feafe5df0c84211d19d3f89177ec6006722f34f71bf9bb
  • MD5: b7b642328a624eef4faec5c1ce23437b
  • BLAKE2b-256: a01002d0393f88ed2062e6110e7c4bf03e1955b3f201e35d86fa8043ced30ba4


File details

Details for the file deformers-0.4.0-py3-none-any.whl.

File metadata

  • Download URL: deformers-0.4.0-py3-none-any.whl
  • Size: 24.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.11.7 (publish, on Arch Linux)

File hashes

Hashes for deformers-0.4.0-py3-none-any.whl
  • SHA256: 15433879a08b3bcfa4fbd04d481e43e1f7c409163a0380e8da6bfcaed5d907d6
  • MD5: cb3b4f65a50a538fd576b55c6a7a20d6
  • BLAKE2b-256: 1cf20d7fa1f39b366acaa64133b5c427bf410b146c615d7d11435d70b347f8ef

