# OMGFormers

Parallel Diffusion Language Model — 66 features
## Install

```bash
pip install omgformers
```
## What is OMGFormers?
OMGFormers is a research-grade PyTorch library for building Parallel Diffusion Language Models. It provides a modular, composable set of building blocks covering:
- Attention mechanisms — GQA, MLA, Sliding Window, Linear, Block-Sparse, Flash Attention 2, RoPE variants (YaRN, NTK, LongRoPE), ALiBi, T5 relative bias
- Feed-forward layers — SwiGLU, GeGLU, ReGLU, Standard FFN
- Mixture of Experts — Dense MoE, Soft MoE, load-balancing loss
- Diffusion — Mask scheduler, Parallel decoder for masked diffusion LM training
- LoRA / DoRA — Parameter-efficient fine-tuning adapters with merge/save/load
- Training utilities — EMA, Lion optimizer, warm-up cosine schedules, FSDP, gradient checkpointing, checkpoint manager
- Tokenizer — HuggingFace-compatible tokenizer with char-level fallback, special token management, encode/decode batch, mask_tokens for diffusion
- Advanced — KV cache, multi-token prediction, model merging (SLERP, DARE, TIES), reward model, PPO, int8/int4 quantization, GGUF export, RAG context injection, dynamic batching, chunked long-doc attention
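To make the model-merging entry above concrete, here is a minimal, self-contained sketch of SLERP (spherical linear interpolation) over two flattened weight vectors. The function name and pure-Python lists are illustrative only and are not the library's API; a real implementation would operate on per-tensor state dicts.

```python
import math

def slerp(v0, v1, t):
    """Spherical linear interpolation between two weight vectors.

    Interpolates along the great circle between v0 and v1; falls back to
    plain linear interpolation when the vectors are nearly colinear.
    """
    dot = sum(a * b for a, b in zip(v0, v1))
    n0 = math.sqrt(sum(a * a for a in v0))
    n1 = math.sqrt(sum(b * b for b in v1))
    cos = max(-1.0, min(1.0, dot / (n0 * n1)))
    theta = math.acos(cos)
    if theta < 1e-6:  # nearly parallel: lerp is numerically safer
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

Unlike plain averaging, SLERP preserves the norm of unit-length directions, which is why it is a popular choice for merging checkpoints with similar geometry.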
## Quick Start
```python
import torch
from omgformers import OMGConfig, OMGModel, create_base_model, OMGTokenizer

# Build a small model
cfg = OMGConfig(
    vocab_size=32000,
    hidden_size=512,
    num_layers=6,
    num_heads=8,
)
model = OMGModel(cfg)

# Or use the fast initializer
model, cfg = create_base_model(hidden_size=512, num_layers=6)

# Tokenizer
tok = OMGTokenizer.from_pretrained("gpt2")  # or char-level fallback
ids = tok.encode("Hello, world!")
print(tok.decode(ids))
```
## Fine-tuning

```python
from omgformers import FineTuneConfig, FineTuner

ft_cfg = FineTuneConfig(method="lora", lora_rank=16, steps=1000)
tuner = FineTuner(model, tokenizer=tok, config=ft_cfg)
tuner.train(train_dataloader)
```
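The training utilities listed above include warm-up cosine schedules. As a rough sketch of how such a schedule behaves over the `steps` budget, the following standalone function ramps the learning rate linearly and then decays it with a cosine curve; the name and signature are hypothetical, not OMGFormers' API.

```python
import math

def warmup_cosine_lr(step, max_lr, warmup_steps, total_steps, min_lr=0.0):
    """Linear warm-up to max_lr, then cosine decay toward min_lr."""
    if step < warmup_steps:
        # Ramp from max_lr/warmup_steps up to max_lr over the warm-up phase.
        return max_lr * (step + 1) / warmup_steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + 0.5 * (max_lr - min_lr) * (1 + math.cos(math.pi * min(1.0, progress)))
```

At the end of warm-up the schedule sits at `max_lr`, reaches the midpoint of the decay halfway through the remaining steps, and lands on `min_lr` at `total_steps`.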
## LoRA

```python
from omgformers import add_lora, merge_lora, save_lora, LoRAConfig

lora_cfg = LoRAConfig(rank=16, alpha=32, target_modules=["q_proj", "v_proj"])
model = add_lora(model, lora_cfg)
# ... train ...
model = merge_lora(model)
save_lora(model, "my_lora_weights/")
```
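For intuition about what `merge_lora` computes, the standard LoRA merge folds the low-rank update into the base weight as `W' = W + (alpha / rank) * B @ A`. The tiny pure-Python sketch below (hypothetical helper, not the library's internals) makes that arithmetic explicit:

```python
def merge_lora_weight(W, A, B, rank, alpha):
    """Fold a trained LoRA adapter into the base weight: W' = W + (alpha/rank) * B @ A.

    W is (out, in), B is (out, rank), A is (rank, in); plain nested lists here.
    """
    scale = alpha / rank
    rows, cols = len(W), len(W[0])
    delta = [[scale * sum(B[i][r] * A[r][j] for r in range(rank))
              for j in range(cols)] for i in range(rows)]
    return [[W[i][j] + delta[i][j] for j in range(cols)] for i in range(rows)]
```

After the merge the adapter adds zero inference overhead, which is why merging is the usual final step before export.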
## Mixture of Experts

```python
from omgformers import MoEConfig, OMGConfig

cfg = OMGConfig(
    hidden_size=1024,
    num_layers=12,
    moe=MoEConfig(num_experts=8, top_k=2, aux_loss_coeff=0.01),
)
```
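To see what `aux_loss_coeff` scales, here is a self-contained sketch of a Switch-style load-balancing loss: the product of the fraction of tokens routed to each expert and the mean router probability for that expert, summed over experts. The function is illustrative only; OMGFormers' actual routing code may differ.

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def load_balancing_loss(router_logits, top_k, aux_loss_coeff):
    """Aux loss = coeff * num_experts * sum_e f_e * p_e, where f_e is the
    fraction of routing slots assigned to expert e and p_e its mean router
    probability. Minimized when routing is perfectly balanced."""
    num_tokens = len(router_logits)
    num_experts = len(router_logits[0])
    probs = [softmax(row) for row in router_logits]
    counts = [0] * num_experts
    for row in probs:
        top = sorted(range(num_experts), key=lambda e: row[e], reverse=True)[:top_k]
        for e in top:
            counts[e] += 1
    f = [c / (num_tokens * top_k) for c in counts]
    p = [sum(row[e] for row in probs) / num_tokens for e in range(num_experts)]
    return aux_loss_coeff * num_experts * sum(fi * pi for fi, pi in zip(f, p))
```

With perfectly uniform routing the loss equals `aux_loss_coeff`; routing collapse onto a few experts pushes it higher, which is the pressure that keeps expert utilization balanced.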
## What's New in v2.0.6-preview
| Feature | Description |
|---|---|
| #53 | Fast base model initialization (create_base_model) |
| #54 | Fine-tuning engine (FineTuner) |
| #55 | Resume from checkpoint (Trainer.resume_from_checkpoint) |
| #56 | Checkpoint manager (CheckpointManager) |
| #57 | Flash Attention 2 real implementation (flash_attention_forward) |
| #58 | MoE → OMGConfig full integration (MoEConfig) |
| #61 | OMGTokenizer (HF + char-level fallback) |
| #62 | Special token management |
| #63 | Tokenizer save/load |
| #64 | Tokenizer from_pretrained |
| #65 | encode_batch / decode_batch |
| #66 | mask_tokens for diffusion training |
Bug fixes: #T5, #T6, #T7, #Mo1–Mo4, #A1–A5, #M1–M3, #C1–C3
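Feature #66, `mask_tokens`, is the core data-preparation step for masked diffusion training: corrupt a fraction of tokens and keep the originals as labels. A minimal standalone sketch of the idea (the `MASK_ID` value, signature, and `-100` ignore-index convention are assumptions, not the library's documented API):

```python
import random

MASK_ID = 4  # hypothetical [MASK] token id

def mask_tokens(ids, mask_ratio, seed=None):
    """Replace a random subset of token ids with MASK_ID.

    Returns (masked_ids, labels): labels hold the original id at masked
    positions and -100 (the usual loss ignore index) elsewhere.
    """
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in ids:
        if rng.random() < mask_ratio:
            masked.append(MASK_ID)
            labels.append(tok)
        else:
            masked.append(tok)
            labels.append(-100)
    return masked, labels
```

In diffusion training the mask scheduler varies `mask_ratio` per step, so the model learns to denoise sequences at every corruption level.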
## Requirements

- Python ≥ 3.9
- PyTorch ≥ 2.0
- Optional: `transformers`, `safetensors`, `flash-attn`, `bitsandbytes`
## License

Apache License 2.0 — see LICENSE.