
OMG-hybridOMGa ⚡🧠🔮♻️

Ultimate Hybrid LoRA Suite — LoRA, DoRA, QLoRA, LoRA+, rsLoRA, and OMGa, combined with a full training engine.



Installation

# Base install (torch + transformers + accelerate only)
pip install omg-hybridomga

# Full install (includes bitsandbytes, peft, trl, datasets)
pip install "omg-hybridomga[full]"

# Everything included
pip install "omg-hybridomga[all]"

Features

LoRA Layers

Method | Description | Source
LoRA | Standard Low-Rank Adaptation | Hu et al., 2021
DoRA | Weight-Decomposed LoRA | Liu et al., 2024
QLoRA | 4/8-bit quantized base weights | Dettmers et al., 2023
LoRA+ | Separate learning rates for the A and B matrices | Hayou et al., 2024
rsLoRA | Rank-Stabilized LoRA | Kalajdzievski, 2023
OMGa ★ | OMG Adaptive LoRA: per-token gate, dual-rank | (this package)
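
The OMGa row is the package's own method and its internals are not documented above. As a rough illustration only, a per-token gated, dual-rank LoRA layer could look like the following; every name and the gating scheme here are assumptions, not the package's actual implementation:

```python
import torch
import torch.nn as nn

class GatedDualRankLoRA(nn.Module):
    """Hypothetical sketch: a LoRA update at two ranks, mixed by a per-token gate."""

    def __init__(self, in_features, out_features, rank_lo=4, rank_hi=16, alpha=32):
        super().__init__()
        self.scale = alpha / rank_hi
        # Two low-rank adapter pairs (A projects down, B projects back up)
        self.A_lo = nn.Linear(in_features, rank_lo, bias=False)
        self.B_lo = nn.Linear(rank_lo, out_features, bias=False)
        self.A_hi = nn.Linear(in_features, rank_hi, bias=False)
        self.B_hi = nn.Linear(rank_hi, out_features, bias=False)
        # Per-token scalar gate in [0, 1] blending the two ranks
        self.gate = nn.Linear(in_features, 1)
        nn.init.zeros_(self.B_lo.weight)  # standard LoRA init: B starts at zero,
        nn.init.zeros_(self.B_hi.weight)  # so the adapter is a no-op at step 0

    def forward(self, x):
        g = torch.sigmoid(self.gate(x))   # (batch, seq, 1), one gate per token
        lo = self.B_lo(self.A_lo(x))
        hi = self.B_hi(self.A_hi(x))
        return self.scale * (g * hi + (1 - g) * lo)

x = torch.randn(2, 8, 64)
layer = GatedDualRankLoRA(64, 64)
print(layer(x).shape)  # torch.Size([2, 8, 64])
```

The output would normally be added to the frozen base layer's output, as in standard LoRA.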

Memory & VRAM Management

  • MemoryMonitor — daemon-thread VRAM watchdog
  • VRAMGuard — automatic batch-size reduction + smart recovery
  • MorphicMemory™ — Markov prediction + tensor reincarnation
  • 2-bit NF2 quantization (no bitsandbytes required)
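
MemoryMonitor's actual API is not shown above. A minimal sketch of a daemon-thread VRAM watchdog of the kind described, with all names and parameters being illustrative assumptions:

```python
import threading
import time

import torch

def watch_vram(threshold=0.9, interval=1.0, on_high=None, stop_event=None):
    """Poll CUDA memory use in the background and fire a callback when usage
    crosses a threshold. (Illustrative sketch, not MemoryMonitor's real API.)"""
    def loop():
        while not (stop_event and stop_event.is_set()):
            if torch.cuda.is_available():
                used = torch.cuda.memory_allocated()
                total = torch.cuda.get_device_properties(0).total_memory
                if used / total > threshold and on_high:
                    # Callback can e.g. shrink the batch size or empty the cache
                    on_high(used / total)
            time.sleep(interval)

    t = threading.Thread(target=loop, daemon=True)  # daemon: dies with the process
    t.start()
    return t
```

A real watchdog would also need to coordinate with the training loop (e.g. via a shared flag) rather than acting mid-step.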

Speed & Compilation

  • Accelerator — gradient accumulation, AMP, gradient clipping, fused optimizer
  • CrystalCore™ — runtime kernel crystallization
  • TritonKernels — RMSNorm and SwiGLU Triton fallbacks
  • torch.compile — fullgraph + cache support
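
The Accelerator bullet bundles standard training-loop machinery. A plain-PyTorch sketch of gradient accumulation with clipping, showing what such a wrapper typically does under the hood (this is not the package's Accelerator API):

```python
import torch
import torch.nn as nn

model = nn.Linear(8, 1)
opt = torch.optim.AdamW(model.parameters(), lr=1e-3)
accum_steps = 4
# Toy data: 8 micro-batches of (input, target) pairs
data = [(torch.randn(2, 8), torch.randn(2, 1)) for _ in range(8)]

for step, (x, y) in enumerate(data):
    # Divide by accum_steps so the summed gradients average over micro-batches
    loss = nn.functional.mse_loss(model(x), y) / accum_steps
    loss.backward()  # gradients accumulate in .grad across micro-batches
    if (step + 1) % accum_steps == 0:
        torch.nn.utils.clip_grad_norm_(model.parameters(), 1.0)  # gradient clipping
        opt.step()
        opt.zero_grad()
```

AMP would wrap the forward pass in `torch.autocast` and scale the loss with `torch.cuda.amp.GradScaler`; that part is omitted here for brevity.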

Optimizer & Scheduler

  • EMA / SWA — Exponential/Stochastic Weight Averaging
  • SpectraOptimizer™ — frekans-domain adaptive optimizer
  • ResonanceScheduler™ — gradient-spectrum self-tuning LR
  • WarmupCosineScheduler / WarmupLinearScheduler
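
To make the EMA bullet concrete, here is a generic exponential-moving-average of model weights; this is a sketch of the technique itself, not necessarily this package's EMA API:

```python
import copy

import torch
import torch.nn as nn

class EMA:
    """Keep a shadow copy of the model whose weights are an exponential
    moving average of the live weights. (Generic sketch.)"""

    def __init__(self, model, decay=0.999):
        self.decay = decay
        self.shadow = copy.deepcopy(model).eval()
        for p in self.shadow.parameters():
            p.requires_grad_(False)

    @torch.no_grad()
    def update(self, model):
        # shadow = decay * shadow + (1 - decay) * live
        for s, p in zip(self.shadow.parameters(), model.parameters()):
            s.mul_(self.decay).add_(p, alpha=1 - self.decay)

model = nn.Linear(4, 4)
ema = EMA(model, decay=0.9)
# In a training loop, call ema.update(model) after each optimizer step,
# then evaluate or save ema.shadow instead of the raw model.
```

PyTorch also ships `torch.optim.swa_utils.AveragedModel`, which covers both SWA and EMA-style averaging.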

Gradient Refinement

  • GradientHarmonics™ — wavelet-based gradient processing
  • NeuralProfiler™ — LSTM-based OOM/explosion prediction
  • LossSpikeDetector — spike detection + LR intervention
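
A minimal spike detector along these lines, flagging a step whose loss is far above the recent average; this is illustrative only, and the package's LossSpikeDetector may use different thresholds and interventions:

```python
from collections import deque

class LossSpikeDetector:
    """Flag a loss spike when the latest loss exceeds the recent mean by a
    factor. (Illustrative sketch, not the package's implementation.)"""

    def __init__(self, window=20, factor=2.0):
        self.history = deque(maxlen=window)
        self.factor = factor

    def step(self, loss):
        # Only judge once we have a few points of history
        spike = (
            len(self.history) >= 5
            and loss > self.factor * (sum(self.history) / len(self.history))
        )
        self.history.append(loss)
        return spike  # caller can e.g. cut the LR or skip this update

det = LossSpikeDetector()
for l in [1.0, 0.9, 0.95, 0.88, 0.9, 0.87]:
    det.step(l)
print(det.step(5.0))  # True: well above the recent average
```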

Quick Start

LoRA Layer Only

from omg_hybridomga import HybridConfig, apply_hybrid_lora

cfg = HybridConfig(method="omga", rank=16, alpha=32)
model = apply_hybrid_lora(model, cfg)  # `model` is any already-loaded PyTorch/HF model

Full Training Engine

from omg_hybridomga import HybridOMGa

engine = HybridOMGa(
    "meta-llama/Llama-3.2-3B",
    rank=16,
    method="omga",
    ema=True,
    swa=True,
    crystal_core=True,
    priority_prop=True,
)

model, tokenizer = engine.load()
trainer = engine.get_trainer(dataset)  # `dataset`: your training data, e.g. a Hugging Face datasets object
engine.train(trainer)

Save / Load LoRA

from omg_hybridomga import save_hybrid_lora, load_hybrid_lora

save_hybrid_lora(model, "./my_lora_weights")
model = load_hybrid_lora(model, "./my_lora_weights")

Environment Check

from omg_hybridomga import check_environment
check_environment()
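
The exact report `check_environment` prints is not shown here. As a hedged sketch, such a check typically gathers version and hardware facts like the following; the function name and fields below are assumptions, not the package's output:

```python
import importlib.util
import sys

import torch

def check_env():
    """Minimal environment report (illustrative; check_environment's real
    output may differ)."""
    report = {
        "python": sys.version.split()[0],
        "torch": torch.__version__,
        "cuda_available": torch.cuda.is_available(),
        # Optional dependency probe without importing it
        "bitsandbytes": importlib.util.find_spec("bitsandbytes") is not None,
    }
    for key, value in report.items():
        print(f"{key}: {value}")
    return report

check_env()
```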

Optional Dependencies

Group | Install | Contents
full | pip install "omg-hybridomga[full]" | bitsandbytes, peft, trl, datasets
triton | pip install "omg-hybridomga[triton]" | Triton kernel support
flash | pip install "omg-hybridomga[flash]" | Flash Attention 2
deepspeed | pip install "omg-hybridomga[deepspeed]" | DeepSpeed ZeRO
all | pip install "omg-hybridomga[all]" | Everything above

Requirements

  • Python ≥ 3.9
  • PyTorch ≥ 2.1.0
  • transformers ≥ 4.40.0
  • accelerate ≥ 0.27.0

License

Apache 2.0 — see the LICENSE file for details.


