# NexusTrain ⚡

**The World's First Self-Crystallizing Neural Training Engine**

Author: Ömür Bera Işık | License: MIT | Python 3.9+

```bash
pip install nexustrain
pip install nexustrain[full]   # + HuggingFace ecosystem
pip install nexustrain[all]    # everything
```
## 10 Original Technologies
| Module | What it does |
|---|---|
| CrystalCore™ | Automatically crystallizes frequently executed operations at runtime (torch.compile) |
| MorphicMemory™ | Predicts allocations with a Markov chain and resurrects dead tensors |
| SpectraOptimizer™ | Applies an FFT to the gradient history and damps resonant frequencies |
| ResonanceScheduler™ | Automatically detects the training phase from gradient-spectrum entropy |
| ChromaticPrecision™ | Assigns each layer its own dtype (FP32/BF16/FP16) based on gradient variance |
| GradientHarmonics™ | Decomposes gradients with Haar wavelets and injects noise in the frequency domain |
| NeuralProfiler™ | Predicts OOMs and gradient explosions one step ahead with a small LSTM |
| CrystalPipeline™ | Dynamically tunes gradient accumulation to available VRAM; async checkpointing |
| ZeroWaste™ | Detects wasted operations and parameters receiving zero gradient |
| UniversalAdapter™ | Automatic compatibility with HF/TRL/PEFT/DeepSpeed/Accelerate/vanilla PyTorch |
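The spectral entries in the table can be made concrete with a standalone sketch. The following illustrates the general FFT-damping idea only, not NexusTrain's actual implementation: transform a sliding window of gradient values with a DFT, attenuate the dominant non-DC frequency, and transform back.

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (fine for small windows)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(spec):
    """Inverse DFT, returning real parts (input windows are real)."""
    n = len(spec)
    return [sum(spec[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)).real / n
            for t in range(n)]

def damp_resonance(history, damping=0.5):
    """Attenuate the strongest non-DC frequency in a gradient-history window."""
    spec = dft(history)
    n = len(spec)
    k = max(range(1, n), key=lambda i: abs(spec[i]))  # dominant oscillation bin
    spec[k] *= 1.0 - damping
    if k != n - k:              # also damp the mirror bin so the output stays real
        spec[n - k] *= 1.0 - damping
    return idft(spec)

# A gradient that flips sign every step is a pure oscillation; damping halves it.
print([round(g, 3) for g in damp_resonance([1.0, -1.0] * 4, damping=0.5)])
# → [0.5, -0.5, 0.5, -0.5, 0.5, -0.5, 0.5, -0.5]
```

A real optimizer would keep one such window per parameter group and fold the smoothed gradient back into the update step; the sketch shows only the frequency-domain damping itself.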
## Quick Start

### With HuggingFace / TRL
```python
from nexustrain import NexusTrain, NexusConfig
from trl import SFTTrainer, SFTConfig
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model
model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-3.2-1B")
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.2-1B")

# Configure NexusTrain
cfg = NexusConfig(power=1.0)
nt = NexusTrain(model, tokenizer, cfg)

# Create the trainer and attach NexusTrain
trainer = SFTTrainer(model=model, tokenizer=tokenizer,
                     train_dataset=dataset, args=SFTConfig(...))
trainer = nt.patch_trainer(trainer)
nt.engage()
trainer.train()
nt.summary()
```
### One-Line Wrapper

```python
from nexustrain import nexus_wrap

nt, model = nexus_wrap(model, tokenizer, power=1.0)
trainer = nt.patch_trainer(trainer)
nt.engage()
```
### SpectraOptimizer Only

```python
from nexustrain import SpectraOptimizer

optimizer = SpectraOptimizer(
    model.parameters(),
    lr=2e-4,
    spectral_damping=0.1,
)
```
### ResonanceScheduler Only

```python
from nexustrain import SpectraOptimizer, ResonanceScheduler, NexusConfig

opt = SpectraOptimizer(model.parameters(), lr=2e-4)
cfg = NexusConfig(base_lr=2e-4, warmup_steps=100)
sched = ResonanceScheduler(opt, cfg)

# On each training step
sched.step(loss=loss.item(), grad_norm=gnorm)
```
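The README says ResonanceScheduler detects the training phase from gradient-spectrum entropy. As a rough illustration of that idea (made-up code, not the library's internals): a spectrum dominated by one oscillation scores low entropy, while noise-like gradients spread energy across bins and score high.

```python
import cmath
import math
import random

def spectral_entropy(signal):
    """Shannon entropy of the normalized magnitude spectrum (DC bin excluded)."""
    n = len(signal)
    mags = [abs(sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(1, n)]
    total = sum(mags)
    if total == 0:
        return 0.0
    probs = [m / total for m in mags if m > 0]
    return -sum(p * math.log(p) for p in probs)

# A sign-flipping gradient norm concentrates energy in one bin (low entropy);
# noisy gradients spread it across all bins (high entropy).
random.seed(0)
oscillating = [(-1.0) ** t for t in range(16)]
noisy = [random.gauss(0.0, 1.0) for _ in range(16)]
print(spectral_entropy(oscillating) < spectral_entropy(noisy))  # True
```

A scheduler built on this signal could, for instance, lower the learning rate when entropy drops (a dominant oscillation suggests instability); the thresholds NexusTrain actually uses are not documented here.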
## Configuration

```python
cfg = NexusConfig(
    power = 1.0,               # 0.0–1.0 global power
    # Speed
    crystal_core = True,
    morphic_memory = True,
    crystal_pipeline = True,
    zero_waste = True,
    # Intelligence
    spectra_optimizer = True,
    resonance_scheduler = True,
    chromatic_precision = True,
    gradient_harmonics = True,
    harmonic_noise = 0.001,    # for generalization
    harmonic_compress = 0.0,   # for distributed training
    # Monitoring
    neural_profiler = True,
    profiler_alert_ms = 500.0,
    # LR
    base_lr = 2e-4,
    min_lr = 1e-6,
    warmup_steps = 100,
    resonance_sensitivity = 0.5,
    # General
    device = "auto",           # cuda/cpu auto-detected
    dtype = "auto",            # bf16/fp16/fp32 auto-selected
    seed = 42,
    output_dir = "./nexus_output",
)
```
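How the global `power` knob interacts with the individual settings is not documented. One plausible interpretation (purely an assumption for illustration, not NexusTrain's actual behavior) is that `power` linearly scales every continuous knob after being clamped to [0, 1]:

```python
def effective_settings(power, harmonic_noise=0.001, spectral_damping=0.1):
    """Hypothetical helper: scale continuous knobs by a clamped global power."""
    p = max(0.0, min(1.0, power))
    return {
        "harmonic_noise": harmonic_noise * p,
        "spectral_damping": spectral_damping * p,
    }

print(effective_settings(0.5)["harmonic_noise"])    # 0.0005
print(effective_settings(2.0)["spectral_damping"])  # 0.1 (power clamped to 1.0)
```

Both `effective_settings` and the linear-scaling rule are invented for this sketch; consult the package source for the real semantics.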
## Environment Check

```bash
nexustrain-check
```

or from Python:

```python
from nexustrain import nexus_check

nexus_check()
```
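What `nexustrain-check` actually inspects is not documented here. A minimal environment probe in the same spirit (an illustrative sketch with a made-up function name, not the packaged command) could look like:

```python
import importlib.util
import sys

def check_environment(min_python=(3, 9)):
    """Report whether the interpreter and optional dependencies look usable."""
    return {
        "python_ok": sys.version_info[:2] >= min_python,
        "torch_installed": importlib.util.find_spec("torch") is not None,
        "transformers_installed": importlib.util.find_spec("transformers") is not None,
    }

for name, ok in check_environment().items():
    print(f"{name}: {'OK' if ok else 'missing'}")
```

Using `importlib.util.find_spec` keeps the probe cheap: it checks importability without actually importing heavy packages like torch.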
## Installation Options

```bash
pip install nexustrain                # only PyTorch required
pip install nexustrain[full]          # + transformers, peft, trl, bitsandbytes
pip install nexustrain[speed]         # + flash-attn, triton
pip install nexustrain[distributed]   # + deepspeed, accelerate
pip install nexustrain[tuning]        # + optuna
pip install nexustrain[logging]       # + wandb, tensorboard
pip install nexustrain[all]           # everything
pip install nexustrain[dev]           # developer tools
```
## Testing

```bash
pip install nexustrain[dev]
pytest tests/ -v
```
## License
MIT © 2025 Ömür Bera Işık — github.com/fastloraoffical
## Download Files

**Source Distribution:** nexustrain-1.0.0.tar.gz (30.7 kB)

**Built Distribution:** nexustrain-1.0.0-py3-none-any.whl (34.3 kB)
## File Details

### nexustrain-1.0.0.tar.gz

- Size: 30.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.7

| Algorithm | Hash digest |
|---|---|
| SHA256 | c42ab33292d2b6fb95d841770027e2e54c368dea9f59c55c119e0b867b026e04 |
| MD5 | bc10a6ae394394eb9899e0529fba4b6e |
| BLAKE2b-256 | a71454f9bab95d18ab2d0cae891586d520284fb178dd0bcac04c052b569add63 |
### nexustrain-1.0.0-py3-none-any.whl

- Size: 34.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.7

| Algorithm | Hash digest |
|---|---|
| SHA256 | 22ff66f7057cdb664013f4b09feb715935b966e312dadbee3aa7136ee0b41152 |
| MD5 | 4b594c6696ed2fd93b37010ee71c4d59 |
| BLAKE2b-256 | 6aeb3874118b583dd7362156429e00eda122f614625abab5638e244ecafa71b9 |