The Most Powerful AI Framework: Build & Fine-Tune ChatGPT-Level Models - CPU-Optimized, LoRA/QLoRA, Production-Ready
# NAPLY - The Most Powerful AI Framework

Build ChatGPT-level AI models from scratch. CPU-optimized. Production-ready. Lightweight.

```shell
pip install naply --upgrade
```
## Quick Start - Just 2 Lines!

### Standard Model

```python
import naply

# Train your AI (1 line!)
model = naply.Model("medium")
model.train("my_data/", epochs=5)

# Chat with your AI (1 line!)
response = model.chat("Hello! How are you?")
```
### Powerful Model (All 10 Methods!)

```python
from naply.powerful_text_model import PowerfulTextModel

# The most powerful model, with all 10 training methods
model = PowerfulTextModel("medium")
model.train("my_data/", epochs=10, method="all")  # Uses all 10 methods!

# Natural language generation (no gibberish!)
response = model.chat("Hello! How are you?")
```

That's it! Super simple!
### Even Simpler - 1 Function!

```python
import naply.easy as ai

# Train and chat in one go, then just type your questions!
ai.quick_start("my_data/", epochs=5)
```
## NEW: Fine-tuning Framework (v4.3.0)

Fine-tune any AI model (LLaMA, GPT, Unsloth, etc.) on your own data with LoRA/QLoRA efficiency.

### One-Liner Fine-tuning

```python
import naply

# Fine-tune a base model on your data (auto-saves to "finetune_output").
# Supported formats: jsonl, csv, txt, parquet, and folders.
model = naply.finetune("unsloth/llama-3.2-1B", "my_data.jsonl", epochs=3)

# Chat with it
response = naply.chat(model, "Hello!")
```
### Advanced Control (LoRA & QLoRA)

```python
import naply

model = naply.load_for_finetune("small")  # or "path/to/model"
naply.add_lora(model, rank=8, alpha=16)
naply.train_lora(model, "data.csv", epochs=5)
naply.merge_and_save(model, "final_model")
```
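The low-rank idea behind `add_lora` can be illustrated independently of naply: a frozen weight matrix `W` gets a trainable correction `B·A` scaled by `alpha / rank`. The NumPy sketch below uses the same `rank=8, alpha=16` as the example above; it is a generic illustration of the LoRA math, not naply's internal implementation, and the dimensions are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, rank, alpha = 64, 64, 8, 16

W = rng.normal(size=(d_out, d_in))        # frozen base weight
A = 0.01 * rng.normal(size=(rank, d_in))  # trainable down-projection
B = np.zeros((d_out, rank))               # trainable up-projection, zero-init

def lora_forward(x):
    # frozen path plus low-rank update, scaled by alpha / rank
    return x @ W.T + (alpha / rank) * (x @ A.T @ B.T)

x = rng.normal(size=(4, d_in))
# with B zero-initialized, the adapted model starts identical to the base model
assert np.allclose(lora_forward(x), x @ W.T)
```

Only `A` and `B` train (here 1,024 values versus 4,096 in `W`), which is why LoRA fine-tuning is cheap; `merge_and_save` then corresponds to folding `(alpha / rank) * B @ A` back into `W`.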
## Why NAPLY?

| Feature | NAPLY | PyTorch | TensorFlow | Transformers |
|---|---|---|---|---|
| From Scratch | ✅ | ❌ | ❌ | ❌ |
| CPU Optimized | ✅ | ⚠️ | ⚠️ | ⚠️ |
| Lightweight | ✅ (2MB) | ❌ (500MB+) | ❌ (1GB+) | ❌ (200MB+) |
| No Dependencies | ✅ | ❌ | ❌ | ❌ |
| 10 Training Methods | ✅ | ❌ | ❌ | ❌ |
| Micro-Specialists | ✅ (v4.0) | ❌ | ❌ | ❌ |
| ChatGPT-Level | ✅ | ⚠️ | ⚠️ | ⚠️ |
| Natural English | ✅ | ⚠️ | ⚠️ | ⚠️ |
## Key Features

### Most Powerful Framework (v4.0 Octane)

- 6-Specialist Cluster: Syntax, Lexicon, Logic, Thought, Code, and Tech micro-specialists.
- 10 Advanced Training Methods: CRC, DCL, ILC, MCU, P3, PPL, PTL, RDL, S3L, SGL.
- Extreme Intelligence: 32,000-entry BPE vocabulary for deep semantic understanding.
- Parallel Training (PTL): Multi-threaded CPU execution for 2x faster learning.
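A BPE vocabulary like the 32,000-entry one mentioned above is built by repeatedly merging the most frequent adjacent symbol pair in the training corpus. Here is a minimal sketch of a single merge step; it is a generic illustration of the BPE algorithm, not naply's tokenizer.

```python
from collections import Counter

def most_frequent_pair(corpus):
    # corpus: list of symbol sequences (e.g. words split into characters)
    pairs = Counter()
    for word in corpus:
        for a, b in zip(word, word[1:]):
            pairs[(a, b)] += 1
    return max(pairs, key=pairs.get)

def merge_pair(corpus, pair):
    merged = pair[0] + pair[1]
    out = []
    for word in corpus:
        new_word, i = [], 0
        while i < len(word):
            if i + 1 < len(word) and (word[i], word[i + 1]) == pair:
                new_word.append(merged)  # fuse the pair into one symbol
                i += 2
            else:
                new_word.append(word[i])
                i += 1
        out.append(new_word)
    return out

corpus = [list("lower"), list("lowest"), list("low")]
pair = most_frequent_pair(corpus)   # ("l", "o"), seen in all three words
corpus = merge_pair(corpus, pair)
```

Repeating this loop until the vocabulary reaches the target size (32,000 here) yields the final merge table.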
### Lightweight & Fast

- 2MB package size (vs. 500MB+ for PyTorch)
- CPU-optimized - runs on any laptop
- No GPU required - pure NumPy implementation
- Fast inference - optimized tokenization
### Production-Ready

- PyTorch-like API - familiar interface
- Automatic differentiation - full autograd engine
- Checkpointing - save/resume training
- CLI tools - `naply train`, `naply chat`
- Standalone - works after training without datasets
### Developer-Friendly

- No limits - build models of any size (10M to 1B+ params)
- Any dataset - supports .txt, .json, .jsonl, .csv, and folders
- Easy to use - 3 lines of code to get started
- Well documented - comprehensive guides and examples
## Installation

```shell
pip install naply
```

That's it! Now use it super simply:

```python
import naply.easy as ai

# Build AI in 1 line!
model = ai.train("my_data/", epochs=5)

# Chat in 1 line!
response = ai.chat("Hello!")
```
### Install from Source (Development)

```shell
git clone https://github.com/naply-ai/naply.git
cd naply
pip install -e .
```
## Super Simple API

### The Easiest Way (Recommended)

```python
import naply.easy as ai

# Train (1 line!)
model = ai.train("my_data/", epochs=10)

# Chat (1 line!)
response = ai.chat("Hello!")

# Build (1 line!)
model = ai.build("medium")
```
### Advanced API (More Control)

```python
import naply

# Create model
model = naply.Model("medium")

# Train
model.train("my_data/", epochs=10)

# Chat
model.chat("Hello!")
```
## Model Architectures

### Using Presets

```python
import naply

# Quick presets - choose your size
model = naply.Model("tiny")    # ~1M params, fast testing
model = naply.Model("small")   # ~10M params
model = naply.Model("medium")  # ~50M params
model = naply.Model("large")   # ~100M params
model = naply.Model("xl")      # ~300M params
model = naply.Model("xxl")     # ~1B params
```
### Custom Architecture (No Limits!)

```python
import naply

# Build exactly what you want
model = naply.Model(
    layers=24,         # Number of transformer layers
    heads=16,          # Attention heads
    embedding=1024,    # Embedding dimension
    vocab_size=50000,  # Vocabulary size
    context=4096,      # Context length
)
```
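For sizing a custom architecture, a rough GPT-style rule of thumb helps: token and position embeddings contribute about `vocab_size * d + context * d` parameters, and each transformer block roughly `12 * d²` (4·d² for the attention projections, 8·d² for the MLP; biases and norms ignored, and the head count does not change the total). A quick estimate for the example configuration above:

```python
def approx_params(layers, embedding, vocab_size, context):
    # GPT-style estimate: token + position embeddings, plus ~12*d^2 per block
    # (4*d^2 attention projections, 8*d^2 MLP); biases and norms ignored
    emb = vocab_size * embedding + context * embedding
    return emb + layers * 12 * embedding ** 2

n = approx_params(layers=24, embedding=1024, vocab_size=50000, context=4096)
print(f"~{n / 1e6:.0f}M parameters")  # ~357M parameters
```

This is an approximation, not naply's exact count; actual totals vary with implementation details such as tied embeddings and bias terms.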
## Training Methods
NAPLY includes 10 advanced training methods for optimal learning:
- CRC - Consistency-Retention Compression
- DCL - Domain-Constrained Learning
- ILC - Incremental Learning Consolidation
- MCU - Memory Consolidation Unit
- P3 - Parallel Pipelined Processing
- PPL - Progressive Prompt Learning
- PTL - Parallel Training and Learning (Multi-threaded CPU)
- RDL - Recursive Data Learning
- S3L - Structured Selective Stabilized Learning
- SGL - Sparse Gradient Learning
```python
import naply

model = naply.Model("medium")

# Use an advanced training method: parallel (multi-threaded) training
trainer = naply.PTLTrainer(model, num_workers=4)
trainer.train(data_loader, epochs=10)  # data_loader: your prepared dataset
```
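PTL's internals are not documented here, but multi-threaded CPU training is commonly realized as data parallelism: each worker thread computes a gradient on its own shard of the data, and the averaged gradient drives the update. A generic sketch on a least-squares toy problem (all names and numbers are illustrative, not naply's implementation):

```python
import threading
import numpy as np

def worker_grad(w, X, y, grads, i):
    # least-squares gradient on this worker's shard: 2/n * X^T (X w - y)
    grads[i] = 2.0 * X.T @ (X @ w - y) / len(y)

def parallel_step(w, shards, lr=0.1):
    grads = [None] * len(shards)
    threads = [threading.Thread(target=worker_grad, args=(w, X, y, grads, i))
               for i, (X, y) in enumerate(shards)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return w - lr * np.mean(grads, axis=0)  # average the worker gradients

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(64, 3))
y = X @ true_w + 0.1 * rng.normal(size=64)

shards = [(X[i::4], y[i::4]) for i in range(4)]  # one shard per worker thread
w = np.zeros(3)
for _ in range(50):
    w = parallel_step(w, shards)
```

Because NumPy releases the GIL inside large matrix operations, thread-level parallelism like this can genuinely overlap compute on CPU.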
## ChatGPT-Level Chat Interface

```python
import naply

# Load trained model
model = naply.Model.load("my_model/")

# Chat with natural English
response = model.chat(
    "Hello! How are you?",
    max_tokens=200,
    temperature=0.8,
    top_k=50,
    top_p=0.95,
)
print(response)
# Output: "Hello! I'm doing well, thank you for asking. How can I help you today?"
```
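The `temperature`, `top_k`, and `top_p` knobs above are standard decoding controls: temperature rescales the logits, top-k keeps only the k highest-scoring tokens, and top-p (nucleus sampling) keeps the smallest set of tokens whose cumulative probability reaches p. A generic NumPy sketch of one sampling step (an illustration of the technique, not naply's implementation):

```python
import numpy as np

def sample(logits, temperature=0.8, top_k=50, top_p=0.95, rng=None):
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=float) / temperature
    # top-k: drop everything outside the k highest-scoring tokens
    if top_k < len(logits):
        cutoff = np.sort(logits)[-top_k]
        logits = np.where(logits >= cutoff, logits, -np.inf)
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    # top-p (nucleus): in descending-probability order, keep each token whose
    # preceding cumulative mass is still below p
    order = np.argsort(probs)[::-1]
    csum = np.cumsum(probs[order])
    keep = (csum - probs[order]) < top_p
    mask = np.zeros(len(probs), dtype=bool)
    mask[order[keep]] = True
    probs = np.where(mask, probs, 0.0)
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))
```

Lower temperatures make the distribution sharper (more deterministic); tighter `top_k`/`top_p` cut off the unlikely tail that produces gibberish.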
### Interactive Chat

```python
# Start an interactive chat session
model.chat_interactive()
```
## Advanced Features

### Automatic Differentiation

```python
from naply import Tensor

x = Tensor([1.0, 2.0, 3.0], requires_grad=True)
y = x * 2
z = y.sum()
z.backward()
print(x.grad)  # [2.0, 2.0, 2.0]
```
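A reverse-mode engine sufficient for the example above fits in a few lines. This toy `Tensor` supports only scalar multiplication and `sum`, records each op's local backward rule, and sweeps the graph in reverse; it is a minimal sketch of how autograd works in general, not naply's actual engine.

```python
import numpy as np

class Tensor:
    def __init__(self, data, requires_grad=False, _parents=(), _backward=None):
        self.data = np.asarray(data, dtype=float)
        self.grad = None
        self.requires_grad = requires_grad
        self._parents, self._backward = _parents, _backward

    def __mul__(self, k):
        # scalar multiply: d(out)/d(self) = k
        return Tensor(self.data * k, True, (self,), lambda g: (g * k,))

    def sum(self):
        # sum: gradient broadcasts back to every input element
        return Tensor(self.data.sum(), True, (self,),
                      lambda g: (g * np.ones_like(self.data),))

    def backward(self):
        # reverse-mode sweep over the graph recorded above
        self.grad = np.ones_like(self.data)
        stack = [self]
        while stack:
            t = stack.pop()
            if t._backward is None:
                continue
            for parent, g in zip(t._parents, t._backward(t.grad)):
                parent.grad = g if parent.grad is None else parent.grad + g
                stack.append(parent)

x = Tensor([1.0, 2.0, 3.0], requires_grad=True)
(x * 2).sum().backward()
print(x.grad)  # [2. 2. 2.]
```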
### PyTorch-Style API

```python
import naply.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(768, 768)
        self.norm = nn.LayerNorm(768)

    def forward(self, x):
        return self.norm(self.linear(x))
```
### Optimizers & Schedulers

```python
from naply.optim import AdamW, OneCycleScheduler

optimizer = AdamW(model.parameters(), lr=1e-4)
scheduler = OneCycleScheduler(optimizer, max_lr=1e-3, total_steps=1000)
```
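A one-cycle schedule ramps the learning rate up to `max_lr` and then anneals it back down over the run. The simplified linear variant below shows the shape (many implementations use cosine segments instead, and `pct_start`/`div_factor` are illustrative parameter names, not necessarily naply's):

```python
def one_cycle_lr(step, max_lr, total_steps, pct_start=0.3, div_factor=25.0):
    # linear warmup from max_lr/div_factor up to max_lr, then linear decay back
    base_lr = max_lr / div_factor
    up = int(total_steps * pct_start)
    if step < up:
        return base_lr + (max_lr - base_lr) * step / up
    frac = (step - up) / (total_steps - up)
    return max_lr - (max_lr - base_lr) * frac

lrs = [one_cycle_lr(s, max_lr=1e-3, total_steps=1000) for s in range(1000)]
```

The warmup phase stabilizes early training; the long decay lets the model settle into a minimum.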
### Checkpointing

```python
from naply.utils import Checkpoint

ckpt = Checkpoint("checkpoints/")
ckpt.save(model, optimizer, epoch=5, loss=0.5)

# Resume training
metadata = ckpt.load_latest(model, optimizer)
```
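The same save/resume pattern is easy to reproduce with plain NumPy and JSON if you are curious how it works. The file layout below (`ckpt_<epoch>.npz` for weights plus a JSON sidecar for metadata) is a made-up convention for illustration, not naply's on-disk format:

```python
import json
import os
import numpy as np

def save_checkpoint(dirpath, params, epoch, loss):
    # weights go into an .npz archive, metadata into a JSON sidecar
    os.makedirs(dirpath, exist_ok=True)
    np.savez(os.path.join(dirpath, f"ckpt_{epoch}.npz"), **params)
    with open(os.path.join(dirpath, f"ckpt_{epoch}.json"), "w") as f:
        json.dump({"epoch": epoch, "loss": loss}, f)

def load_latest(dirpath):
    # find the highest epoch number among saved checkpoints
    epochs = sorted(int(name.split("_")[1].split(".")[0])
                    for name in os.listdir(dirpath) if name.endswith(".npz"))
    latest = epochs[-1]
    params = dict(np.load(os.path.join(dirpath, f"ckpt_{latest}.npz")))
    with open(os.path.join(dirpath, f"ckpt_{latest}.json")) as f:
        meta = json.load(f)
    return params, meta
```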
## Examples

### Train Your First AI

```python
import naply

# Create model
model = naply.Model("small")

# Train on your data
model.train("my_data/", epochs=10, batch_size=32)

# Save model
model.save("my_ai_model/")

# Chat
print(model.chat("Hello!"))
```
### Power AI Training (ChatGPT-Level)

```shell
python train_power_ai.py --data "final datasets/" --foundation "foundation_corpus/" --epochs 15
```

### Chat with Power AI

```shell
python chat_power_ai.py --model "final datasets/power_ai_model"
```
## Use Cases
- Build custom AI models from scratch
- Train on your own data (any format)
- Create ChatGPT-like assistants
- Educational purposes - Learn how AI works
- Research - Experiment with new architectures
- Production - Deploy lightweight AI models
## Contributing

Contributions are welcome! Please see our Contributing Guide.

## License

MIT License - see the LICENSE file for details.

## Acknowledgments

Built with ❤️ for the AI community. Inspired by PyTorch, Transformers, and the open-source spirit.

## Star Us!

If you find NAPLY useful, please star us on GitHub!

```shell
# Install and try it now!
pip install naply
```

Build the future of AI. One line at a time.
## File details

Details for the file naply-4.3.1.tar.gz.

### File metadata

- Download URL: naply-4.3.1.tar.gz
- Upload date:
- Size: 118.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.0

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 77b76a789ed93190036fd0851b683926c09de4e442f56af304c73b72bc5bf553 |
| MD5 | eb6570250b7051a05dcfef50531b4074 |
| BLAKE2b-256 | 5bc848bb29397aadb0703d2c1db1a2c02f500129a7ee713e1f4d417b7a29e825 |
## File details

Details for the file naply-4.3.1-py3-none-any.whl.

### File metadata

- Download URL: naply-4.3.1-py3-none-any.whl
- Upload date:
- Size: 134.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.0

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | fdf5d07e6874a40335d28bb6c44c1eb93c0d53a64674f01624c919875582d12b |
| MD5 | d37ef6b7f364f3de88bf922e2a4cc2e8 |
| BLAKE2b-256 | 903b08f5ae75a74a0fcd45e6eea560b1979c849f182c930ac966de5153ddac3c |