
🚀 The Most Powerful AI Framework: Build & Fine-Tune ChatGPT-Level Models - CPU-Optimized, LoRA/QLoRA, Production-Ready

Project description

🚀 NAPLY - The Most Powerful AI Framework

Python 3.8+ | MIT License

Build ChatGPT-level AI models from scratch. CPU-optimized. Production-ready. Lightweight.

pip install naply --upgrade

⚡ Quick Start - Just 2 Lines!

Standard Model

import naply

# Train your AI (1 line!)
model = naply.Model("medium")
model.train("my_data/", epochs=5)

# Chat with your AI (1 line!)
response = model.chat("Hello! How are you?")

Powerful Model (All 10 Methods!)

from naply.powerful_text_model import PowerfulTextModel

# Most powerful model with all 10 training methods
model = PowerfulTextModel("medium")
model.train("my_data/", epochs=10, method="all")  # Uses all 10 methods!

# Natural language generation (no gibberish!)
response = model.chat("Hello! How are you?")

That's it! Super simple! 🎉

Even Simpler - 1 Function!

import naply.easy as ai

# Train and chat in one go!
ai.quick_start("my_data/", epochs=5)
# Then just type your questions!

🚀 NEW: Fine-tuning Framework (v4.3.0)

Fine-tune any AI model (LLaMA, GPT, Unsloth, etc.) on your own data with LoRA/QLoRA efficiency.

One-Liner Fine-tuning

import naply

# Fine-tune a base model on your data (auto-saves to "finetune_output")
# Supports: jsonl, csv, txt, parquet, folders
model = naply.finetune("unsloth/llama-3.2-1B", "my_data.jsonl", epochs=3)

# Chat with it
response = naply.chat(model, "Hello!")

Advanced Control (LoRA & QLoRA)

model = naply.load_for_finetune("small") # or "path/to/model"
naply.add_lora(model, rank=8, alpha=16)
naply.train_lora(model, "data.csv", epochs=5)
naply.merge_and_save(model, "final_model")
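
Once merged, the model can be used for inference like any other. Assuming naply.chat accepts the merged model object just as in the one-liner example above:

# Merged weights live in "final_model"; chat with the fine-tuned model
response = naply.chat(model, "How do I get started?")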

🌟 Why NAPLY?

| Feature | NAPLY | PyTorch | TensorFlow | Transformers |
|---|---|---|---|---|
| From Scratch | ✅ | ❌ | ❌ | ❌ |
| CPU Optimized | ✅ | ⚠️ | ⚠️ | ⚠️ |
| Lightweight | ✅ (2MB) | ❌ (500MB+) | ❌ (1GB+) | ❌ (200MB+) |
| No Dependencies | ✅ | ❌ | ❌ | ❌ |
| 10 Training Methods | ✅ | ❌ | ❌ | ❌ |
| Micro-Specialists | ✅ (v4.0) | ❌ | ❌ | ❌ |
| ChatGPT-Level | ✅ | ⚠️ | ⚠️ | ⚠️ |
| Natural English | ✅ | ⚠️ | ⚠️ | ⚠️ |

🎯 Key Features

🚀 Most Powerful Framework (v4.0 Octane)

  • 6-Specialist Cluster: Syntax, Lexicon, Logic, Thought, Code, and Tech micro-specialists.
  • 10 Advanced Training Methods: CRC, DCL, ILC, MCU, P3, PPL, PTL, RDL, S3L, SGL (single-method selection is sketched below).
  • Extreme Intelligence: a 32,000-token BPE vocabulary for deep semantic understanding.
  • Parallel Training (PTL): multi-threaded CPU execution for 2x faster learning.
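
The Quick Start above passes method="all". Assuming the same argument also accepts a single method name (an assumption; only "all" is shown in this README), picking one would look like this:

from naply.powerful_text_model import PowerfulTextModel

# Sketch: train with one method instead of all ten.
# method="all" is documented above; individual names like "ptl" are an assumption.
model = PowerfulTextModel("medium")
model.train("my_data/", epochs=10, method="ptl")  # parallel multi-threaded training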

⚡ Lightweight & Fast

  • 2MB package size (vs 500MB+ for PyTorch)
  • CPU-optimized - Runs on any laptop
  • No GPU required - Pure NumPy implementation
  • Fast inference - Optimized tokenization

🎨 Production-Ready

  • PyTorch-like API - Familiar interface
  • Automatic differentiation - Full autograd engine
  • Checkpointing - Save/resume training
  • CLI tools - naply train, naply chat (sketched below)
  • Standalone - Works after training without datasets
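
A minimal CLI session might look like the sketch below. Only the subcommand names appear in this README, so the flags are assumptions; check the CLI's built-in help for the real options.

# Hypothetical flags - only the `naply train` and `naply chat` subcommands are documented
naply train my_data/ --epochs 5
naply chat my_model/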

🔧 Developer-Friendly

  • No limits - Build models of any size (10M to 1B+ params)
  • Any dataset - Supports .txt, .json, .jsonl, .csv, folders (example below)
  • Easy to use - 3 lines of code to get started
  • Well documented - Comprehensive guides and examples
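
For example, the data path can be a single file or a whole folder; this sketch assumes file-type detection is automatic, per the bullet above:

import naply

model = naply.Model("small")
# Any supported format can be passed as the data path,
# e.g. "corpus.txt", "records.jsonl", "table.csv", or "my_data/".
model.train("records.jsonl", epochs=5)  # format detection assumed automatic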

📦 Installation

pip install naply

That's it! Now use it super simply:

import naply.easy as ai

# Build AI in 1 line!
model = ai.train("my_data/", epochs=5)

# Chat in 1 line!
response = ai.chat("Hello!")

Install from Source (Development)

git clone https://github.com/naply-ai/naply.git
cd naply
pip install -e .

🎯 Super Simple API

The Easiest Way (Recommended)

import naply.easy as ai

# Train (1 line!)
model = ai.train("my_data/", epochs=10)

# Chat (1 line!)
response = ai.chat("Hello!")

# Build (1 line!)
model = ai.build("medium")

Advanced API (More Control)

import naply

# Create model
model = naply.Model("medium")

# Train
model.train("my_data/", epochs=10)

# Chat
model.chat("Hello!")

๐Ÿ—๏ธ Model Architectures

Using Presets

import naply

# Quick presets - choose your size
model = naply.Model("tiny")    # ~1M params, fast testing
model = naply.Model("small")   # ~10M params
model = naply.Model("medium")  # ~50M params
model = naply.Model("large")   # ~100M params
model = naply.Model("xl")      # ~300M params
model = naply.Model("xxl")     # ~1B params

Custom Architecture (No Limits!)

import naply

# Build exactly what you want
model = naply.Model(
    layers=24,        # Number of transformer layers
    heads=16,         # Attention heads
    embedding=1024,   # Embedding dimension
    vocab_size=50000, # Vocabulary size
    context=4096      # Context length
)
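
As a rough sanity check on sizes like these, the standard transformer estimate (generic math, not a NAPLY API) is roughly 12 * layers * embedding^2 parameters in the transformer blocks plus vocab_size * embedding in the token table:

# Back-of-envelope parameter count for the custom model above (generic transformer math)
layers, embedding, vocab_size = 24, 1024, 50_000
block_params = 12 * layers * embedding ** 2  # attention + MLP weights: ~302M
embed_params = vocab_size * embedding        # token embedding table: ~51M
print(f"~{(block_params + embed_params) / 1e6:.0f}M parameters")  # ~353M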

🎓 Training Methods

NAPLY includes 10 advanced training methods for optimal learning:

  1. CRC - Consistency-Retention Compression
  2. DCL - Domain-Constrained Learning
  3. ILC - Incremental Learning Consolidation
  4. MCU - Memory Consolidation Unit
  5. P3 - Parallel Pipelined Processing
  6. PPL - Progressive Prompt Learning
  7. PTL - Parallel Training and Learning (Multi-threaded CPU)
  8. RDL - Recursive Data Learning
  9. S3L - Structured Selective Stabilized Learning
  10. SGL - Sparse Gradient Learning

For example, the parallel method (PTL) is exposed through a dedicated trainer:

import naply

model = naply.Model("medium")

# Use advanced training method
trainer = naply.PTLTrainer(model, num_workers=4)  # Parallel training
trainer.train(data_loader, epochs=10)  # data_loader: your batch iterator

💬 ChatGPT-Level Chat Interface

import naply

# Load trained model
model = naply.Model.load("my_model/")

# Chat with natural English
response = model.chat(
    "Hello! How are you?",
    max_tokens=200,
    temperature=0.8,
    top_k=50,
    top_p=0.95
)

print(response)
# Output: "Hello! I'm doing well, thank you for asking. How can I help you today?"

Interactive Chat

# Start interactive chat session
model.chat_interactive()

๐Ÿ› ๏ธ Advanced Features

Automatic Differentiation

import naply
from naply import Tensor

x = Tensor([1.0, 2.0, 3.0], requires_grad=True)
y = x * 2
z = y.sum()

z.backward()
print(x.grad)  # [2.0, 2.0, 2.0]

PyTorch-Style API

import naply.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(768, 768)
        self.norm = nn.LayerNorm(768)
    
    def forward(self, x):
        return self.norm(self.linear(x))

Optimizers & Schedulers

from naply.optim import AdamW, OneCycleScheduler

optimizer = AdamW(model.parameters(), lr=1e-4)
scheduler = OneCycleScheduler(optimizer, max_lr=1e-3, total_steps=1000)
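
Only the constructors are shown here; a typical step loop, assuming PyTorch-style zero_grad()/step() semantics on both objects, would look like this:

# Hypothetical step loop - zero_grad()/step() semantics are assumed, not documented here
for step in range(1000):
    optimizer.zero_grad()       # clear accumulated gradients
    loss = compute_loss(model)  # placeholder: your forward pass + loss computation
    loss.backward()             # autograd backward pass (see the Tensor example above)
    optimizer.step()            # apply the AdamW update
    scheduler.step()            # advance the one-cycle learning-rate schedule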

Checkpointing

from naply.utils import Checkpoint

ckpt = Checkpoint("checkpoints/")
ckpt.save(model, optimizer, epoch=5, loss=0.5)

# Resume training
metadata = ckpt.load_latest(model, optimizer)

📚 Examples

Train Your First AI

import naply

# Create model
model = naply.Model("small")

# Train on your data
model.train("my_data/", epochs=10, batch_size=32)

# Save model
model.save("my_ai_model/")

# Chat
print(model.chat("Hello!"))

Power AI Training (ChatGPT-Level)

python train_power_ai.py --data "final datasets/" --foundation "foundation_corpus/" --epochs 15

Chat with Power AI

python chat_power_ai.py --model "final datasets/power_ai_model"

🎯 Use Cases

  • Build custom AI models from scratch
  • Train on your own data (any format)
  • Create ChatGPT-like assistants
  • Educational purposes - Learn how AI works
  • Research - Experiment with new architectures
  • Production - Deploy lightweight AI models

📖 Documentation


๐Ÿค Contributing

Contributions are welcome! Please see our Contributing Guide.


📄 License

MIT License - see LICENSE file for details.


๐Ÿ™ Acknowledgments

Built with ❤️ for the AI community. Inspired by PyTorch, Transformers, and the open-source spirit.


โญ Star Us!

If you find NAPLY useful, please star us on GitHub!

# Install and try it now!
pip install naply

Build the future of AI. One line at a time. 🚀

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

naply-4.3.0.tar.gz (117.8 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

naply-4.3.0-py3-none-any.whl (134.5 kB)

Uploaded Python 3

File details

Details for the file naply-4.3.0.tar.gz.

File metadata

  • Download URL: naply-4.3.0.tar.gz
  • Upload date:
  • Size: 117.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.0

File hashes

Hashes for naply-4.3.0.tar.gz
| Algorithm | Hash digest |
|---|---|
| SHA256 | 979a554e7c2bf4faa4143f567bd7b9ff99bd5da18ec00999e8a4febb5a6ee45a |
| MD5 | b84dc60f0605df1fc0c3a81e943615ed |
| BLAKE2b-256 | b065a6b2371bbd90c2223fe8cd43e0df32c3e9293a0ebd5085adcc4292260c19 |

See more details on using hashes here.

File details

Details for the file naply-4.3.0-py3-none-any.whl.

File metadata

  • Download URL: naply-4.3.0-py3-none-any.whl
  • Upload date:
  • Size: 134.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.0

File hashes

Hashes for naply-4.3.0-py3-none-any.whl
| Algorithm | Hash digest |
|---|---|
| SHA256 | d360fd3b770ead3f084f79faf2110198f2847c6b9d7107bf268054698fdd6aea |
| MD5 | bd17701ba69faba01a517958e2d2f207 |
| BLAKE2b-256 | 3e8aeed212576d941df2cb5fb93f417d2bc56885ff44b4993bd79be4a9fd814e |

See more details on using hashes here.
