
🚀 NAPLY - The Most Powerful AI Framework

PyPI version Python 3.8+ License: MIT Downloads

Build ChatGPT-level AI models from scratch. CPU-optimized. Production-ready. Lightweight.

pip install naply --upgrade

⚡ Quick Start - Just 2 Lines!

Standard Model

import naply

# Train your AI (1 line!)
model = naply.Model("medium")
model.train("my_data/", epochs=5)

# Chat with your AI (1 line!)
response = model.chat("Hello! How are you?")

Powerful Model (All 10 Methods!)

from naply.powerful_text_model import PowerfulTextModel

# Most powerful model with all 10 training methods
model = PowerfulTextModel("medium")
model.train("my_data/", epochs=10, method="all")  # Uses all 10 methods!

# Natural language generation (no gibberish!)
response = model.chat("Hello! How are you?")

That's it! Super simple! 🎉

Even Simpler - 1 Function!

import naply.easy as ai

# Train and chat in one go!
ai.quick_start("my_data/", epochs=5)
# Then just type your questions!

🌟 Why NAPLY?

| Feature | NAPLY | PyTorch | TensorFlow | Transformers |
|---|---|---|---|---|
| From Scratch | ✅ | ❌ | ❌ | ❌ |
| CPU Optimized | ✅ | ⚠️ | ⚠️ | ⚠️ |
| Lightweight | ✅ (2MB) | ❌ (500MB+) | ❌ (1GB+) | ❌ (200MB+) |
| No Dependencies | ✅ | ❌ | ❌ | ❌ |
| 10 Training Methods | ✅ | ❌ | ❌ | ❌ |
| Micro-Specialists | ✅ (v4.0) | ❌ | ❌ | ❌ |
| ChatGPT-Level | ✅ | ⚠️ | ⚠️ | ⚠️ |
| Natural English | ✅ | ⚠️ | ⚠️ | ⚠️ |

🎯 Key Features

🚀 Most Powerful Framework (v4.0 Octane)

  • 6-Specialist Cluster: Syntax, Lexicon, Logic, Thought, Code, and Tech micro-specialists.
  • 10 Advanced Training Methods: CRC, DCL, ILC, MCU, P3, PPL, PTL, RDL, S3L, SGL.
  • Extreme Intelligence: 32,000 vocab (BPE) for deep semantic understanding.
  • Parallel Training (PTL): Multi-threaded CPU execution for 2x faster learning.
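
The 32,000-token BPE vocabulary above is built by repeatedly merging the most frequent adjacent symbol pair into a new token. A minimal, framework-agnostic sketch of one merge step (illustrative only, not NAPLY's actual tokenizer):

```python
from collections import Counter

def most_frequent_pair(words):
    """Count adjacent symbol pairs across the corpus and return the commonest."""
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return max(pairs, key=pairs.get)

def merge_pair(words, pair):
    """Rewrite every word, fusing each occurrence of `pair` into one symbol."""
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
                out.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = freq
    return merged

# Toy corpus: character-split word -> frequency
words = {("l", "o", "w"): 5, ("l", "o", "w", "e", "r"): 2, ("l", "o", "s", "t"): 4}
pair = most_frequent_pair(words)   # ("l", "o") is the most frequent pair here
words = merge_pair(words, pair)    # "lo" is now a single vocabulary symbol
```

Repeating this loop until the vocabulary reaches 32,000 symbols (base alphabet plus merges) yields a BPE vocabulary of that size.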

Lightweight & Fast

  • 2MB package size (vs 500MB+ for PyTorch)
  • CPU-optimized - Runs on any laptop
  • No GPU required - Pure NumPy implementation
  • Fast inference - Optimized tokenization

🎨 Production-Ready

  • PyTorch-like API - Familiar interface
  • Automatic differentiation - Full autograd engine
  • Checkpointing - Save/resume training
  • CLI tools - naply train, naply chat
  • Standalone - Works after training without datasets

🔧 Developer-Friendly

  • No limits - Build models of any size (10M to 1B+ params)
  • Any dataset - Supports .txt, .json, .jsonl, .csv, folders
  • Easy to use - A few lines of code to get started
  • Well documented - Comprehensive guides and examples
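
A folder of mixed-format files can be flattened into raw text samples along these lines (a rough sketch; it assumes JSON/JSONL/CSV records carry a "text" field, which may not match NAPLY's own loader):

```python
import csv
import json
from pathlib import Path

def load_texts(root):
    """Collect raw text samples from .txt, .json, .jsonl, and .csv files."""
    texts = []
    for path in sorted(Path(root).rglob("*")):
        if path.suffix == ".txt":
            texts.append(path.read_text(encoding="utf-8"))
        elif path.suffix == ".jsonl":
            for line in path.read_text(encoding="utf-8").splitlines():
                if line.strip():
                    texts.append(json.loads(line)["text"])
        elif path.suffix == ".json":
            texts.extend(item["text"] for item in json.loads(path.read_text(encoding="utf-8")))
        elif path.suffix == ".csv":
            with path.open(newline="", encoding="utf-8") as f:
                texts.extend(row["text"] for row in csv.DictReader(f))
    return texts
```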

📦 Installation

pip install naply

That's it! Now use it super simply:

import naply.easy as ai

# Build AI in 1 line!
model = ai.train("my_data/", epochs=5)

# Chat in 1 line!
response = ai.chat("Hello!")

Install from Source (Development)

git clone https://github.com/naply-ai/naply.git
cd naply
pip install -e .

🎯 Super Simple API

The Easiest Way (Recommended)

import naply.easy as ai

# Train (1 line!)
model = ai.train("my_data/", epochs=10)

# Chat (1 line!)
response = ai.chat("Hello!")

# Build (1 line!)
model = ai.build("medium")

Advanced API (More Control)

import naply

# Create model
model = naply.Model("medium")

# Train
model.train("my_data/", epochs=10)

# Chat
model.chat("Hello!")

🏗️ Model Architectures

Using Presets

import naply

# Quick presets - choose your size
model = naply.Model("tiny")    # ~1M params, fast testing
model = naply.Model("small")   # ~10M params
model = naply.Model("medium")  # ~50M params
model = naply.Model("large")   # ~100M params
model = naply.Model("xl")      # ~300M params
model = naply.Model("xxl")     # ~1B params

Custom Architecture (No Limits!)

import naply

# Build exactly what you want
model = naply.Model(
    layers=24,        # Number of transformer layers
    heads=16,         # Attention heads
    embedding=1024,   # Embedding dimension
    vocab_size=50000, # Vocabulary size
    context=4096      # Context length
)
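
Because attention heads only split the embedding dimension between them, head count does not change the parameter total; layers, embedding width, vocabulary, and context length drive model size. A rough GPT-style estimate (weights only, biases and layer-norm parameters ignored; not NAPLY's exact count):

```python
def estimate_params(layers, embedding, vocab_size, context):
    """Approximate parameter count for a GPT-style decoder stack."""
    token_emb = vocab_size * embedding         # token embedding table
    pos_emb = context * embedding              # learned positional embeddings
    attn = 4 * embedding * embedding           # Q, K, V, and output projections
    mlp = 2 * embedding * (4 * embedding)      # up- and down-projection, 4x width
    return token_emb + pos_emb + layers * (attn + mlp)

n = estimate_params(layers=24, embedding=1024, vocab_size=50000, context=4096)
print(f"~{n / 1e6:.0f}M parameters")  # ~357M for the configuration above
```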

🎓 Training Methods

NAPLY includes 10 advanced training methods for optimal learning:

  1. CRC - Consistency-Retention Compression
  2. DCL - Domain-Constrained Learning
  3. ILC - Incremental Learning Consolidation
  4. MCU - Memory Consolidation Unit
  5. P3 - Parallel Pipelined Processing
  6. PPL - Progressive Prompt Learning
  7. PTL - Parallel Training and Learning (Multi-threaded CPU)
  8. RDL - Recursive Data Learning
  9. S3L - Structured Selective Stabilized Learning
  10. SGL - Sparse Gradient Learning

Example: training with one of these methods via the PTL (parallel) trainer:

import naply

model = naply.Model("medium")

# Use advanced training method
trainer = naply.PTLTrainer(model, num_workers=4)  # Parallel training
trainer.train(data_loader, epochs=10)

💬 ChatGPT-Level Chat Interface

import naply

# Load trained model
model = naply.Model.load("my_model/")

# Chat with natural English
response = model.chat(
    "Hello! How are you?",
    max_tokens=200,
    temperature=0.8,
    top_k=50,
    top_p=0.95
)

print(response)
# Output: "Hello! I'm doing well, thank you for asking. How can I help you today?"
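
The temperature, top_k, and top_p arguments shape how the next token is drawn from the model's output distribution: temperature rescales the logits, top-k keeps only the k most likely tokens, and top-p (nucleus sampling) keeps the smallest set whose probabilities sum to at least p. A NumPy sketch of that pipeline (illustrative only, not NAPLY's internal sampler):

```python
import numpy as np

def sample_next(logits, temperature=0.8, top_k=50, top_p=0.95, rng=None):
    """Temperature-scale, top-k filter, nucleus filter, then sample one token id."""
    if rng is None:
        rng = np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64) / temperature
    # Top-k: discard everything outside the k largest logits
    if top_k and top_k < len(logits):
        cutoff = np.sort(logits)[-top_k]
        logits = np.where(logits >= cutoff, logits, -np.inf)
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    # Top-p: keep the smallest high-probability set with cumulative mass >= p
    order = np.argsort(probs)[::-1]
    cum = np.cumsum(probs[order])
    keep = order[: np.searchsorted(cum, top_p) + 1]
    mask = np.zeros_like(probs)
    mask[keep] = probs[keep]
    mask /= mask.sum()
    return int(rng.choice(len(probs), p=mask))
```

Lower temperature and smaller top_k/top_p make output more deterministic; higher values make it more varied.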

Interactive Chat

# Start interactive chat session
model.chat_interactive()

🛠️ Advanced Features

Automatic Differentiation

import naply
from naply import Tensor

x = Tensor([1.0, 2.0, 3.0], requires_grad=True)
y = x * 2
z = y.sum()

z.backward()
print(x.grad)  # [2.0, 2.0, 2.0]

PyTorch-Style API

import naply.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(768, 768)
        self.norm = nn.LayerNorm(768)
    
    def forward(self, x):
        return self.norm(self.linear(x))
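
For intuition, the same forward pass in plain NumPy, with layer norm applied over the feature axis (random weights, no learned scale/shift; a conceptual sketch, not the naply.nn internals):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(scale=0.02, size=(768, 768))  # linear weight
b = np.zeros(768)                            # linear bias

def forward(x, eps=1e-5):
    """Linear projection, then layer norm over the last axis (as in MyModel)."""
    h = x @ W + b
    mean = h.mean(axis=-1, keepdims=True)
    var = h.var(axis=-1, keepdims=True)
    return (h - mean) / np.sqrt(var + eps)

x = rng.normal(size=(4, 768))   # batch of 4 vectors
y = forward(x)                  # each row now has mean ~0 and variance ~1
```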

Optimizers & Schedulers

from naply.optim import AdamW, OneCycleScheduler

optimizer = AdamW(model.parameters(), lr=1e-4)
scheduler = OneCycleScheduler(optimizer, max_lr=1e-3, total_steps=1000)
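
A one-cycle schedule ramps the learning rate up to max_lr early in the run, then anneals it back down by total_steps. A self-contained sketch of the common linear-warmup/cosine-anneal form (NAPLY's exact curve may differ):

```python
import math

def one_cycle_lr(step, total_steps, max_lr, pct_warmup=0.3, min_lr=0.0):
    """Learning rate at `step`: linear warmup to max_lr, then cosine anneal."""
    warmup_steps = int(total_steps * pct_warmup)
    if step < warmup_steps:
        return max_lr * (step + 1) / warmup_steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + (max_lr - min_lr) * 0.5 * (1 + math.cos(math.pi * progress))

lrs = [one_cycle_lr(s, total_steps=1000, max_lr=1e-3) for s in range(1000)]
```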

Checkpointing

from naply.utils import Checkpoint

ckpt = Checkpoint("checkpoints/")
ckpt.save(model, optimizer, epoch=5, loss=0.5)

# Resume training
metadata = ckpt.load_latest(model, optimizer)

📚 Examples

Train Your First AI

import naply

# Create model
model = naply.Model("small")

# Train on your data
model.train("my_data/", epochs=10, batch_size=32)

# Save model
model.save("my_ai_model/")

# Chat
print(model.chat("Hello!"))

Power AI Training (ChatGPT-Level)

python train_power_ai.py --data "final datasets/" --foundation "foundation_corpus/" --epochs 15

Chat with Power AI

python chat_power_ai.py --model "final datasets/power_ai_model"

🎯 Use Cases

  • Build custom AI models from scratch
  • Train on your own data (any format)
  • Create ChatGPT-like assistants
  • Educational purposes - Learn how AI works
  • Research - Experiment with new architectures
  • Production - Deploy lightweight AI models

📖 Documentation


🤝 Contributing

Contributions are welcome! Please see our Contributing Guide.


📄 License

MIT License - see LICENSE file for details.


🙏 Acknowledgments

Built with ❤️ for the AI community. Inspired by PyTorch, Transformers, and the open-source spirit.


⭐ Star Us!

If you find NAPLY useful, please star us on GitHub!

# Install and try it now!
pip install naply

Build the future of AI. One line at a time. 🚀
