A microscope for informed training of multi-layer perceptrons: diagnose training issues at a granular level, accelerate learning, and speed up prototyping.
Project description
A microscope for neural networks - Comprehensive framework for building, training, and diagnosing multi-layer perceptrons with advanced monitoring and visualization capabilities.
Features
Modern MLP Implementation
- Flexible Architecture: Arbitrary layer sizes with customizable activations
- Advanced Optimizers: 5 production-ready optimizers validated against the literature
- SGD: Classic stochastic gradient descent (Robbins & Monro, 1951)
- SGD+Momentum: Polyak momentum for accelerated convergence (Polyak, 1964)
- SGD+Nesterov: Lookahead momentum for superior convergence (Nesterov, 1983)
- RMSprop: Adaptive learning rates (Hinton, 2012) with optional Nesterov
- Adam: Default choice with bias-corrected moments (Kingma & Ba, 2014)
- Smart Initialization: He, Xavier, SELU, and intelligent auto-selection
- Regularization: L2 regularization, dropout with multiple variants
- Model Persistence: Complete save/load system with optimizer state preservation
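As background for the optimizer list above, here is a plain-NumPy sketch of the Adam update with bias-corrected moments (Kingma &amp; Ba, 2014). It illustrates the algorithm in general form; it is not NeuroScope's internal implementation, and the helper name is illustrative:

```python
import numpy as np

def adam_step(w, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update. w: parameters, g: gradient, t: 1-based step count."""
    m = b1 * m + (1 - b1) * g        # first moment (EMA of gradients)
    v = b2 * v + (1 - b2) * g ** 2   # second moment (EMA of squared gradients)
    m_hat = m / (1 - b1 ** t)        # bias correction: EMAs start at zero,
    v_hat = v / (1 - b2 ** t)        # so early estimates are scaled up
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([1.0, -2.0])
m = np.zeros_like(w)
v = np.zeros_like(w)
for t in range(1, 2001):
    w, m, v = adam_step(w, 2 * w, m, v, t, lr=0.05)
```

The bias correction matters mainly in early steps: without it, the zero-initialized moment estimates shrink the effective step size during warm-up.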
High-Performance Training
- Ultra-Fast Training: fit_fast() method with ≈5-10× speedup over standard training
- Memory Efficient: 60-80% memory reduction with optimized batch processing
- Flexible Performance: choose between speed (fit_fast()) and full diagnostics (fit())
Comprehensive Diagnostics
- Pre-Training Analysis: Architecture validation, weight initialization checks
- Real-Time Monitoring: Dead neuron detection, gradient flow analysis, and 8 other monitors
- Post-Training Evaluation: Robustness testing, performance profiling
- Research-Validated Metrics: Based on established deep learning principles
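To make the monitoring idea concrete, here is a minimal NumPy sketch of what a dead-neuron check measures for ReLU layers. The function name and signature are hypothetical, not NeuroScope's API:

```python
import numpy as np

def dead_relu_fraction(pre_activations):
    """Fraction of ReLU units that never fire over a batch.

    pre_activations: (batch, units) array of values fed into ReLU.
    A unit is 'dead' if its pre-activation is <= 0 for every sample,
    so both its output and its gradient are zero everywhere.
    """
    fired = (pre_activations > 0).any(axis=0)
    return 1.0 - fired.mean()

rng = np.random.default_rng(0)
z = rng.normal(size=(256, 64))
z[:, :8] = -np.abs(z[:, :8])       # force the first 8 units to be dead
print(dead_relu_fraction(z))       # 8/64 = 0.125
```

A persistently high dead fraction during training is a classic symptom of a too-large learning rate or poor initialization.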
High Quality Visualization
- Training Dynamics: Learning curves, loss landscapes, convergence analysis
- Network Internals: Activation distributions, gradient flows, weight evolution
- Diagnostic Plots: Health indicators, training stability metrics
- Interactive Animations: Training progress visualization
Developer Experience
- Clean API: Intuitive interface with sensible defaults
- Type Safety: Full type hints and runtime validation
- Comprehensive Testing: 60%+ test coverage with property-based testing
- Production Ready: Extensive documentation, CI/CD, and quality assurance
Requirements
- Python: 3.11+ (3.12 recommended)
- Core Dependencies: NumPy 2.3+, Matplotlib 3.10+
- Optional: Jupyter for interactive examples
Quick Start
Installation
# Install from PyPI (recommended)
pip install neuroscope
# Install from source (development)
git clone https://github.com/ahmadrazacdx/neuro-scope.git
cd neuro-scope
pip install -e .
Fast Training
import numpy as np
from neuroscope import MLP
# Create and configure model
model = MLP([784, 128, 64, 10],
hidden_activation="relu",
out_activation="softmax")
# Choose your optimizer: "adam", "sgd", "sgdm", "sgdnm", "rmsprop"
model.compile(optimizer="adam", lr=1e-3)
# Ultra-fast training - ~5-10× speedup!
history = model.fit_fast(
X_train, y_train, X_val, y_val,
epochs=100,
batch_size=256,
eval_freq=5
)
# Save trained model
model.save("my_model.ns", save_optimizer=True)
# Load and use later
loaded_model = MLP.load("my_model.ns", load_optimizer=True)
predictions = loaded_model.predict(X_test)
Full Diagnostic Training
from neuroscope import MLP, PreTrainingAnalyzer, TrainingMonitor, Visualizer
# Create model
model = MLP([784, 128, 64, 10])
model.compile(optimizer="adam", lr=1e-3)
# Pre-training analysis
analyzer = PreTrainingAnalyzer(model)
pre_results = analyzer.analyze(X_train, y_train)
# Train with comprehensive monitoring
monitor = TrainingMonitor()
history = model.fit(X_train, y_train, X_val, y_val,
epochs=100, monitor=monitor)
# Visualize results
viz = Visualizer(history)
viz.plot_learning_curves()
viz.plot_activation_hist()
Direct Function Access
from neuroscope import mse, accuracy_binary, relu, he_init
# Use functions directly without class instantiation
loss = mse(y_true, y_pred)
acc = accuracy_binary(y_true, y_pred)
activated = relu(z)
weights, biases = he_init([784, 128, 10])
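The functions above map onto simple NumPy operations. As a rough sketch of what they compute (signatures and shape conventions here are illustrative, not necessarily NeuroScope's exact API):

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error over all elements."""
    return np.mean((y_true - y_pred) ** 2)

def relu(z):
    """Elementwise max(0, z)."""
    return np.maximum(0, z)

def he_init(layer_sizes, rng=None):
    """He-normal weights (std = sqrt(2 / fan_in)) and zero biases per layer."""
    if rng is None:
        rng = np.random.default_rng(0)
    weights, biases = [], []
    for fan_in, fan_out in zip(layer_sizes[:-1], layer_sizes[1:]):
        weights.append(rng.normal(0.0, np.sqrt(2.0 / fan_in), (fan_in, fan_out)))
        biases.append(np.zeros(fan_out))
    return weights, biases
```

He initialization scales the weight variance to the fan-in so that activation magnitudes stay roughly constant through ReLU layers (He et al., 2015).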
Documentation
- Full Documentation: Complete API reference and guides
- Quickstart Guide: Get up and running in minutes
- API Reference: Detailed function and class documentation
- Examples: Jupyter notebooks and scripts
Use Cases
Educational
- Learning Deep Learning: Understand neural network internals with detailed diagnostics
- Research Projects: Rapid prototyping with comprehensive analysis tools
- Teaching: Visual demonstrations of training dynamics and common issues
Research & Development
- Algorithm Development: Test new optimization techniques and architectures
- Proof of Concepts: Quick validation of neural network approaches
- Debugging: Identify and resolve training issues with diagnostic tools
Comparison with Other Frameworks
| Feature | NeuroScope | PyTorch | TensorFlow | Scikit-learn |
|---|---|---|---|---|
| Training Speed | Fast (fit_fast()) | Fast | Fast | Moderate |
| Learning Focus | Educational + Production | Production | Production | Traditional ML |
| Built-in Diagnostics | Rich | Manual | Manual | Limited |
| Visualization | High Quality | Manual | Manual | Basic |
| Ease of Use | Intuitive | Complex | Complex | Simple |
| MLP Focus | Specialized | General | General | Limited |
Contributing
We welcome contributions! Please see our Contributing Guide for details.
Development Setup
# Clone the repository
git clone https://github.com/ahmadrazacdx/neuro-scope.git
cd neuro-scope
# Set up development environment
make dev
# Run tests
make test
# Build documentation
make docs
License
Distributed under the terms of the Apache 2.0 license, NeuroScope is free and open source software.
Issues & Support
If you encounter any problems:
- File an Issue: Bug reports and feature requests
- Discussions: Questions and community support
- Documentation: Comprehensive guides and API reference
Acknowledgments
We extend our sincere gratitude to the following individuals and works that have profoundly influenced the development of NeuroScope:
Foundational Inspirations
- Geoffrey Hinton - The Godfather of Deep Learning, whose groundbreaking work on neural networks, backpropagation, and deep learning architectures laid the foundation for modern AI.
- Andrej Karpathy (@karpathy) - His philosophy of building neural networks from scratch and granular mastery has been instrumental in shaping NeuroScope's educational approach and commitment to algorithmic transparency.
- Jeremy Howard (@jph00) - His work has inspired NeuroScope's philosophy of combining educational clarity, adherence to the literature, compliance with best practices, and ease of use.
- Deep Learning (MIT Press, 2016) (Goodfellow et al.) - This seminal work provided the theoretical foundation and mathematical rigor that underlie NeuroScope's diagnostic capabilities and research-validated implementations.
Technical Contributions
- Muhammad Talha (@mtalhacdx) - For the elegant logo design and visual identity that captures NeuroScope's beauty of simplicity.
- GitHub Copilot (Claude Sonnet 4) - For invaluable assistance in documentation generation, comprehensive test suite development, workflow optimization, and guidance throughout the development process.
Research Community
Special recognition to the neural network research community whose decades of theoretical advances and empirical insights have made frameworks like NeuroScope possible. We stand on the shoulders of giants in machine learning, optimization theory, and computational neuroscience.
"NeuroScope is built with modern Python best practices and inspired by the educational philosophy of making neural networks transparent, understandable, and accessible to learners and researchers worldwide."
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file neuroscope-0.2.2.tar.gz.
File metadata
- Download URL: neuroscope-0.2.2.tar.gz
- Upload date:
- Size: 73.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 4f821c9f243c56cfe159606f43fcaf5f5886d50944bd5cd517f788d0c83fdcb3 |
| MD5 | d64e366a46848cf8f0b901af8ac23735 |
| BLAKE2b-256 | cc4470a658e33fe14c0c66e3c314774e714d3fe4e74264fe80d6d2a5ee89fd1b |
File details
Details for the file neuroscope-0.2.2-py3-none-any.whl.
File metadata
- Download URL: neuroscope-0.2.2-py3-none-any.whl
- Upload date:
- Size: 77.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | c8da0a4c37b20558be735fea421ade8ddae02a34ec454f6a2328c518f7ae105c |
| MD5 | e7e6a350a2d466b9878747bccbeabfee |
| BLAKE2b-256 | ec3fa2bbe41412af5aacf6bb38fb62749fe4d6d0e8766e66fa0730538d95339c |