AI/ML/DL/NLP productivity library for minimal-code machine learning workflows

NeuroLite 🧠⚡

PyPI version Python 3.8+ License: MIT Coverage

NeuroLite is an AI/ML/DL/NLP productivity library for building, training, and deploying machine learning models with minimal code. It turns complex ML workflows into simple, intuitive operations.

🚀 Why NeuroLite?

  • 🎯 Minimal Code: Train state-of-the-art models in fewer than 10 lines of code
  • 🤖 Auto-Everything: Automatic data processing, model selection, and hyperparameter tuning
  • 🌍 Multi-Domain: Unified interface for Computer Vision, NLP, and Traditional ML
  • ⚡ Production Ready: One-click deployment to production environments
  • 🔧 Extensible: Plugin system for custom models and workflows
  • 📊 Rich Visualization: Built-in dashboards and reporting tools

📦 Installation

Quick Install

pip install neurolite

Development Install

git clone https://github.com/dot-css/neurolite.git
cd neurolite
pip install -e ".[dev]"

Optional Dependencies

# For TensorFlow support
pip install neurolite[tensorflow]

# For XGBoost support  
pip install neurolite[xgboost]

# Install everything
pip install neurolite[all]

🎯 Quick Start

Image Classification in 3 Lines

from neurolite import train

# Train a computer vision model
model = train(data="path/to/images", task="image_classification")
predictions = model.predict("path/to/new/image.jpg")

Text Classification

from neurolite import train

# Train an NLP model
model = train(data="reviews.csv", task="sentiment_analysis", target="sentiment")
result = model.predict("This product is amazing!")

Tabular Data Prediction

from neurolite import train

# Train on structured data
model = train(data="sales.csv", task="regression", target="revenue")
forecast = model.predict({"feature1": 100, "feature2": "category_a"})

One-Click Deployment

from neurolite import deploy

# Deploy your model instantly
endpoint = deploy(model, platform="cloud", auto_scale=True)
print(f"Model deployed at: {endpoint.url}")

🌟 Key Features

🤖 Automatic Intelligence

  • Auto Data Processing: Handles missing values, encoding, scaling automatically
  • Auto Model Selection: Chooses the best model architecture for your data
  • Auto Hyperparameter Tuning: Optimizes model parameters using advanced algorithms
  • Auto Feature Engineering: Creates and selects relevant features
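
To make these steps concrete, here is a plain scikit-learn sketch of the kind of preprocessing-plus-model-selection pipeline NeuroLite automates. The data, column names, and candidate models are all illustrative, not NeuroLite internals:

```python
# Illustrative sketch of auto preprocessing + model selection,
# using plain scikit-learn rather than NeuroLite's APIs.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Toy data with missing values in both a numeric and a categorical column
df = pd.DataFrame({
    "age": [25, 32, None, 41, 38, 29, 55, 47],
    "city": ["NY", "SF", "NY", "LA", np.nan, "SF", "LA", "NY"],
    "label": [0, 1, 0, 1, 1, 0, 1, 1],
})
X, y = df.drop(columns="label"), df["label"]

# Auto data processing: impute missing values, scale numerics, encode categoricals
preprocess = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer()),
                      ("scale", StandardScaler())]), ["age"]),
    ("cat", Pipeline([("impute", SimpleImputer(strategy="most_frequent")),
                      ("encode", OneHotEncoder(handle_unknown="ignore"))]), ["city"]),
])

# Auto model selection: keep the candidate with the best cross-validated score
candidates = {
    "logreg": LogisticRegression(),
    "forest": RandomForestClassifier(random_state=0),
}
scores = {
    name: cross_val_score(Pipeline([("prep", preprocess), ("model", est)]),
                          X, y, cv=2).mean()
    for name, est in candidates.items()
}
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))
```

NeuroLite's `train()` hides all of this behind a single call; the sketch only shows what "auto" has to cover.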

🎨 Multi-Domain Support

Computer Vision

# Image classification, object detection, segmentation
model = train(data="images/", task="object_detection")
results = model.predict("test_image.jpg")

Natural Language Processing

# Text classification, sentiment analysis, translation
model = train(data="texts.csv", task="text_generation")
generated = model.predict("Once upon a time")

Traditional ML

# Regression, classification, clustering
model = train(data="tabular.csv", task="classification")
predictions = model.predict(new_data)  # new_data: a DataFrame or dict of feature values

🚀 Production Deployment

from neurolite import deploy

# Deploy to various platforms
deploy(model, platform="aws")        # AWS Lambda/SageMaker
deploy(model, platform="gcp")        # Google Cloud
deploy(model, platform="azure")      # Azure ML
deploy(model, platform="docker")     # Docker container
deploy(model, platform="kubernetes") # Kubernetes cluster

📊 Advanced Features

Hyperparameter Optimization

from neurolite import train

model = train(
    data="data.csv",
    task="classification",
    optimization="bayesian",  # bayesian, grid, random
    trials=100,
    timeout=3600  # 1 hour
)
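
The `optimization` strategies correspond to standard search techniques. For comparison, a random search over an illustrative parameter space looks like this in plain scikit-learn (not NeuroLite's internals):

```python
# Random hyperparameter search with scikit-learn, comparable to
# optimization="random" above; the parameter space is illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = make_classification(n_samples=200, random_state=0)
search = RandomizedSearchCV(
    RandomForestClassifier(random_state=0),
    param_distributions={
        "n_estimators": [50, 100, 200],
        "max_depth": [None, 5, 10],
    },
    n_iter=5,        # number of sampled configurations ("trials")
    cv=3,
    random_state=0,
)
search.fit(X, y)
print(search.best_params_)
```

Bayesian optimization follows the same trial loop but chooses each next configuration from a model of past scores instead of sampling uniformly, which is why it usually needs fewer trials.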

Model Ensembles

from neurolite import train

# Automatic ensemble creation
model = train(
    data="data.csv",
    task="regression",
    ensemble=True,
    ensemble_size=5
)

Custom Workflows

from neurolite.workflows import create_workflow

# Define custom ML pipeline
workflow = create_workflow([
    "data_cleaning",
    "feature_engineering", 
    "model_training",
    "evaluation",
    "deployment"
])

result = workflow.run(data="data.csv")
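
Conceptually, a workflow like this is a sequence of named stages, each consuming the previous stage's output. The sketch below shows that pattern with toy stages; it is illustrative only, not NeuroLite's actual workflow engine:

```python
# Minimal sketch of a named-stage workflow runner (illustrative only).
# Each stage is a function that takes and returns a state dict.
STAGES = {
    "data_cleaning": lambda state: {**state, "cleaned": True},
    "feature_engineering": lambda state: {**state, "features": ["f1", "f2"]},
    "model_training": lambda state: {**state, "model": "trained"},
}

class Workflow:
    def __init__(self, stage_names):
        unknown = [s for s in stage_names if s not in STAGES]
        if unknown:
            raise ValueError(f"unknown stages: {unknown}")
        self.stages = [STAGES[s] for s in stage_names]

    def run(self, state):
        # Pipe the state through each stage in order
        for stage in self.stages:
            state = stage(state)
        return state

workflow = Workflow(["data_cleaning", "feature_engineering", "model_training"])
result = workflow.run({"source": "data.csv"})
print(result["model"])
```

Validating stage names up front, as in the constructor above, means a typo fails immediately rather than halfway through a long run.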

Real-time Monitoring

from neurolite import monitor

# Monitor deployed models
monitor.track(model, metrics=["accuracy", "latency", "drift"])
dashboard = monitor.dashboard(model)

🔧 Configuration

Global Settings

import neurolite

# Configure global settings
neurolite.config.set_device("gpu")  # cpu, gpu, auto
neurolite.config.set_cache_dir("./cache")
neurolite.config.set_log_level("INFO")

Model-Specific Configuration

model = train(
    data="data.csv",
    task="classification",
    config={
        "model_type": "neural_network",
        "epochs": 100,
        "batch_size": 32,
        "learning_rate": 0.001,
        "early_stopping": True
    }
)

📈 Performance Benchmarks

| Task | Dataset | NeuroLite | Traditional Approach | Time Saved |
| --- | --- | --- | --- | --- |
| Image Classification | CIFAR-10 | 3 lines | 200+ lines | 98.5% |
| Sentiment Analysis | IMDB | 2 lines | 150+ lines | 98.7% |
| Sales Forecasting | Custom | 4 lines | 300+ lines | 98.7% |

🛠️ Supported Models

Computer Vision

  • Classification: ResNet, EfficientNet, Vision Transformer
  • Object Detection: YOLO, Faster R-CNN, SSD
  • Segmentation: U-Net, DeepLab, FCN

Natural Language Processing

  • Text Classification: BERT, RoBERTa, DistilBERT
  • Text Generation: GPT-2, T5, BART
  • Translation: MarianMT, T5
  • Question Answering: BERT, RoBERTa

Traditional ML

  • Classification: Random Forest, XGBoost, SVM, Logistic Regression
  • Regression: Linear Regression, Random Forest, Gradient Boosting
  • Clustering: K-Means, DBSCAN, Hierarchical
  • Ensemble: Voting, Stacking, Bagging
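
The ensemble strategies above map directly onto scikit-learn estimators. For example, a soft-voting ensemble (the kind of model an `ensemble=True` run can build automatically) can be sketched as:

```python
# Soft-voting ensemble sketch with scikit-learn; models and data
# are illustrative, not what NeuroLite selects.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, random_state=0)
ensemble = VotingClassifier(
    estimators=[
        ("logreg", LogisticRegression(max_iter=1000)),
        ("tree", DecisionTreeClassifier(random_state=0)),
        ("forest", RandomForestClassifier(random_state=0)),
    ],
    voting="soft",  # average each member's predicted probabilities
)
ensemble.fit(X, y)
print(ensemble.score(X, y))
```

Stacking differs in that a meta-model is trained on the members' predictions instead of averaging them; bagging trains the same model on bootstrap resamples of the data.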

🔌 Plugin System

Extend NeuroLite with custom models and workflows:

from neurolite.plugins import register_model

@register_model("my_custom_model")
class CustomModel:
    def train(self, data):
        # Custom training logic
        pass
    
    def predict(self, data):
        # Custom prediction logic
        pass

# Use your custom model
model = train(data="data.csv", model="my_custom_model")
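
A decorator-based registry like `register_model` is commonly implemented with a module-level dictionary. This is a generic sketch of that pattern, not NeuroLite's actual source:

```python
# Generic decorator-based model registry, illustrating the plugin
# pattern above; not NeuroLite's actual implementation.
MODEL_REGISTRY = {}

def register_model(name):
    def decorator(cls):
        MODEL_REGISTRY[name] = cls  # map the plugin name to the class
        return cls                  # leave the class itself unchanged
    return decorator

@register_model("my_custom_model")
class CustomModel:
    def train(self, data):
        self.data = data
        return self

    def predict(self, data):
        return f"prediction for {data}"

# Lookup by name, as train(model="my_custom_model") would do internally
model = MODEL_REGISTRY["my_custom_model"]().train("data.csv")
print(model.predict("row"))
```

Because the decorator returns the class unchanged, registered models can still be imported and used directly.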

📚 Documentation

🤝 Contributing

We welcome contributions! Please see our Contributing Guide for details.

Development Setup

git clone https://github.com/dot-css/neurolite.git
cd neurolite
pip install -e ".[dev]"
pre-commit install

Running Tests

pytest tests/ -v

Code Quality

black neurolite/ tests/
flake8 neurolite/ tests/
mypy neurolite/

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

  • Built with ❤️ by the NeuroLite Team
  • Powered by PyTorch, Transformers, Scikit-learn, and other amazing open-source libraries
  • Special thanks to our contributors and the ML community

📞 Support


Made with ❤️ for the AI/ML community


Download files

Download the file for your platform.

Source Distribution

neurolite-0.2.0.tar.gz (349.4 kB)

Uploaded Source

Built Distribution

neurolite-0.2.0-py3-none-any.whl (232.3 kB)

Uploaded Python 3

File details

Details for the file neurolite-0.2.0.tar.gz.

File metadata

  • Download URL: neurolite-0.2.0.tar.gz
  • Upload date:
  • Size: 349.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.4

File hashes

Hashes for neurolite-0.2.0.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 68e8672faf999bdaf996601eacb6d95cfd9e9bbf5ed39fa9d1a4c0c0bd4197c8 |
| MD5 | 0beb4f652f398fb3089c961d74b5f927 |
| BLAKE2b-256 | 73459ba7df89b21c9543d6ae462a288ad54e17eadc4feddb51dabc6eb19bcb0e |


File details

Details for the file neurolite-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: neurolite-0.2.0-py3-none-any.whl
  • Upload date:
  • Size: 232.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.4

File hashes

Hashes for neurolite-0.2.0-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 0ea6e1836ba312f9f36ae7da96f0fa9631bb5a441562031061bc803988d342ae |
| MD5 | 3b5500f86325e1c7962ef2f32d298338 |
| BLAKE2b-256 | 4cdb20c489ade51772c2bc87658acfcb39b0dcc2fb66493d7d904163def268f3 |

