NUAIM - Black-Box Generator for Meta-Variable Optimization
NUAIM is a Python package designed for Black-Box Optimization (BBO). It acts as a black-box generator for meta-variable (or hierarchical variable) optimization, a feature that is rarely available in black-box generators. With NUAIM, you can explore and optimize complex neural network architectures and hyperparameters in a structured and extensible way.
✨ Features
- 🧠 Black-Box Generator: Generate and evaluate neural network architectures as black-box functions for optimization.
- 📊 Meta-Variable Optimization: Supports hierarchical variables, enabling optimization of both architecture and hyperparameters.
- 🔧 Flexible Hyperparameter Management: Easily customize optimizers, learning rates, and other parameters.
- 📈 Detailed Performance Metrics: Logs training time, forward pass time, and accuracy for each model.
- 🎯 Preset Configurations: Ready-to-use presets for MLPs, CNNs, Transformers, and hybrid architectures.
- 🔄 Modular Design: Clean and extensible architecture for research and experimentation.
- 📝 Rich Logging: Save results in table and JSON formats for easy analysis.
- 🚀 Easy Integration: Simple Python API for seamless integration into your workflows.
🚀 Quick Start
Installation
Install NUAIM using pip:
pip install nuaim
Basic Usage
What is Possible with sample_instance
The sample_instance method allows you to:
- Train multiple models with different configurations
- Log results in various formats (table, JSON, or both)
- Track training and forward pass times
Optional parameters for sample_instance:
- dataset_frac: Fraction of the dataset to use (default: 1)
- num_samples: Number of models to sample (default: 5)
- max_epochs: Maximum number of training epochs (default: 1000)
- filename: Name of the file to save results (default: None)
- log_dir: Directory to save logs (default: "logs")
- verbose: Whether to print progress (default: True)
- output_format: Format of the output ("table", "json", or "both"; default: "both")
- track_train_time: Track training time (default: True)
- track_forward_time: Track forward pass time (default: True)
- custom_params: Custom hyperparameters (default: None)
Simple Implementation of sample_instance
from nuaim import ModelFactory
# Create a model factory with a preset configuration
factory = ModelFactory("BasicMLP")
# Train multiple models and get results
results = factory.sample_instance(
    num_samples=10,
    max_epochs=20,
    filename="experiment_results",
    log_dir="./logs",
    output_format="both",  # "table", "json", or "both"
    track_train_time=True,
    track_forward_time=True
)
print(f"Best accuracy: {results['best_accuracy']:.4f}")
print(f"Average accuracy: {results['average_accuracy']:.4f}")
What is Possible with factory.train
The factory.train method allows you to:
- Train a single model with specific hyperparameters
- Customize training parameters like epochs and verbosity
- Append results to a log file (TBD)
Optional parameters for train:
- hyper_params: Dictionary of hyperparameters for the model (default: None)
- max_epochs: Maximum number of training epochs (default: 1000)
- verbose: Whether to print progress (default: True)
Simple Implementation of factory.train
# Generate random hyperparameters for a model
hyperparams = factory.generate_random_model_dict()
# Train the model and get accuracy
accuracy = factory.train(
    hyper_params=hyperparams,
    max_epochs=10,
    verbose=True
)
print(f"Model accuracy: {accuracy:.4f}")
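Since train returns a single score for a sampled configuration, it has exactly the shape a black-box optimizer needs. Below is a minimal random-search sketch of that loop. The sampler and objective here are hypothetical stand-ins so the snippet runs on its own; in NUAIM, the sampler would wrap factory.generate_random_model_dict() and the objective would call factory.train on the sampled hyperparameters.

```python
import random

# Hypothetical stand-in objective: in practice this would be
# factory.train(hyper_params=..., max_epochs=...) returning an accuracy.
def black_box_objective(hyper_params):
    # Toy surrogate score that peaks at learning_rate == 0.01 (illustration only).
    return 1.0 - abs(hyper_params["learning_rate"] - 0.01)

def random_search(sample_config, objective, num_trials=20, seed=0):
    """Generic random-search loop over black-box configurations."""
    rng = random.Random(seed)
    best_score, best_config = float("-inf"), None
    for _ in range(num_trials):
        config = sample_config(rng)     # draw a candidate configuration
        score = objective(config)       # evaluate it as a black box
        if score > best_score:
            best_score, best_config = score, config
    return best_score, best_config

# Hypothetical sampler: in NUAIM this would be factory.generate_random_model_dict().
best_score, best_config = random_search(
    lambda rng: {"learning_rate": rng.choice([0.1, 0.01, 0.001])},
    black_box_objective,
)
print(best_score, best_config)
```

Any more sophisticated BBO method (Bayesian optimization, evolutionary search) can replace the random loop without changing how the objective is called.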
📖 Available Presets
NUAIM includes several built-in preset configurations for different levels of complexity:
Basic Presets
- BasicMLP: Simple multi-layer perceptron configurations
- BasicCNN: Basic convolutional neural networks
- BasicTransformer: Simple transformer architectures
Advanced Presets
- AdvancedMLP: Complex MLP with advanced optimization
- AdvancedCNN: Advanced CNN architectures
- AdvancedTransformer: Multi-head attention mechanisms
Hard Presets
- HardMLP: Complex MLP with extensive hyperparameter spaces
- HardCNN-MLP: Hybrid CNN-MLP architectures
- HardTransformer: Advanced transformer configurations
Complete Preset
- Complete: Comprehensive search space with all layer types.
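Because each preset is just a name passed to ModelFactory, benchmarking presets reduces to running the same evaluation over a list of names and ranking the results. The sketch below shows that pattern with a hypothetical stand-in for sample_instance, returning a dict with the "best_accuracy" and "average_accuracy" keys shown in the Quick Start; the accuracy numbers are made up for illustration.

```python
# Hypothetical stand-in for ModelFactory(preset).sample_instance(...):
# returns a results dict with the keys shown in the Quick Start.
def fake_sample_instance(preset):
    scores = {"BasicMLP": 0.91, "BasicCNN": 0.94, "BasicTransformer": 0.89}
    acc = scores[preset]
    return {"best_accuracy": acc, "average_accuracy": acc - 0.03}

def rank_presets(presets, run):
    """Evaluate each preset and sort by best accuracy, highest first."""
    results = {p: run(p) for p in presets}
    return sorted(results.items(),
                  key=lambda kv: kv[1]["best_accuracy"],
                  reverse=True)

ranking = rank_presets(["BasicMLP", "BasicCNN", "BasicTransformer"],
                       fake_sample_instance)
print(ranking[0][0])  # name of the preset with the highest best accuracy
```

Swapping fake_sample_instance for a closure over the real factory call turns this into a small benchmarking harness across presets.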
🔬 Research Applications
NUAIM is ideal for:
- Black-Box Optimization (BBO): Treat neural network architectures as black-box functions for optimization.
- Meta-Variable Optimization: Optimize hierarchical variables like architecture and hyperparameters.
- Neural Architecture Search (NAS): Automate the discovery of optimal architectures.
- Hyperparameter Tuning: Systematically explore hyperparameter spaces.
- Model Benchmarking: Compare different architectures and configurations.
🤝 Contributing
We welcome contributions! Please feel free to:
- Fork the repository
- Create a feature branch
- Make your changes
- Submit a pull request
📄 License
This project is licensed under the MIT License.
Made with ❤️ for the research community
File details
Details for the file nuaim-1.0.0.tar.gz.
File metadata
- Download URL: nuaim-1.0.0.tar.gz
- Upload date:
- Size: 23.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.9.21
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 50da3434acb632ece365ca292a6cb56e90e2600465580b60a4ff1115d138b132 |
| MD5 | 932d5709c12fef4ee3e0e58025fefdc9 |
| BLAKE2b-256 | cbe98b6445bcc32ecbcaa1ae4359f97bca32045064fdf89a3bd1a332bc928957 |
File details
Details for the file nuaim-1.0.0-py3-none-any.whl.
File metadata
- Download URL: nuaim-1.0.0-py3-none-any.whl
- Upload date:
- Size: 25.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.9.21
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | fac59c8348042b738cb3631764e42c3d1a444ff8258409e4ffecc532c72ec8c9 |
| MD5 | 310243c5e5d66a339d120e8c66f91f85 |
| BLAKE2b-256 | 9a3b02d130a9ec838a9d6f6a5089599fe58cce27af3b694ac7063106ec3e1d85 |