# Neural Pipeline Search (NePS)

Neural Pipeline Search helps deep learning experts find the best neural pipeline.
Welcome to NePS, a powerful and flexible Python library for hyperparameter optimization (HPO) and neural architecture search (NAS). Its primary goal is to make HPO adoption practical for deep learning practitioners!
NePS houses recently published as well as well-established algorithms, all of which can run massively parallel on any distributed setup, along with tools to analyze and restart runs.
Take a look at our documentation and follow this README for instructions on how to use NePS!
## Key Features
In addition to the common features offered by traditional HPO and NAS libraries, NePS stands out with the following key features:

- **Hyperparameter Optimization (HPO) with Prior Knowledge:** NePS excels at efficiently tuning hyperparameters using algorithms that let users make use of their prior knowledge about the search space, leveraging insights from recently published work.
- **Neural Architecture Search (NAS) with Context-free Grammar Search Spaces:** NePS is equipped to handle context-free grammar search spaces, providing advanced capabilities for designing and optimizing architectures, again leveraging insights from recently published work.
- **Easy Parallelization and Resumption of Runs:** NePS simplifies parallelizing optimization tasks both on individual computers and in distributed computing environments. It also lets users conveniently resume optimization tasks later, ensuring a seamless and efficient workflow for long-running experiments.
- **Seamless User Code Integration:** NePS's modular design ensures flexibility and extensibility, so it integrates effortlessly into existing machine learning workflows.
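The parallelization model deserves a concrete sketch. As described in the NePS documentation, parallelization works by launching the same optimization script several times: each worker process coordinates with the others through the shared `root_directory`. The sketch below illustrates that launch pattern only; a trivial inline program stands in for your actual script calling `neps.run` (here hypothetically named `your_hpo_script.py`):

```python
import subprocess
import sys

# Each NePS worker is just another launch of the same script; workers
# coordinate through the shared root_directory, so no extra setup is needed.
# The inline "-c" program below is a placeholder for: python your_hpo_script.py
worker_cmd = [sys.executable, "-c", "print('worker finished')"]

# Launch three workers concurrently, then collect their output.
procs = [
    subprocess.Popen(worker_cmd, stdout=subprocess.PIPE, text=True)
    for _ in range(3)
]
outputs = [p.communicate()[0].strip() for p in procs]
print(outputs)
```

Resuming works the same way: starting the script again with the same `root_directory` continues from the stored state.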
## Getting Started

### 1. Installation
NePS requires Python 3.8 or higher. You can install it via pip or from source.
Using pip:

```bash
pip install neural-pipeline-search
```
> **Note:** As indicated by the `v0.x.x` version number, NePS is early-stage code and its APIs might change in the future.
To install from source, clone the repository and run:

```bash
git clone git@github.com:automl/neps.git
cd neps
poetry install
```
### 2. Basic Usage
Using `neps` always follows the same pattern:

1. Define a `run_pipeline` function capable of evaluating different architectural and/or hyperparameter configurations for your problem.
2. Define a search space named `pipeline_space` of those parameters, e.g. via a dictionary.
3. Call `neps.run` to optimize `run_pipeline` over `pipeline_space`.
In code, the usage pattern can look like this:

```python
import logging

import neps


# 1. Define a function that accepts hyperparameters and computes the validation error
def run_pipeline(
    hyperparameter_a: float, hyperparameter_b: int, architecture_parameter: str
) -> dict:
    # Create your model
    model = MyModel(architecture_parameter)

    # Train and evaluate the model with your training pipeline
    validation_error, training_error = train_and_eval(
        model, hyperparameter_a, hyperparameter_b
    )

    return {  # dict or float (validation error)
        "loss": validation_error,
        "info_dict": {
            "training_error": training_error
            # + other metrics
        },
    }


# 2. Define a search space of parameters; use the same names for the parameters as in run_pipeline
pipeline_space = dict(
    hyperparameter_b=neps.IntegerParameter(
        lower=1, upper=42, is_fidelity=True
    ),  # Set 'is_fidelity' to True for a multi-fidelity approach.
    hyperparameter_a=neps.FloatParameter(
        lower=0.001, upper=0.1, log=True
    ),  # If True, the search space is sampled in log space.
    architecture_parameter=neps.CategoricalParameter(
        ["option_a", "option_b", "option_c"]
    ),
)

if __name__ == "__main__":
    # 3. Run the NePS optimization
    logging.basicConfig(level=logging.INFO)
    neps.run(
        run_pipeline=run_pipeline,
        pipeline_space=pipeline_space,
        root_directory="path/to/save/results",  # Replace with the actual path.
        max_evaluations_total=100,
        searcher="hyperband",  # Optional: specifies the search strategy;
        # otherwise NePS decides based on your data.
    )
```
## Examples

Discover how NePS works through these practical examples:

- **Pipeline Space via YAML:** Explore how to define the `pipeline_space` using a YAML file instead of a dictionary.
- **Hyperparameter Optimization (HPO):** Learn the essentials of hyperparameter optimization with NePS.
- **Architecture Search with Primitives:** Dive into architecture search using primitives in NePS.
- **Multi-Fidelity Optimization:** Understand how to leverage multi-fidelity optimization for efficient model tuning.
- **Utilizing Expert Priors for Hyperparameters:** Learn how to incorporate expert priors for more efficient hyperparameter selection.
- **Additional NePS Examples:** Explore more examples, including various use cases and advanced configurations in NePS.
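To build intuition for multi-fidelity optimization, such as the `searcher="hyperband"` option shown earlier: hyperband builds on successive halving, which evaluates many configurations at a small budget (e.g. few epochs), keeps the best fraction, and re-evaluates the survivors at a larger budget. A toy sketch of that idea, independent of the NePS API (the objective and all names here are illustrative assumptions):

```python
import random


def successive_halving(configs, evaluate, min_budget=1, max_budget=27, eta=3):
    """Keep the best 1/eta of configs each round while multiplying the budget by eta."""
    budget = min_budget
    while budget <= max_budget and len(configs) > 1:
        # Lower is better, mirroring the "loss" returned by run_pipeline.
        ranked = sorted(configs, key=lambda c: evaluate(c, budget))
        configs = ranked[: max(1, len(ranked) // eta)]
        budget *= eta
    return configs[0]


# Toy objective: the true loss is |x - 0.3|; evaluation noise shrinks as the
# budget grows, as it would with longer training.
def evaluate(x, budget):
    noise = random.Random(hash((x, budget))).uniform(0, 1.0 / budget)
    return abs(x - 0.3) + noise

rng = random.Random(42)
candidates = [rng.uniform(0, 1) for _ in range(27)]
best = successive_halving(candidates, evaluate)
print(f"best configuration: {best:.3f}")
```

The key trade-off is that most of the total budget goes to configurations that already look promising, rather than being spent equally on all candidates.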
## Documentation

For more details and features, please have a look at our documentation.
## Analysing Runs

See our documentation on analysing runs.
## Contributing

Please see the documentation for contributors.
## Citations

Please consider citing us if you use our tool! Refer to our documentation on citations.
## Alternatives

NePS does not cover your use case? Have a look at some alternatives.