
Neural Pipeline Search (NePS)


Welcome to NePS, a powerful and flexible Python library for hyperparameter optimization (HPO) and neural architecture search (NAS), built to make both practical for deep learning practitioners.

NePS houses both recently published and well-established algorithms, all of which can run massively in parallel on distributed setups. Throughout, NePS is tailored to the needs of deep learning experts.

To learn about NePS, check out the documentation, our examples, or a Colab tutorial.

Key Features

In addition to the features offered by traditional HPO and NAS libraries, NePS stands out with:

  1. Hyperparameter Optimization (HPO) Efficient Enough for Deep Learning:
    NePS efficiently tunes hyperparameters with algorithms that let users exploit their prior knowledge, alongside many other efficiency boosters.
  2. Neural Architecture Search (NAS) with Expressive Search Spaces:
    NePS provides capabilities for optimizing DL architectures in an expressive and natural fashion.
  3. Zero-effort Parallelization and an Experience Tailored to DL:
    NePS simplifies parallelizing optimization tasks, both on individual machines and in distributed computing environments. Because NePS is made for deep learners, all technical choices are made with DL in mind, and common DL tools such as TensorBoard are embraced.
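To see why efficiency boosters like priors and log-scaled parameters matter, here is a small, self-contained sketch. It is not NePS code: `sample_lr` is a hypothetical helper that shows how searching a learning rate on a log scale keeps small and large magnitudes equally likely, and how a prior can bias sampling toward a known-good value.

```python
import math
import random

def sample_lr(lower=1e-5, upper=1e-1, prior=1e-3, use_prior=True):
    """Illustrative sketch (not NePS internals): sample a learning rate
    on a log scale, optionally biased toward a user-supplied prior."""
    lo, hi = math.log10(lower), math.log10(upper)
    if use_prior:
        # Draw around the prior in log space, then clip to the bounds.
        x = random.gauss(math.log10(prior), 0.5)
        x = min(max(x, lo), hi)
    else:
        # Uniform in log space: 1e-5..1e-4 is as likely as 1e-2..1e-1.
        x = random.uniform(lo, hi)
    return 10 ** x

random.seed(0)
samples = [sample_lr() for _ in range(5)]
```

With `use_prior=True`, most samples land within an order of magnitude of the prior (here 1e-3), which is exactly the kind of prior knowledge NePS lets you encode for a parameter.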

Installation

To install the latest release from PyPI, run:

pip install neural-pipeline-search

Basic Usage

Using neps always follows the same pattern:

  1. Define an evaluate_pipeline function capable of evaluating different architectural and/or hyperparameter configurations for your problem.
  2. Define a pipeline_space describing those parameters.
  3. Call neps.run(evaluate_pipeline, pipeline_space).

In code, the usage pattern can look like this:

import neps
import logging

logging.basicConfig(level=logging.INFO)

# 1. Define a function that accepts hyperparameters and computes the validation error
def evaluate_pipeline(lr: float, alpha: int, optimizer: str):
    # Create your model (MyModel is a placeholder for your own model class)
    model = MyModel(lr=lr, alpha=alpha, optimizer=optimizer)

    # Train and evaluate the model with your training pipeline
    validation_error = train_and_eval(model)
    return validation_error


# 2. Define a search space of parameters; use the same parameter names as in evaluate_pipeline
class ExampleSpace(neps.PipelineSpace):
    lr = neps.Float(
        lower=1e-5,
        upper=1e-1,
        log=True,   # Log spaces
        prior=1e-3, # Incorporate your knowledge to help optimization
    )
    alpha = neps.Integer(lower=1, upper=42)
    optimizer = neps.Categorical(choices=["sgd", "adam"])

# 3. Run the NePS optimization
neps.run(
    evaluate_pipeline=evaluate_pipeline,
    pipeline_space=ExampleSpace(),
    root_directory="path/to/save/results",  # Replace with the actual path.
    evaluations_to_spend=100,
)
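Conceptually, neps.run drives a loop that repeatedly asks an optimizer for a configuration from the search space and passes it to evaluate_pipeline, tracking the best result. The sketch below illustrates that pattern with plain random search; the toy `evaluate_pipeline` and `random_search` here are illustrative stand-ins, not NePS internals:

```python
import random

# Hypothetical stand-in for the user's training code: a toy "validation
# error" that is lowest near lr=1e-3, small alpha, and the adam optimizer.
def evaluate_pipeline(lr, alpha, optimizer):
    return abs(lr - 1e-3) + 0.01 * alpha + (0.0 if optimizer == "adam" else 0.1)

def random_search(n_evaluations=100, seed=0):
    """Minimal sketch of the propose-evaluate loop behind an HPO run."""
    rng = random.Random(seed)
    best_config, best_error = None, float("inf")
    for _ in range(n_evaluations):
        # Sample one configuration from the same kind of space as above.
        config = {
            "lr": 10 ** rng.uniform(-5, -1),           # log-scale float
            "alpha": rng.randint(1, 42),               # integer
            "optimizer": rng.choice(["sgd", "adam"]),  # categorical
        }
        error = evaluate_pipeline(**config)
        if error < best_error:
            best_config, best_error = config, error
    return best_config, best_error

best_config, best_error = random_search()
```

NePS replaces the naive sampling step with its optimization algorithms (priors, multi-fidelity, and so on) and handles persistence and parallel workers via the root_directory.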

Examples

Discover how NePS works through the examples in our documentation.

Contributing

Please see the documentation for contributors.

Citations

For pointers on citing the NePS package and papers, refer to our documentation on citations.
