Neural Pipeline Search helps deep learning experts find the best neural pipeline.

Project description

Neural Pipeline Search (NePS)

Welcome to NePS, a powerful and flexible Python library for hyperparameter optimization (HPO) and neural architecture search (NAS) that makes HPO and NAS practical for deep learning practitioners.

NePS houses recently published as well as well-established algorithms that can all run massively in parallel on distributed setups. In general, NePS is tailored to the needs of deep learning experts.

To learn about NePS, check out the documentation, our examples, or a colab tutorial.

Key Features

In addition to the features offered by traditional HPO and NAS libraries, NePS stands out with:

  1. Hyperparameter Optimization (HPO) Efficient Enough for Deep Learning:
    NePS efficiently tunes hyperparameters using algorithms that let users incorporate their prior knowledge, combined with other efficiency techniques such as multi-fidelity optimization.
  2. Neural Architecture Search (NAS) with Expressive Search Spaces:
    NePS provides capabilities for optimizing DL architectures in an expressive and natural fashion.
  3. Zero-effort Parallelization and an Experience Tailored to DL:
    NePS simplifies the process of parallelizing optimization tasks both on individual computers and in distributed computing environments. As NePS is made for deep learning, all technical choices are made with DL in mind, and common DL tools such as TensorBoard are embraced.
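As a sketch of what "zero-effort parallelization" can look like in practice (assuming a hypothetical script `optimize.py` that calls `neps.run` with a fixed `root_directory`), launching the same script several times adds workers that coordinate through the shared `root_directory`:

```shell
# optimize.py is a placeholder for your own script that calls neps.run
# with a fixed root_directory. Each invocation becomes one worker; the
# workers synchronize through the files in root_directory, so no extra
# coordination code is needed (illustrative sketch, not verified here).
python optimize.py &
python optimize.py &
python optimize.py &
wait
```

The same pattern extends to distributed setups: submit the same script as multiple jobs that all point at the same `root_directory` on a shared filesystem.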

Installation

To install the latest release from PyPI, run:

pip install neural-pipeline-search

Basic Usage

Using neps always follows the same pattern:

  1. Define an evaluate_pipeline function capable of evaluating different architectural and/or hyperparameter configurations for your problem.
  2. Define a pipeline_space of those parameters.
  3. Call neps.run(evaluate_pipeline, pipeline_space).

In code, the usage pattern can look like this:

import neps
import logging

logging.basicConfig(level=logging.INFO)

# 1. Define a function that accepts hyperparameters and computes the validation error
def evaluate_pipeline(lr: float, alpha: int, optimizer: str):
    # Create your model (MyModel is a placeholder for your own model class)
    model = MyModel(lr=lr, alpha=alpha, optimizer=optimizer)

    # Train and evaluate the model with your own training pipeline
    validation_error = train_and_eval(model)
    return validation_error


# 2. Define a search space of parameters; use the same parameter names as in evaluate_pipeline
class ExampleSpace(neps.PipelineSpace):
    lr = neps.Float(
        lower=1e-5,
        upper=1e-1,
        log=True,   # Search this range on a log scale
        log_base=10, # Logarithm base, by default it's natural log
        prior=1e-3, # Incorporate your knowledge to help optimization
    )
    alpha = neps.Integer(lower=1, upper=42)
    optimizer = neps.Categorical(choices=["sgd", "adam"])

# 3. Run the NePS optimization
neps.run(
    evaluate_pipeline=evaluate_pipeline,
    pipeline_space=ExampleSpace(),
    root_directory="path/to/save/results",  # Replace with the actual path.
    evaluations_to_spend=100,
)

Examples

Discover how NePS works through the examples in our documentation.

Contributing

Please see the documentation for contributors.

Citations

For pointers on citing the NePS package and papers refer to our documentation on citations.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

neural_pipeline_search-0.16.0.tar.gz (217.7 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

neural_pipeline_search-0.16.0-py3-none-any.whl (243.6 kB)

Uploaded Python 3

File details

Details for the file neural_pipeline_search-0.16.0.tar.gz.

File metadata

File hashes

Hashes for neural_pipeline_search-0.16.0.tar.gz
Algorithm Hash digest
SHA256 0581de85593ed56da3eb586efdd7929234f7d4859b8a1cd6485b611d2484dd02
MD5 60e8d428b2aa0142759a47c703be5c6e
BLAKE2b-256 7f9f9d3003644f5576007258d18f3a273a9d49f4dfbf6ad107d4a1727105e53a

See more details on using hashes here.
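To check a downloaded file against the hashes above before installing, here is a small sketch using Python's standard hashlib (the file name is taken from this listing; the `sha256_of_file` helper is illustrative, not part of NePS):

```python
import hashlib


def sha256_of_file(path: str, chunk_size: int = 1 << 16) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large archives don't need to fit in memory.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Usage (expected value is the SHA256 from the table above):
# expected = "0581de85593ed56da3eb586efdd7929234f7d4859b8a1cd6485b611d2484dd02"
# if sha256_of_file("neural_pipeline_search-0.16.0.tar.gz") != expected:
#     raise SystemExit("hash mismatch: do not install this file")
```

Alternatively, pip's hash-checking mode (`--require-hashes` with a requirements file) performs the same verification automatically at install time.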

File details

Details for the file neural_pipeline_search-0.16.0-py3-none-any.whl.

File metadata

File hashes

Hashes for neural_pipeline_search-0.16.0-py3-none-any.whl
Algorithm Hash digest
SHA256 299f6357291c3cdd164ab4b51dff1e285eb88e91c0d2a7e1b107312be3e08e55
MD5 8f9f5df184ad0b9ffe8eb088b49f972a
BLAKE2b-256 edd49e6eee5713e6af0a432c7000911f02af4169625a68af2775d524fc651f43

See more details on using hashes here.
