
MLOS Core Python interface for parameter optimization.

Project description

mlos-core

This directory contains the code for the mlos-core optimizer package.

It's available for installation via pip from the PyPI repository as mlos-core.
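
For example:

    pip install mlos-core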

Description

mlos-core is an optimizer package that wraps libraries such as FLAML and SMAC, using techniques like Bayesian optimization to identify and sample tunable configuration parameters and to propose optimal parameter values through a consistent API: suggest and register.

These proposals can be evaluated by mlos-bench, which generates and tracks experiment results (proposed parameters, benchmark results, and telemetry) to update the optimization loop, or they can be used independently.
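
As a sketch of that loop (assuming the OptimizerFactory and OptimizerType entry points described in the mlos-core documentation; exact signatures can differ between versions, and run_benchmark below is a hypothetical stand-in for a real benchmark, e.g. one driven by mlos-bench):

    # Minimal, illustrative suggest/register loop; signatures may vary by mlos-core version.
    import ConfigSpace as CS
    import pandas as pd

    from mlos_core.optimizers import OptimizerFactory, OptimizerType

    # Declare a toy tunable-parameter search space with ConfigSpace.
    space = CS.ConfigurationSpace(seed=1234)
    space.add_hyperparameter(
        CS.UniformIntegerHyperparameter("kernel.sched_latency_ns", lower=100_000, upper=2_000_000)
    )

    # Pick an optimizer backend (FLAML here; SMAC or RANDOM are other options).
    optimizer = OptimizerFactory.create(
        parameter_space=space,
        optimizer_type=OptimizerType.FLAML,
    )

    def run_benchmark(config: pd.DataFrame) -> pd.Series:
        """Hypothetical benchmark harness: returns a score (to minimize) per suggested config."""
        return (config["kernel.sched_latency_ns"] - 1_000_000).abs()

    for _ in range(10):
        suggestion = optimizer.suggest()        # proposed configuration(s) as a DataFrame
        score = run_benchmark(suggestion)       # evaluate (via mlos-bench in practice)
        optimizer.register(suggestion, score)   # feed the observation back to the optimizer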

Features

Since the tunable parameter search space is often extremely large, mlos-core automates the following steps to efficiently generate optimal task-specific kernel and application configurations.

  1. Reduce the search space by identifying a promising set of tunable parameters
    • Map out the configuration search space: Automatically track and manage the discovery of new Linux kernel parameters and their default values across versions. Filter out non-tunable parameters (e.g., those that are not writable) and track which kernel parameters exist for a given kernel version (a minimal filtering sketch appears after this list).
    • Leverage parameter knowledge for optimization: Information on ranges, sampling intervals, parameter correlations, and workload-type sensitivities for tunable parameters is tracked and currently manually curated. In the future, this could be maintained automatically by scraping documentation pages on kernel parameters.
    • Tailored to application: Consider prior knowledge of the parameter's impact & an application's workload profile (e.g. network heavy, disk heavy, CPU bound, multi-threaded, latency sensitive, throughput oriented, etc.) to identify likely impactful candidates of tunable parameters, specific to a particular application.
  2. Sampling to warm-start optimization in a high dimensional search space
  3. Produce optimal configurations through Bayesian optimization
    • Support for various optimizer algorithms (a default Bayesian optimizer, FLAML, SMAC, and a random optimizer for baseline comparison) that handle multiple types of constraints. This includes cost-aware optimization, which considers experiment costs given the current tunable parameters.
    • When integrated with mlos-bench, proposed configurations are logged and evaluated.
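
The parameter-filtering step above can be pictured with a small, purely illustrative sketch (not part of the mlos-core API): walk /proc/sys and keep only the writable, i.e. tunable, kernel parameters together with their current default values.

    # Illustrative only: enumerate kernel parameters under /proc/sys and keep
    # the writable (tunable) ones along with their current default values.
    import os

    def discover_tunable_sysctls(root: str = "/proc/sys") -> dict[str, str]:
        """Return {sysctl.name: default_value} for writable kernel parameters."""
        tunables: dict[str, str] = {}
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                if not os.access(path, os.W_OK):
                    continue  # skip read-only (non-tunable) parameters
                key = os.path.relpath(path, root).replace(os.sep, ".")
                try:
                    with open(path) as fh:
                        tunables[key] = fh.read().strip()
                except OSError:
                    pass  # some entries are write-only or cannot be read
        return tunables

    if __name__ == "__main__":
        params = discover_tunable_sysctls()
        print(f"found {len(params)} writable kernel parameters")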

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mlos-core-0.3.2.tar.gz (119.0 kB)

Built Distribution

mlos_core-0.3.2-py3-none-any.whl (26.6 kB)

File details

Details for the file mlos-core-0.3.2.tar.gz.

File metadata

  • Download URL: mlos-core-0.3.2.tar.gz
  • Size: 119.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.7

File hashes

Hashes for mlos-core-0.3.2.tar.gz

  • SHA256: 5c38c537b8a08ee9ef851a0f5cc1eb149defefb6c417a00061b5c6faf03d9600
  • MD5: 55b52ac8cfd5b0728093a124aeb74a4e
  • BLAKE2b-256: 0891e894754a8468e6f8bd8dc706f2bfe7354f7b1627667b25ca76d8b8bec7c4
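
For example, the downloaded source distribution can be checked against the SHA256 digest above before installing (a minimal sketch using Python's hashlib):

    # Verify the sdist's SHA256 digest against the value published above.
    import hashlib

    expected = "5c38c537b8a08ee9ef851a0f5cc1eb149defefb6c417a00061b5c6faf03d9600"
    with open("mlos-core-0.3.2.tar.gz", "rb") as fh:
        digest = hashlib.sha256(fh.read()).hexdigest()
    assert digest == expected, f"hash mismatch: {digest}"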


File details

Details for the file mlos_core-0.3.2-py3-none-any.whl.

File metadata

  • Download URL: mlos_core-0.3.2-py3-none-any.whl
  • Size: 26.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.7

File hashes

Hashes for mlos_core-0.3.2-py3-none-any.whl

  • SHA256: 1793e824df77b8492a22feeee21d9c603791dcc6eb689ce6b6e469414c4ae3dc
  • MD5: 4e2d13625485bc77b246ddcd12dd7550
  • BLAKE2b-256: 5d91b52f3df29640828d57193a6d55e330dae2447fbac53a881a86f03772d9ca

