Adaptive Experimentation

Project description

Ax is an accessible, general-purpose platform for understanding, managing, deploying, and automating adaptive experiments.

Adaptive experimentation is the machine-learning guided process of iteratively exploring a (possibly infinite) parameter space in order to identify optimal configurations in a resource-efficient manner. Ax currently supports Bayesian optimization and bandit optimization as exploration strategies. Bayesian optimization in Ax is powered by BoTorch, a modern library for Bayesian optimization research built on PyTorch.

For full documentation and tutorials, see the Ax website.

Why Ax?

  • Versatility: Ax supports different kinds of experiments, from dynamic ML-assisted A/B testing to hyperparameter optimization in machine learning.
  • Customization: Ax makes it easy to add new modeling and decision algorithms, enabling research and development with minimal overhead.
  • Production-completeness: Ax comes with storage integration and the ability to fully save and reload experiments.
  • Support for multi-modal and constrained experimentation: Ax allows for running and combining multiple experiments (e.g. simulation with a real-world "online" A/B test) and for constrained optimization (e.g. improving classification accuracy without a significant increase in resource utilization).
  • Efficiency in high-noise settings: Ax offers state-of-the-art algorithms specifically geared to noisy experiments, such as simulations with reinforcement-learning agents.
  • Ease of use: Ax includes three different APIs that strike different balances between lightweight structure and flexibility. Using the most concise Loop API, a whole optimization can be done in just one function call. The Service API (sketched under Getting Started below) integrates easily with external schedulers. The most elaborate Developer API affords full algorithm customization and experiment introspection.

Getting Started

To run a simple optimization loop in Ax (using the Booth response surface as the artificial evaluation function):

from ax import optimize

best_parameters, best_values, experiment, model = optimize(
    parameters=[
        {
            "name": "x1",
            "type": "range",
            "bounds": [-10.0, 10.0],
        },
        {
            "name": "x2",
            "type": "range",
            "bounds": [-10.0, 10.0],
        },
    ],
    # Booth function
    evaluation_function=lambda p: (p["x1"] + 2*p["x2"] - 7)**2 + (2*p["x1"] + p["x2"] - 5)**2,
    minimize=True,
)

# best_parameters contains {'x1': 1.02, 'x2': 2.97}; the global min is (1, 3)
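
The same optimization can also be driven step by step through the Service API, which hands parameter suggestions to your own evaluation and scheduling code. The sketch below is illustrative only: it assumes the AxClient interface from ax.service.ax_client and its create_experiment / get_next_trial / complete_trial / get_best_parameters methods, whose exact signatures may differ between Ax versions.

from ax.service.ax_client import AxClient

# Booth function; global minimum of 0 at (1, 3).
def booth(p):
    return (p["x1"] + 2*p["x2"] - 7)**2 + (2*p["x1"] + p["x2"] - 5)**2

# Set up the client and an experiment over the same search space as above.
ax_client = AxClient()
ax_client.create_experiment(
    name="booth_experiment",
    parameters=[
        {"name": "x1", "type": "range", "bounds": [-10.0, 10.0]},
        {"name": "x2", "type": "range", "bounds": [-10.0, 10.0]},
    ],
    objective_name="booth",
    minimize=True,
)

# Ask for a candidate, evaluate it, and report the result back to Ax.
for _ in range(20):
    parameters, trial_index = ax_client.get_next_trial()
    ax_client.complete_trial(trial_index=trial_index, raw_data=booth(parameters))

best_parameters, values = ax_client.get_best_parameters()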

Installation

Requirements

You need Python 3.6 or later to run Ax.

The required Python dependencies are:

  • botorch
  • jinja2
  • pandas
  • scipy
  • simplejson
  • sklearn
  • plotly

Installation via pip

We recommend installing Ax via pip. To do so, run:

conda install pytorch torchvision -c pytorch  # OSX only
pip3 install ax-platform  # all systems

Recommendation for macOS users: PyTorch is a required dependency of BoTorch and can be installed automatically via pip. However, we recommend installing PyTorch manually before installing Ax, using the Anaconda package manager. The Anaconda build links against MKL (a library that optimizes mathematical computation for Intel processors), which can yield up to an order-of-magnitude speed-up for Bayesian optimization; at the moment, the pip build of PyTorch does not link against MKL. In addition, installation through Anaconda is currently required on macOS, as the pip installation of PyTorch is broken.
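
To check whether your local PyTorch build is linked against MKL, you can query the MKL backend flag directly (a quick sanity check, not an Ax requirement):

import torch

# True if this PyTorch build was compiled against Intel MKL.
print(torch.backends.mkl.is_available())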

Installing from source

To install from source:

  1. Make sure you have installed the botorch dependency.
  2. Download Ax from the Git repository.
  3. cd into the ax project and run:
pip3 install -e .

Note: When installing from source, Ax requires a compiler for Cython code.

Optional Dependencies

Depending on your intended use of Ax, you may want to install Ax with optional dependencies.

If using Ax in Jupyter notebooks:

pip3 install git+ssh://git@github.com/facebook/Ax.git#egg=Ax[notebook]

If storing Ax experiments via SQLAlchemy in MySQL or SQLite:

pip3 install git+ssh://git@github.com/facebook/Ax.git#egg=Ax[mysql]

Note that instead of installation from Git, you can also clone a local version of the repo and then pip install with desired flags from the root of the local repo, e.g.:

pip3 install -e .[mysql]
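
Once the mysql extra is installed, experiments can be persisted to and reloaded from a SQL backend. The sketch below is an assumption-laden illustration based on the SQLAlchemy-backed helpers in ax.storage.sqa_store (init_engine_and_session_factory, create_all_tables, save_experiment, load_experiment); module paths and signatures may differ between Ax versions, so consult the storage documentation on the Ax website.

from ax.storage.sqa_store.db import create_all_tables, get_engine, init_engine_and_session_factory
from ax.storage.sqa_store.load import load_experiment
from ax.storage.sqa_store.save import save_experiment

# Point the SQA store at a local SQLite file (a MySQL URL works the same way).
init_engine_and_session_factory(url="sqlite:///ax_experiments.db")
create_all_tables(get_engine())

# `experiment` is an ax.Experiment, e.g. the one returned by optimize() above.
save_experiment(experiment)
reloaded_experiment = load_experiment(experiment.name)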

Join the Ax community

See the CONTRIBUTING file for how to help out. You will also need to install the development dependencies, which are listed in DEV_REQUIRES in setup.py:

pip3 install git+ssh://git@github.com/facebook/Ax.git#egg=Ax[dev]

License

Ax is licensed under the MIT license.

Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distributions

  • ax_platform-0.1.2-cp37-cp37m-manylinux1_x86_64.whl (985.5 kB): CPython 3.7, manylinux1 x86-64
  • ax_platform-0.1.2-cp37-cp37m-macosx_10_9_x86_64.whl (569.5 kB): CPython 3.7, macOS 10.9+ x86-64
  • ax_platform-0.1.2-cp37-cp37m-macosx_10_6_intel.whl (776.3 kB): CPython 3.7, macOS 10.6+ Intel
  • ax_platform-0.1.2-cp36-cp36m-manylinux1_x86_64.whl (987.0 kB): CPython 3.6, manylinux1 x86-64
  • ax_platform-0.1.2-cp36-cp36m-macosx_10_7_x86_64.whl (575.5 kB): CPython 3.6, macOS 10.7+ x86-64
