
Ax: Adaptive Experimentation Platform



Ax is an accessible, general-purpose platform for understanding, managing, deploying, and automating adaptive experiments.

Adaptive experimentation is the machine-learning guided process of iteratively exploring a (possibly infinite) parameter space in order to identify optimal configurations in a resource-efficient manner. Ax currently supports Bayesian optimization and bandit optimization as exploration strategies. Bayesian optimization in Ax is powered by BoTorch, a modern library for Bayesian optimization research built on PyTorch.

For full documentation and tutorials, see the Ax website.

Why Ax?

  • Versatility: Ax supports different kinds of experiments, from dynamic ML-assisted A/B testing to hyperparameter optimization in machine learning.
  • Customization: Ax makes it easy to add new modeling and decision algorithms, enabling research and development with minimal overhead.
  • Production-completeness: Ax comes with storage integration and the ability to fully save and reload experiments.
  • Support for multi-modal and constrained experimentation: Ax allows for running and combining multiple experiments (e.g. simulation with a real-world "online" A/B test) and for constrained optimization (e.g. improving classification accuracy without a significant increase in resource utilization).
  • Efficiency in high-noise settings: Ax offers state-of-the-art algorithms specifically geared to noisy experiments, such as simulations with reinforcement-learning agents.
  • Ease of use: Ax includes 3 different APIs that strike different balances between lightweight structure and flexibility. Using the most concise Loop API, a whole optimization can be done in just one function call. The Service API integrates easily with external schedulers (a minimal sketch follows this list). The most elaborate Developer API affords full algorithm customization and experiment introspection.
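For illustration, here is a minimal sketch of the Service API's ask/tell pattern; the experiment name, objective name, trial budget, and evaluation function below are made up for this example, and exact method signatures may differ between Ax versions.

# A minimal sketch of the Service API ("ask/tell") pattern; names and
# signatures below are illustrative and may differ between Ax versions.
from ax.service.ax_client import AxClient

def booth(p):
    # Booth function, the same surface used in the Getting Started example below.
    return (p["x1"] + 2 * p["x2"] - 7) ** 2 + (2 * p["x1"] + p["x2"] - 5) ** 2

ax_client = AxClient()
ax_client.create_experiment(
    name="booth_service_example",  # illustrative experiment name
    parameters=[
        {"name": "x1", "type": "range", "bounds": [-10.0, 10.0]},
        {"name": "x2", "type": "range", "bounds": [-10.0, 10.0]},
    ],
    objective_name="booth",
    minimize=True,
)

for _ in range(20):
    parameters, trial_index = ax_client.get_next_trial()  # "ask" for a configuration
    ax_client.complete_trial(trial_index=trial_index, raw_data=booth(parameters))  # "tell" the result

best_parameters, values = ax_client.get_best_parameters()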

Getting Started

To run a simple optimization loop in Ax (using the Booth response surface as the artificial evaluation function):

>>> from ax import optimize
>>> best_parameters, best_values, experiment, model = optimize(
        parameters=[
          {
            "name": "x1",
            "type": "range",
            "bounds": [-10.0, 10.0],
          },
          {
            "name": "x2",
            "type": "range",
            "bounds": [-10.0, 10.0],
          },
        ],
        # Booth function
        evaluation_function=lambda p: (p["x1"] + 2*p["x2"] - 7)**2 + (2*p["x1"] + p["x2"] - 5)**2,
        minimize=True,
    )

# best_parameters contains {'x1': 1.02, 'x2': 2.97}; the global min is (1, 3)
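The lambda above returns a single noiseless value. For noisy experiments, the evaluation function can instead report an estimated mean together with its standard error. Below is a hedged sketch of that pattern; the metric name "booth", the Gaussian noise model, and the objective_name argument are illustrative assumptions and may differ between Ax versions.

import numpy as np
from ax import optimize

# Sketch: a noisy Booth evaluation that reports (mean, SEM) per metric.
# The metric name "booth" and the noise model are illustrative only.
def noisy_booth(p):
    value = (p["x1"] + 2 * p["x2"] - 7) ** 2 + (2 * p["x1"] + p["x2"] - 5) ** 2
    noise_sd = 0.1
    return {"booth": (value + np.random.normal(0.0, noise_sd), noise_sd)}

best_parameters, best_values, experiment, model = optimize(
    parameters=[
        {"name": "x1", "type": "range", "bounds": [-10.0, 10.0]},
        {"name": "x2", "type": "range", "bounds": [-10.0, 10.0]},
    ],
    evaluation_function=noisy_booth,
    objective_name="booth",  # assumed argument; ties the objective to the "booth" metric
    minimize=True,
)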

Installation

Requirements

You need Python 3.6 or later to run Ax.

The required Python dependencies are:

  • botorch
  • jinja2
  • pandas
  • scipy
  • simplejson
  • sklearn
  • plotly >=2.2.1, <3.0

Installation via pip

We recommend installing Ax via pip. To do so, run:

pip3 install ax-platform

Recommendation for macOS users: PyTorch is a required dependency of BoTorch and can be installed automatically via pip. However, we recommend installing PyTorch manually before installing Ax, using the Anaconda package manager. Installing from Anaconda links against MKL (a library that optimizes mathematical computation for Intel processors), which can yield up to an order-of-magnitude speed-up for Bayesian optimization; at the moment, installing PyTorch from pip does not link against MKL.
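For example, one possible install sequence on macOS (the conda channel and package name are assumptions and may change over time):

conda install pytorch -c pytorch
pip3 install ax-platform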

Installing from source

To install from source:

  1. Make sure you have installed the botorch dependency.
  2. Download Ax from the Git repository.
  3. cd into the ax project and run:
pip3 install -e .

Note: When installing from source, Ax requires a compiler for Cython code.

Optional Dependencies

Depending on your intended use of Ax, you may want to install Ax with optional dependencies.

If using Ax in Jupyter notebooks:

pip3 install git+ssh://[email protected]/facebook/Ax.git#egg=Ax[notebook]
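With the notebook extras installed, plotting typically follows the pattern sketched below, reusing the model returned by the Getting Started example; the module paths, argument names, and the "objective" metric name are assumptions drawn from Ax tutorials of this era and may differ between versions.

# Sketch: render an interactive contour plot inside a Jupyter notebook.
# Module paths, argument names, and the metric name are assumptions that
# may differ between Ax versions.
from ax.plot.contour import plot_contour
from ax.utils.notebook.plotting import init_notebook_plotting, render

init_notebook_plotting()  # set up plotly output in the notebook
render(plot_contour(model=model, param_x="x1", param_y="x2", metric_name="objective"))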

If storing Ax experiments via SQLAlchemy in MySQL or SQLite:

pip3 install git+ssh://[email protected]/facebook/Ax.git#egg=Ax[mysql]

Note that instead of installing from Git, you can also clone the repo locally and then pip install with the desired flags from the root of the local repo, e.g.:

pip3 install -e .[mysql]
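Once installed with the mysql extra, SQL-backed storage looks roughly like the sketch below; the module paths, function names, and SQLite URL are assumptions based on the storage tutorial of this era and may not match your installed version, so consult the Ax website for the exact API.

# Sketch only: persist and reload an Ax experiment through SQLAlchemy.
# The module paths and function names below are assumptions and may differ
# between Ax versions; see the storage tutorial on the Ax website.
from ax.storage.sqa_store.db import init_engine_and_session_factory, get_engine, create_all_tables
from ax.storage.sqa_store.save import save_experiment
from ax.storage.sqa_store.load import load_experiment

init_engine_and_session_factory(url="sqlite:///ax_experiments.db")  # or a MySQL URL
create_all_tables(get_engine())

save_experiment(experiment)  # e.g. the experiment returned by the Getting Started example
loaded_experiment = load_experiment("experiment_name")  # hypothetical experiment name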

Join the Ax community

See the CONTRIBUTING file for how to help out. You will also need to install the development dependencies, which are listed under DEV_REQUIRES in setup.py, as follows:

pip3 install git+ssh://[email protected]/facebook/Ax.git#egg=Ax[dev]

License

Ax is licensed under the MIT license.

