Auptimizer: A faster, easier way to do hyperparameter optimization for machine learning

Posted by Jiayi (Jason) Liu, Unmesh Kurup, and Mohak Shah

Auptimizer is a general-purpose open-source Hyperparameter Optimization (HPO) framework that also allows you to scale your HPO training from CPUs and GPUs to on-prem and EC2 instances. To get started, use “pip install auptimizer”. You can find our documentation here and our repo here.
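
To make the getting-started step concrete, here is a minimal sketch of how a training script is typically instrumented for Auptimizer, based on the pattern in the documentation. The `aup` helpers (`BasicConfig`, `print_result`), the key-style config access, and the toy objective are assumptions for illustration and may differ by version.

    # demo_train.py -- hedged sketch of an Auptimizer-instrumented script.
    import sys

    # Helpers described in the Auptimizer documentation; exact names are assumed here.
    from aup import BasicConfig, print_result

    def main(config):
        # Hypothetical objective: a simple quadratic in the hyperparameter `x`.
        score = (config["x"] - 2.0) ** 2
        # Report the score back to Auptimizer so the HPO algorithm can use it.
        print_result(score)

    if __name__ == "__main__":
        # Auptimizer passes the sampled hyperparameters via a small config file.
        main(BasicConfig().load(sys.argv[1]))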

Over the past decade, we have made significant advances in building and training machine learning models. We can now optimize very large models by exploiting improvements in algorithms, learning strategies, and the availability of distributed compute and memory. However, as the size and complexity of these models have grown, so too has the number of underlying hyperparameters. Yet in practice, the strategies for hyperparameter optimization (HPO) have remained largely limited to the most common grid-search and random-search approaches. While a few commercial and open-source solutions target HPO, none has broad applicability across problems and platforms. HPO remains as much art as science, and a key bottleneck in training effective machine learning models.

One roadblock is the lack of consensus on the best available HPO algorithm. Developers have to experiment with multiple algorithms to find the one best suited to their problem. However, the lack of portability between implementations means that users often end up locked into one particular algorithm once they have built their tooling around it. Auptimizer makes it easy for researchers and practitioners to switch between different HPO algorithms. Auptimizer also supports cross-platform scaling, enabling users to easily scale their experiments from a desktop to on-premise clusters and even the cloud.

For ML researchers, the use case is different: they focus on developing the algorithms that find the best hyperparameters. For them, an easy framework that facilitates algorithm implementation and benchmarks results against state-of-the-art algorithms is just as important.

Briefly, Auptimizer is a scalable, extensible toolkit for conducting HPO. It offers three advantages:

  1. A common interface to a variety of HPO approaches, allowing for easy switching between them;
  2. Easy extensibility, allowing users to (a) add their own HPO algorithms and (b) benchmark them against existing solutions; and
  3. Easy deployment to AWS, providing a way to scale model training from your personal computer to the cloud.

Since Auptimizer is platform independent, it can work with your framework of choice, including TensorFlow, PyTorch, MXNet, and Caffe.

Auptimizer’s common interface makes the process of switching between HPO algorithms much easier by abstracting away the differences between their implementations. In this release we support the following HPO techniques — Random Search, Grid Search, Hyperband, Hyperopt, Spearmint, and EAS (experimental).
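
To illustrate how little changes when switching algorithms, the sketch below writes two experiment specifications that differ only in the algorithm field. The field names (`proposer`, `script`, `resource`, `n_parallel`, `n_samples`, `parameter_config`) follow the documentation but should be treated as assumptions that may vary between releases.

    # make_experiments.py -- hedged sketch: two experiment specs that differ
    # only in the HPO algorithm ("proposer"); field names are assumed.
    import copy
    import json

    base_experiment = {
        "script": "demo_train.py",  # the instrumented training script
        "resource": "cpu",          # e.g. "gpu", "node", or "aws" to scale out
        "n_parallel": 2,            # trials to run concurrently
        "n_samples": 50,            # total hyperparameter samples to try
        "parameter_config": [
            {"name": "x", "type": "float", "range": [-5, 5]},
        ],
    }

    for proposer in ("random", "hyperband"):
        experiment = copy.deepcopy(base_experiment)
        experiment["proposer"] = proposer  # the only per-algorithm change
        with open("experiment_%s.json" % proposer, "w") as f:
            json.dump(experiment, f, indent=2)

Scaling the same experiment from a laptop to on-premise GPUs or EC2 is likewise intended to be a configuration change (the resource entry) rather than a change to the training code.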

Auptimizer can easily support integration and benchmarking of newer HPO techniques, including your own custom algorithms. As an example of Auptimizer’s extensibility, we integrated Bayesian Optimization and Hyperband (BOHB) in just a couple of days, requiring only 138 new lines of code while reusing 4,305 lines from the original implementation.
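
To make the extensibility claim concrete, a new HPO algorithm essentially has to implement a small propose-and-update interface. The skeleton below is purely illustrative; the class and method names are hypothetical, not Auptimizer’s actual base class, which is documented in the repo.

    # custom_proposer_sketch.py -- purely illustrative skeleton of the kind of
    # interface a pluggable HPO algorithm implements; names are hypothetical.
    import random

    class MyProposer:
        """Suggests hyperparameter sets and learns from reported results."""

        def __init__(self, parameter_config, n_samples):
            self.parameter_config = parameter_config
            self.n_samples = n_samples
            self.history = []  # (params, score) pairs observed so far

        def get_param(self):
            # Propose the next hyperparameter set (here: uniform random sampling).
            return {
                p["name"]: random.uniform(*p["range"])
                for p in self.parameter_config
            }

        def update(self, score, params):
            # Record the result so a smarter strategy could adapt future proposals.
            self.history.append((params, score))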

Finally, Auptimizer includes functionality to automate your HPO process. It supports different computing resources, such as CPUs, GPUs, multiple nodes, and AWS EC2 instances, and it is flexible enough to extend to other cloud platforms or your own on-premise solution.

